All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I have not modified its settings. It worked once and then it just broke down. It is installed on the Cluster Master server. I need your help, please.
Hello, could you please provide the integration steps for Kaspersky EDR Optimum and Kaspersky Sandbox with Splunk?
Hi all, I know it is possible to show rows/panels only if a token is set, e.g. <row depends="$token$">. Is it possible to show the row only if the token has a specific value, something like <row depends="$token$ == 1">? Thanks for any help.
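As far as I know, depends only checks whether a token is defined, not what its value is, so a common workaround (a sketch only; the dropdown input and the secondary token name show_row are illustrative) is to set or unset a second token in the input's <change> handler and have the row depend on that:

<input type="dropdown" token="token">
  <label>Show details?</label>
  <choice value="0">No</choice>
  <choice value="1">Yes</choice>
  <change>
    <!-- set show_row only when the selected value is 1, unset it otherwise -->
    <condition value="1">
      <set token="show_row">true</set>
    </condition>
    <condition>
      <unset token="show_row"></unset>
    </condition>
  </change>
</input>
<row depends="$show_row$">
  ...
</row>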
Hi all, I am new to Splunk. I want to learn the Search Processing Language. Can anyone give me some examples of search queries, such as filtering by IP range, or outbound and inbound communication by public IP?
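As a hedged illustration only (the index name, the fields src_ip and dest_ip, and the CIDR range are assumptions; they depend on how your network data is onboarded), cidrmatch() is the usual building block for IP-range filtering and traffic-direction classification:

index=firewall earliest=-24h
| eval src_internal=if(cidrmatch("10.0.0.0/8", src_ip), "internal", "external")
| eval dest_internal=if(cidrmatch("10.0.0.0/8", dest_ip), "internal", "external")
| eval direction=case(src_internal="internal" AND dest_internal="external", "outbound",
                      src_internal="external" AND dest_internal="internal", "inbound",
                      true(), "other")
| stats count by direction, src_ip, dest_ip

Swap 10.0.0.0/8 for your actual internal range(s).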
Hi there! I am trying to join an event table (E1) with a summary table (S1). S1 is just a summary table containing stats derived from the event table (E1). I am trying to do this because I have to compare the stats to each event. The query runs smoothly but doesn't give me the correct stats. When I run the two parts separately, the results are correct, but when they are joined together as in the query below, it gives the wrong answer; for context, it gives bigger values for the stats. Hope somebody can help! Thank you in advance! Please see the query below.

index=test sourcetype=aws* earliest=-0.5d@d
| search source=*RDS* metric_name=AbortedClients
| bin span=5m _time
| stats count as DataCount by _time, metric_name
| table _time, metric_name, DataCount
| join left=L right=R where L.metric_name = R.metric_name
    [ | search source=*RDS* metric_name=AbortedClients
      | bin span=5m _time
      | stats count as DataCount by _time, metric_name
      | stats sum(DataCount) as TotalCount, avg(DataCount) as Average, stdev(DataCount) as StanDev, p25(DataCount) as P_25, p50(DataCount) as P_50, p75(DataCount) as P_75 by metric_name
      | eval IQR = P_75 - P_50
      | eval LB = P_25 - (IQR*1.5)
      | eval UB = P_75 + (IQR*1.5)
      | eval OneThres = Average + (2 * StanDev)
      | table metric_name, TotalCount, Average, StanDev, P_25, P_50, P_75, IQR, LB, UB, OneThres ]
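One likely cause of the inflated stats is that the subsearch inside the join carries no index, sourcetype, or earliest constraints of its own, so it runs over a different (larger) set of events than the outer search. A sketch of an alternative that avoids the join entirely, using eventstats to attach the per-metric summary to every 5-minute row (keeping your IQR/threshold formulas exactly as written):

index=test sourcetype=aws* source=*RDS* metric_name=AbortedClients earliest=-0.5d@d
| bin span=5m _time
| stats count as DataCount by _time, metric_name
| eventstats sum(DataCount) as TotalCount, avg(DataCount) as Average, stdev(DataCount) as StanDev, p25(DataCount) as P_25, p50(DataCount) as P_50, p75(DataCount) as P_75 by metric_name
| eval IQR = P_75 - P_50
| eval LB = P_25 - (IQR*1.5)
| eval UB = P_75 + (IQR*1.5)
| eval OneThres = Average + (2 * StanDev)
| table _time, metric_name, DataCount, TotalCount, Average, StanDev, P_25, P_50, P_75, IQR, LB, UB, OneThres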
Hi, I am trying to understand how indexes and sourcetypes are defined. Let's say I have an app with a web component and a database component. Should the web component and DB component go to different indexes? And is the sourcetype a category within each index? Does Splunk automatically determine the sourcetype based on the data it ingests, or is this something that is done manually?
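For illustration only (the paths, index name, and sourcetype names below are made up): index and sourcetype are independent metadata fields rather than a hierarchy, and both are usually assigned explicitly per input, typically in inputs.conf on the forwarder. Splunk can guess a sourcetype for unrecognised data, but relying on that guess is generally discouraged. A minimal sketch where both components share one index but keep distinct sourcetypes:

# inputs.conf -- hypothetical paths and names
[monitor:///var/log/myapp/web/access.log]
index = myapp
sourcetype = myapp:web:access

[monitor:///var/log/myapp/db/query.log]
index = myapp
sourcetype = myapp:db:query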
Hi all, I have multiple strings that are regexes, and I want to find logs that match these strings. This is an example of my regex: (?i)union.*?select.*?from (?i:\b(?:(?:m(?:s(?:ys(?:ac(?:cess(?:objects|storage|xml)|es)|(?:relationship|object|querie)s|modules2?) and when I write  index="xyz" | regex "(?i)union.*?select.*?from | (?i:\b(?:(?:m(?:s(?:ys(?:ac(?:cess(?:objects|storage|xml)|es)|(?:relationship|object|querie)s|modules2?)" it doesn't show the correct results. How can I write it? Please help me.
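The regex command evaluates a single complete PCRE expression (against _raw by default), so the attempt above likely fails for two reasons: the pasted pattern looks truncated (its parentheses do not balance), and gluing two patterns together with " | " makes the surrounding spaces part of the alternatives. Two usual shapes, as a sketch only, using a simplified stand-in in place of the long alternation since the full pattern is not shown:

Match a single pattern:

index="xyz"
| regex _raw="(?i)union.*?select.*?from"

Match any of several patterns by wrapping them in one group separated by |:

index="xyz"
| regex _raw="(?i)(union.*?select.*?from|\bmsysaccessobjects\b)"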
I have the following events in the log. Although there are a lot of rows in it, I am only interested in these rows, and in extracting the time and anything after "subject:".

---
2020.1.02 Windows Server 2016 2021-09-11T11:01:19,865 ERROR pool-11-thread-3 Problem creating batch from the downloaded mail with subject: RE: Hello this is first email
---
2020.1.02 Windows Server 2016 2021-09-11T11:01:19,865 ERROR pool-11-thread-3 Problem creating batch from the downloaded mail with subject: Re: Hello this is second email
---
2020.1.02 Windows Server 2016 2021-09-11T11:01:19,865 ERROR pool-11-thread-3 Problem creating batch from the downloaded mail with subject: Re: Hello this is third email
---

So I need to create a report like this:

Time                     Subject
2021-09-11 11:01:19      RE: Hello this is first email
2021-09-11 11:01:21      Re: Hello this is second email
2021-09-11 11:01:22      Re: Hello this is third email

Thanks!
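A possible extraction, as a sketch (it assumes each "---" block is ingested as its own event and that the layout always matches the samples above; replace the index/sourcetype with your own):

index=your_index sourcetype=your_sourcetype "Problem creating batch from the downloaded mail"
| rex "(?<event_time>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}),\d+\s+ERROR.*?subject:\s*(?<Subject>[^\r\n]+)"
| eval Time=strftime(strptime(event_time, "%Y-%m-%dT%H:%M:%S"), "%Y-%m-%d %H:%M:%S")
| table Time, Subject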
I want to anonymize one sourcetype before routing it to a 3rd-party system over syslog. What is the proper config for the props, transforms, and outputs config files? When I use SEDCMD in props.conf, all events (for me and for the 3rd-party system) get anonymized, and when I use a REGEX in transforms.conf for anonymizing, I can't route events to another system in the same transform stanza because the FORMAT field has to be used for the rewritten _raw value.
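One pattern that is sometimes used for this (a sketch only; the sourcetype names, the mask regex, and the destination are placeholders): clone the events into a second sourcetype with CLONE_SOURCETYPE, apply SEDCMD only to the clone, and route only the clone to the syslog output group, so the original copy stays unmasked for your own indexing.

# props.conf
[my:sourcetype]
TRANSFORMS-clone_for_syslog = clone_for_syslog

[my:sourcetype:masked]
# mask only the cloned copy (example pattern: mask 9-digit IDs)
SEDCMD-mask_ids = s/\d{9}/XXXXXXXXX/g
TRANSFORMS-route_masked = route_masked_to_syslog

# transforms.conf
[clone_for_syslog]
REGEX = .
CLONE_SOURCETYPE = my:sourcetype:masked

[route_masked_to_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = third_party_syslog

# outputs.conf
[syslog:third_party_syslog]
server = thirdparty.example.com:514

If you do not want the masked clone indexed locally as well, an additional transform on the cloned sourcetype can send it to the nullQueue (DEST_KEY = queue, FORMAT = nullQueue). Note that all of this is parsing-time configuration, so it has to live on a heavy forwarder or indexer, not a universal forwarder.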
I have the below test raw logs:

CEF:0|Forcepoint|Forcepoint DLP|8.8.0|55564097|DLP Syslog|2| act=Permitted duser=destuser@gmail.com fname=testfile.PDF.TXT - 11.01 KB msg=EndPoint Operation suser=User, Test cat=Test Category sourceServiceName=Endpoint Printing analyzedBy=Policy Engine testengine loginName=testuser1 sourceIp=N/A severityType=LOW sourceHost=testhost productVersion=8.0 maxMatches=0 timeStamp=2021-09-01 15:58:50.624 destinationHosts=N/A eventId=4762037341417287789
CEF:0|Forcepoint|Forcepoint DLP|8.8.0|55564097|DLP Syslog|2| act=Permitted duser=destuser@gmail.com fname=testfile.PDF.TXT - 11.01 KB msg=EndPoint Operation suser=User, Test cat=Test Category sourceServiceName=Endpoint Printing analyzedBy=Policy Engine testengine loginName=domain\\testuser sourceIp=N/A severityType=LOW sourceHost=testhost productVersion=8.0 maxMatches=0 timeStamp=2021-09-02 15:58:50.624 destinationHosts=N/A eventId=4762037341417287788
CEF:0|Forcepoint|Forcepoint DLP|8.8.0|55564097|DLP Syslog|2| act=Permitted duser=destuser@gmail.com fname=testfile.PDF.TXT - 11.01 KB msg=EndPoint Operation suser=User, Test cat=Test Category sourceServiceName=Endpoint Printing analyzedBy=Policy Engine testengine loginName=tuser sourceIp=N/A severityType=LOW sourceHost=testhost productVersion=8.0 maxMatches=0 timeStamp=2021-09-04 15:58:50.624 destinationHosts=N/A eventId=4762037341417287787
CEF:0|Forcepoint|Forcepoint DLP|8.8.0|55564097|DLP Syslog|2| act=Permitted duser=destuser@gmail.com fname=testfile.PDF.TXT - 11.01 KB msg=EndPoint Operation suser=User, Test cat=Test Category sourceServiceName=Endpoint Printing analyzedBy=Policy Engine testengine loginName=N/A sourceIp=N/A severityType=LOW sourceHost=testhost productVersion=8.0 maxMatches=0 timeStamp=2021-09-03 15:58:50.624 destinationHosts=N/A eventId=4762037341417287786

I am trying to use rex to extract a field called loginName, where the regex captures everything after the "loginName=" text. I have tried ...| rex field=_raw "(loginName=)(?<loginName>[^\=]+)(?=\s)", but it does not capture all events. Please assist.
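A simpler pattern that matches all four sample events above (a sketch; real events may differ, for example if a login name can ever contain spaces) is to capture the run of non-whitespace characters right after loginName=:

index=your_index sourcetype=your_cef_sourcetype
| rex field=_raw "loginName=(?<loginName>\S+)"
| stats count by loginName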
Hello guys, I have a VPN log and a network log.
- In the VPN log it's possible to see the IP and USERNAME.
- In the network log it's possible to see which site the IP accessed.
I need to compare the two fields (VPN IP [src_ip], network IP [SRC]); if they are the same, I want to add the user. I tried this:

index=security host=homolog (sourcetype=vpn_log OR sourcetype=network_log)
| where src_ip=SRC
| eval username_acess=user
| table username_acess,SRC,dst

But it doesn't work. Another way is:

| eval field1=SRC,field2=src_ip
| eval results1=if(field1=field2,"Yes","No")
| eval results2=if(match(field1,field2),"Yes","No")
| where match(field1,field2)

I think the error is because the sourcetypes are different. Could you help me?
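You are probably right about the sourcetype split: a single VPN event never contains SRC and a single network event never contains src_ip, so | where src_ip=SRC can never be true on any one event. One common approach (a sketch; it assumes src_ip and user come from the VPN events, SRC and dst from the network events, and that the IP is the key to correlate on) is to normalise the IP into one field and group on it:

index=security host=homolog (sourcetype=vpn_log OR sourcetype=network_log)
| eval ip=coalesce(src_ip, SRC)
| stats values(user) as username_acess values(dst) as dst by ip
| where isnotnull(username_acess) AND isnotnull(dst)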
For example:

| tstats count from datamodel=test where * by test.url, test.user
| rename test.* AS *
| search NOT
    [ | inputlookup list_of_domains
      | rename domain AS url
      | fields url ]
| table url, user

I want this to show me the urls from the DM that do NOT appear in the lookup, and then give me the corresponding usernames from the DM. But this is not working properly: when I run this search, I still see some of the urls that are in the lookup. Please help!
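Two things worth checking, as hedged guesses: the subsearch approach only excludes urls that match the lookup values exactly (case, scheme, trailing slashes, and any wildcards in the lookup are treated literally), and subsearch results are capped, so a large lookup can be silently truncated. A sketch of an alternative that avoids the subsearch by applying the lookup directly and keeping only rows with no match (it assumes list_of_domains is defined as a lookup with a domain field):

| tstats count from datamodel=test where * by test.url, test.user
| rename test.* AS *
| lookup list_of_domains domain AS url OUTPUT domain AS matched_domain
| where isnull(matched_domain)
| table url, user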
I have this query and I want to add another data series/line to this chart. How can I do it?

index="eniq_voice"
| where localDn="ManagedElement=TO5CSCF01"
| bucket _time span=15m
| stats max(CPULoad_Total) as CPULoad_Total by localDn _time
| timechart max(CPULoad_Total) as CPULoad_Total

I want to add this to the query with a line chart:

| stats max(CPULoad_Max) as CPULoad_Max by localDn _time
| timechart max(CPULoad_Max) as CPULoad_Max by localDn _time
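One way to get both lines on the same chart (a sketch, assuming CPULoad_Total and CPULoad_Max are fields on the same events) is to compute both series in a single timechart; the intermediate bucket/stats steps can usually be dropped because timechart buckets by span itself:

index="eniq_voice"
| where localDn="ManagedElement=TO5CSCF01"
| timechart span=15m max(CPULoad_Total) as CPULoad_Total max(CPULoad_Max) as CPULoad_Max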
Which do you use or side with, please? Which do you think is best for functionality and bandwidth usage? Thank you for your time and consideration.
I keep getting an error message in our messages section at the top, stating that: Search head cluster member ____ is having problems pulling configurations from the search head cluster captain. Changes from the other members are not replicating to this member, and changes on this member are not replicating to other members. Consider performing a destructive configuration resync on this search head cluster member. I've performed the destructive resync several times, and when I run show kvstore-status from any search head, including the supposedly affected one, it claims there are no actual issues with the search head; it just reports it as a non-captain KV store member like the rest. If I close the error message manually, it just comes back. What else can I look into here?
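A couple of commands that are sometimes useful for narrowing this down (a sketch; run them on the affected member, and adjust the path if your $SPLUNK_HOME differs):

# compare this member's view of the cluster with the captain's
$SPLUNK_HOME/bin/splunk show shcluster-status --verbose

# the destructive resync you have already been running, for reference
$SPLUNK_HOME/bin/splunk resync shcluster-replicated-config

Note that the banner is about configuration (conf) replication, which is a separate mechanism from KV store replication, so a healthy show kvstore-status does not rule out a conf replication problem.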
Hello, I'm looking to reference a specific artifact from the Phantom Playbook Visual Editor. For example, a Phantom: Update Artifact block takes two parameters: artifact_id and cef_json. The list of default datapaths for artifact_id all follow the format artifact:*.<field>, where the wildcard causes the update to occur on ALL artifacts. I would instead like to reference the first artifact in the container, so that only the first artifact is updated. Is there a way to construct the datapath to accomplish this? The current workaround I have is to use a Custom Function to output the first artifact object of the container, but this only creates a snapshot of the artifact object at the time the function is called; if I update the artifact after calling the function, I'll need to call the function again to get the updated artifact object values. The closest thing I've seen to this is the phantom.collect() API call, in which you can specify a datapath with a specific label (i.e. phantom.collect(container, "artifact:uniqueLabel")), so that only the artifacts with the given label are returned, but this same syntax does not work in the Playbook Visual Editor.
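As a sketch of the custom-code route you describe (assuming a code block or custom function inside the generated playbook, where phantom.rules is already imported as phantom; whether the collect2 result ordering matches artifact creation order is an assumption worth verifying, sorting by id if needed):

# gather every artifact id in the container, then keep the first one
artifact_ids = phantom.collect2(container=container, datapath=["artifact:*.id"])
first_artifact_id = artifact_ids[0][0] if artifact_ids else None
phantom.debug("first artifact id: {}".format(first_artifact_id))
# this id can then be wired into the Update Artifact block's artifact_id parameter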
I'm trying to extract a field that looks like "Alert-source-key":"[\"abcdd-gdfc-mb40-a801-e40fd9db481e\"]". I have tried "Alert-source-key":"(?P<Alert_key>[^"]+)", but I'm getting results like "[\" since it is only capturing the characters up to the first double quote.
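One way to sidestep the quoting (a sketch; it assumes the key is always made of letters, digits, and hyphens, as in the sample) is to skip the punctuation after the field name with \W+ and capture the word-and-hyphen run that follows:

index=your_index "Alert-source-key"
| rex field=_raw "Alert-source-key\W+(?<Alert_key>[\w-]+)"
| table Alert_key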
Hi, I need help converting 210910085155, which is in yymmddhhmmss form, into a readable date.

index=mydata
| eval fields=split(EventMsg,",")
| eval file_string=mvindex(fields,0)
| eval CreatedDate=mvindex(fields,17)
| table CreatedDate

CreatedDate = 210910085155 and I need to convert it to a date.
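A sketch using strptime/strftime (the output format string is just an example, change it to whatever you need):

index=mydata
| eval fields=split(EventMsg,",")
| eval CreatedDate=mvindex(fields,17)
| eval created_epoch=strptime(CreatedDate, "%y%m%d%H%M%S")
| eval CreatedDate=strftime(created_epoch, "%Y-%m-%d %H:%M:%S")
| table CreatedDate

With the sample value, 210910085155 parses as 2021-09-10 08:51:55.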
We have more than 80 accounts in AWS so far, and we spin up new accounts every so often. We're using the Splunk Add-on for AWS (#1876). Configuring each account manually is a chore. Is there any way to automatically configure new accounts without having to use the UI?
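One route that sometimes works for add-ons is scripting the configuration through Splunk's REST API instead of the UI. The sketch below uses the generic configuration endpoint; the conf file name (aws_account) and the key names are placeholders I have not verified against this add-on, so check which .conf file the add-on actually writes under its local/ directory (or whether it exposes its own REST handlers) before relying on this:

# hypothetical example -- conf file name and keys are placeholders
curl -k -u admin:changeme \
  "https://your-search-head:8089/servicesNS/nobody/Splunk_TA_aws/configs/conf-aws_account" \
  -d name="new-account-001" \
  -d key_id="AKIAEXAMPLE" \
  -d secret_key="REPLACE_ME"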
Hello @jkat54 , @richgalloway, I am new to the add-on and am not able to figure out how to make API calls with it. I am attempting to use the OpenWeatherMap API below { OpenWeatherMap API - Free Weather Data (for Developers) (rapidapi.com) }

| curl method=GET uri=https://community-open-weather-map.p.rapidapi.com/weather user=<mysplunkusername> pass=<mysplunkpassword> headerfield= { 'x-rapidapi-host': "community-open-weather-map.p.rapidapi.com", 'x-rapidapi-key': "API_key_from_rapid_API" } data={"q":"London","lat":"0","lon":"0","callback":"test","id":"2172797","lang":"null","units":"imperial","mode":"json"}

Instead of getting the data, I am getting the output below (screenshot attached). Can you please tell me what I am doing wrong?