All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Team, I have created a query to show the count by date. My query is below:

index="abc*" sourcetype=600000304_gg_abs_ipc2 source!="/var/log/messages" "Total msg processed for trim reage file:"
| rex "Total msg processed for trim reage file:(?<records>\d+)"
| timechart span=1d values(records) AS RecordCount

The issue is that all the counts land on one single day, like this:

2023-07-06    1
              29
              42

How can I fix this query?
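One possible cause, assuming the events themselves carry correct timestamps, is that values() lists every distinct extracted number for the day instead of aggregating them. A minimal sketch (field names taken from the question) that replaces values() with sum() so each day gets one total:

```spl
index="abc*" sourcetype=600000304_gg_abs_ipc2 source!="/var/log/messages" "Total msg processed for trim reage file:"
| rex "Total msg processed for trim reage file:(?<records>\d+)"
| timechart span=1d sum(records) AS RecordCount
```

If the three counts should instead land on different days, check that _time is being parsed from the log line rather than from index time.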
Hi, I want to access the Splunk Cloud API from my Java application. Please let me know if there is a sample program available for reference. The official Splunk documentation is not very helpful in this case. Thanks
Hello, I have the following example:

| makeresults count=3
| streamstats count
| eval C=(random() % 9) + 1
| eval S1=(random() % 5) + 1
| eval S2=(random() % S1) + 1
| eval Connects=(random() % 5) + 1
| eval Consumer = "Consumer" . C
| eval Service = "Service" . S2
| chart count(Consumer) over Consumer by Service
| addtotals labelfield=Consumer fieldname=Total
| addcoltotals labelfield=Consumer label=Sum

This gives something similar to the following:

Consumer    Service1  Service2  Service3  Total
Consumer1   0         0         1         1
Consumer2   2         1         0         3
Consumer5   0         1         0         1
Sum         2         2         1         5

Can someone please help me add a new column with the percentage of "Total" based on the "Sum"? For example:

Consumer    Service1  Service2  Service3  Total  %
Consumer1   0         0         1         1      20%
Consumer2   2         1         0         3      60%
Consumer5   0         1         0         1      40%
Sum         2         2         1         5

Please note, the service columns are variable. The request is not urgent. Thank you and many greetings, Res
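One approach, sketched under the assumption that the percentage is each row's Total divided by the grand total of the Total column: compute the grand total with eventstats before the Sum row is appended, so the summary row does not pollute the math.

```spl
| chart count(Consumer) over Consumer by Service
| addtotals labelfield=Consumer fieldname=Total
| eventstats sum(Total) AS GrandTotal
| eval "%" = round(100 * Total / GrandTotal, 0) . "%"
| fields - GrandTotal
| addcoltotals labelfield=Consumer label=Sum
```

Because the "%" column is built from Total, it works regardless of how many variable service columns exist; addcoltotals skips it since its values are strings.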
Hi Team, how can we fetch the below keywords from the raw logs?

2023-06-29 09:41:53.884 [INFO ] [pool-2-thread-1] ArchivalProcessor - finished reading file /absin/TRIM.ARCH.D062923.T052525
2023-07-13 02:42:02.915 [INFO ] [pool-2-thread-1] FileSensor - Start Reading Account balance Data File, QACDU.D062623.T065000
2023-07-13 18:53:10.226 [INFO ] [pool-5-thread-1] FileSensor - Completed Account balance file processing, QACDU.D062623.T065000 records processed: 105932244, Kafka counter: 0
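A rex-based sketch for pulling the file names and counters out of lines shaped like those above (the extracted field names are illustrative, not from the original post):

```spl
| rex "finished reading file (?<archive_file>\S+)"
| rex "Data File, (?<data_file>\S+)"
| rex "records processed: (?<records>\d+), Kafka counter: (?<kafka_counter>\d+)"
```

Each rex only populates its fields on events that match, so the three patterns can be chained safely on a mixed stream of these log lines.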
I have certain project IDs I'm trying to get a list of IP addresses from.
Hi Team, I want to check whether it is feasible to send data or txt files that exist in a folder to a remote system or syslog server using the Splunk forwarder. Please suggest the steps to do it.
Hello! I have a JSON payload whose _time field gets parsed with no issue when I perform a manual upload, but when that same payload comes in through a HEC with the same sourcetype, the milliseconds are not parsed. Sample payload:

{"flowid":"dc59cf7376370faadfb89764e1896a1b","id":23431,"action":"upload","request":"","response":"{\"success\":false,\"correlation_id\":\"00-dc59cf7376370faadfb89764e1896a1b-d23cff8d675709d1-01\",\"status_code\":\"401\",\"message\":\"Request unsuccessful. The following errors were found.\",\"errors\":[{\"code\":\"E_TECHNICAL\",\"value\":\"A technical error prevented the success of the request.\"}]}","midid":"","dest":"","type":"GET","requesttime":"2023-07-12T10:17:32.4327504Z","externaltime": null,"externalresponsetime": null,"middlewaretime":"2023-07-12T10:17:32.4327504Z","logtime":"2023-07-12T10:17:32.6039773","globaltime":"2023-07-12T10:17:32.4333085","responsetime":"2023-07-12T10:17:32.4364843"}

Sourcetype:

[json_sourcetype]
SHOULD_LINEMERGE = false
TIME_PREFIX = \"logtime\"\:\"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N
TZ = UTC
TRUNCATE = 0
MAX_TIMESTAMP_LOOKAHEAD = 0
KV_MODE = json

Has anybody faced this issue before? What could the problem be? Thank you and best regards, Andrew
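Two things worth checking, offered as assumptions rather than a confirmed fix: logtime carries seven sub-second digits (.6039773) while TIME_FORMAT declares six (%6N), and MAX_TIMESTAMP_LOOKAHEAD = 0 leaves no window after TIME_PREFIX for the timestamp to be read. A revised stanza to try:

```
[json_sourcetype]
SHOULD_LINEMERGE = false
TIME_PREFIX = \"logtime\"\:\"
# seven sub-second digits in the sample payload
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%7N
TZ = UTC
TRUNCATE = 0
# allow enough characters after TIME_PREFIX for the full timestamp
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = json
```

Note that these props must live on the instance that first parses the HEC traffic (indexer or heavy forwarder), which is also why a manual upload on a search head can behave differently.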
I want to create an input where the user will provide a value for the interval field as a cron expression. But the interval field currently accepts values in seconds. Can someone provide a solution for this?
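For scripted and modular inputs, the interval setting in inputs.conf can, as far as I know, also take a cron expression instead of a number of seconds. A hedged sketch (the script path is hypothetical):

```
[script://./bin/my_script.py]
# run at minute 0 of every hour instead of every N seconds
interval = 0 * * * *
```

If the input is a custom modular input, its scheme validation may need to be relaxed to accept cron strings as well as integers.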
Does Splunk Enterprise provide any API to retrieve or modify incidents via the REST API? For example:

Get incident information
Change incident status
Change incident severity
Change incident owner
Add a tag to an incident
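If "incidents" here means Enterprise Security notable events (an assumption; core Splunk Enterprise has no incident object of its own), they can at least be read with ordinary SPL, for example:

```spl
index=notable
| table _time, rule_name, status, owner, urgency
```

For updates (status, severity, owner), Enterprise Security exposes a notable-update REST endpoint; the exact path and parameters are documented in the Enterprise Security REST API reference rather than the core Splunk Enterprise REST docs.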
Hi All! I want to calculate the sum of Failed and Declined:

| eval Msg=if((Failure_Message=="200 Emv error " OR Failure_Message=="NoAcquirerFoundConfigured "),"Failed","Declined")

Now I want to calculate the sum of Failed and Declined in the next line. I am already doing a stats count of other fields, so I need to add this one with them. Here is the query I am working on, but the problem is that it is not giving output for Failed and Declined:

index=idx-stores-pos sourcetype=GSTR:Adyen:log
| transaction host startswith="Transaction started" maxpause=90s
| search Failure OR Success
| eval Store=substr(host,1,7)
| eval Register=substr(host,8,2)
| rex field=_raw "AdyenPaymentResponse:.+\sResult\s:\s(?<Status>.+)"
| rex field=_raw "AdyenPaymentResponse:.+\sReason\s:\s(?<Failure_Message>.+)"
| rex field=_raw "AdyenPaymentResponse:.+\sMessage\s:\s(?<Error_Message>.+)\;"
| replace "* " with * in Error_Message Failure_Message
| eval Msg=if((Failure_Message=="200 Emv error " OR Failure_Message=="NoAcquirerFoundConfigured "),"Failed","Declined")
| stats count(eval(Status="Success")) AS Success_Count count(eval(Status="Failure")) AS Failure_Count sum(eval(Msg="Failed")) AS Failed sum(eval(Msg="Declined")) AS Declined BY Store Register
| eval Total_Payment=Success_Count + Failure_Count
| table Store Register Success_Count Failure_Count Total_Payment Failed Declined
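One likely issue: sum(eval(Msg="Failed")) tries to sum the boolean result of the comparison rather than count matching rows. Counting matches instead, as the other two aggregates already do, is a sketch worth trying:

```spl
| stats count(eval(Status="Success")) AS Success_Count
        count(eval(Status="Failure")) AS Failure_Count
        count(eval(Msg="Failed"))     AS Failed
        count(eval(Msg="Declined"))   AS Declined
  BY Store Register
```

The equivalent sum form would be sum(eval(if(Msg="Failed",1,0))), but count(eval(...)) reads more directly.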
Hello. I have a table with a column for Releases; in this case, a bunch of the rows do not have releases. I used the fillnull function on this specific field, but it's not working. In this table I have other columns with null values as well, and for those the fillnull worked. How can I do the same for the Releases field?
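A common cause: fillnull only fills events where the field is genuinely null; an empty-string value is present and therefore untouched. A sketch covering both cases, assuming the field is literally named Releases:

```spl
| eval Releases = if(isnull(Releases) OR Releases == "", "N/A", Releases)
```

If the eval also has no effect, double-check the exact field name (field names are case-sensitive in SPL).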
Hello Splunkers, correct me if I'm wrong, but it seems that when you install the Splunk UF on a machine, some logs of the machine (specifically those located in $SPLUNK_HOME/var/log) are forwarded by default. For instance, I see some default settings in /opt/splunkforwarder/etc/system/default/inputs.conf. There is also similar config in /opt/splunkforwarder/etc/apps/SplunkUniversalForwarder/default/inputs.conf.

I am wondering about the effects of _TCP_ROUTING = *. Does it mean that those monitored paths will be sent to every TCP output group defined in the outputs.conf files of my machine? What would be the purpose of that? Would you have a clean way to override that kind of config to send _internal logs only to one particular TCP group?

Thanks for your time, GaetanVP
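To pin the UF's internal logs to a single output group, one approach is a local override of the default monitor stanza; the group name internal_group below is hypothetical and must match a tcpout group defined in outputs.conf:

```
# $SPLUNK_HOME/etc/system/local/inputs.conf
[monitor://$SPLUNK_HOME/var/log/splunk]
_TCP_ROUTING = internal_group
```

Settings in system/local take precedence over system/default, so the default _TCP_ROUTING = * (send to all groups) is replaced without editing any default files.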
Hi community, I need help identifying where I went wrong. The following is my testing SPL:

| makeresults
| fields - _time
| eval _raw="<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-a5ba-3xxxxxxxxx}'/><EventID>4662</EventID><Version>0</Version><Level>0</Level><Task>12804</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2020-09-01T07:00:18.999999800Z'/><EventRecordID>35xxxx65</EventRecordID><Correlation ActivityID='{5xxxxxxxx-b61d-0004-afc0-ac531db6d901}'/><Execution ProcessID='1520' ThreadID='1628'/><Channel>Security</Channel><Computer>XXXXXXXXXXXXXX.riv</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>NT AUTHORITY\SYSTEM</Data><Data Name='SubjectUserName'>XXXXXXX$</Data><Data Name='SubjectDomainName'>XXXXXXXX</Data><Data Name='SubjectLogonId'>0x3e7</Data><Data Name='ObjectServer'>WMI</Data><Data Name='ObjectType'>WMI Namespace</Data><Data Name='ObjectName'>ROOT\CIMV2\Security\MicrosoftTpm</Data><Data Name='OperationType'>Object Access</Data><Data Name='HandleId'>0x0</Data><Data Name='AccessList'>%%1552 %%1553 </Data><Data Name='AccessMask'>0x3</Data><Data Name='Properties'>-</Data><Data Name='AdditionalInfo'>Local Execute (ExecMethod)</Data><Data Name='AdditionalInfo2'>ROOT\CIMV2\Security\MicrosoftTpm:Win32_Tpm=@::GetOwnerAuthForEscrow</Data></EventData></Event>"
| rex mode=sed "s/.*(?<eventId><EventID>4662<\/EventID>).*(?<userName><[Data Name='SubjectUserName']>*.*<\/Data>).*/\1\2/g"

The result differs from what I want. I need the data for SubjectUserName, not the AdditionalInfo2 data. Can anyone help me with this, please? Thank you!
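The square brackets in [Data Name='SubjectUserName'] define a character class (a set of single characters), not a literal string, so the greedy .* before it skips ahead to the last matching <Data> element, which is AdditionalInfo2. A plain rex extraction (not sed mode) that targets the attribute value literally:

```spl
| rex "<Data Name='SubjectUserName'>(?<userName>[^<]+)<\/Data>"
```

The [^<]+ class stops the capture at the closing tag, avoiding the greedy overshoot entirely.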
Status     Unit  Count
Duplicate  IT    5
Failure    BE    2
Success    DE    6
Success    IT    25
Success    PT    18
Success    DE    10
Success    PT    5
Total            71

I am adding the column total using the query:

| addcoltotals labelfield=Status label=Total

But now I want to calculate the sum by Unit, like this, and after calculating the sum I want to create an alert if any of the units is 0:

IT     30
DE     16
PT     23
BE     2
Total  71
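A sketch for the per-unit totals and the zero-count alert condition, assuming the base search yields the Status/Unit/Count rows shown (run it without addcoltotals so the Total row does not pollute the sums):

```spl
| stats sum(Count) AS Unit_Total BY Unit
| where Unit_Total = 0
```

Save this as an alert that triggers when the number of results is greater than zero. One caveat: a unit with no events at all produces no row, so catching fully missing units would require joining against a lookup of expected units and using fillnull first.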
Hi All, can someone tell me the difference between the Splunk Add-on for VMware and the Splunk Add-on for VMware Metrics? I'd like to monitor my VMware environment, but the documentation seems too complex.
Hi all, I need a regex to grab a few bits from the following raw data:

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-a5ba-3xxxxxxxxx}'/><EventID>4662</EventID><Version>0</Version><Level>0</Level><Task>12804</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2020-09-01T07:00:18.999999800Z'/><EventRecordID>35xxxx65</EventRecordID><Correlation ActivityID='{5xxxxxxxx-b61d-0004-afc0-ac531db6d901}'/><Execution ProcessID='1520' ThreadID='1628'/><Channel>Security</Channel><Computer>XXXXXXXXXXXXXX.riv</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>NT AUTHORITY\SYSTEM</Data><Data Name='SubjectUserName'>XXXXXXX$</Data><Data Name='SubjectDomainName'>XXXXXXXX</Data><Data Name='SubjectLogonId'>0x3e7</Data><Data Name='ObjectServer'>WMI</Data><Data Name='ObjectType'>WMI Namespace</Data><Data Name='ObjectName'>ROOT\CIMV2\Security\MicrosoftTpm</Data><Data Name='OperationType'>Object Access</Data><Data Name='HandleId'>0x0</Data><Data Name='AccessList'>%%1552 %%1553</Data><Data Name='AccessMask'>0x3</Data><Data Name='Properties'>-</Data><Data Name='AdditionalInfo'>Local Execute (ExecMethod)</Data><Data Name='AdditionalInfo2'>ROOT\CIMV2\Security\MicrosoftTpm:Win32_Tpm=@::GetOwnerAuthForEscrow</Data></EventData></Event>

I need:

1. <EventID>4662</EventID>
2. <Data Name='ObjectType'>WMI Namespace</Data>
3. <Data Name='ObjectName'>ROOT\CIMV2\Security\MicrosoftTpm</Data>
4. <Data Name='AdditionalInfo2'>ROOT\CIMV2\Security\MicrosoftTpm:Win32_Tpm=@::GetOwnerAuthForEscrow</Data>

Thank you!
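A set of rex extractions for the four values, assuming single-line events shaped like the sample (the extracted field names are illustrative):

```spl
| rex "<EventID>(?<EventID>\d+)<\/EventID>"
| rex "<Data Name='ObjectType'>(?<ObjectType>[^<]+)<\/Data>"
| rex "<Data Name='ObjectName'>(?<ObjectName>[^<]+)<\/Data>"
| rex "<Data Name='AdditionalInfo2'>(?<AdditionalInfo2>[^<]+)<\/Data>"
```

Each pattern anchors on the literal attribute value and uses [^<]+ so the capture stops at the closing tag instead of matching greedily across the event.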
Can anyone explain the strange behavior of the "values" function? I created a statistic with "stats" using the "values" function, and it returned a multivalue field as I expected, but there was one line where the values in the multivalue field were separated by a comma rather than a newline. I attached a screenshot of this. I tested on Splunk 8.2.7 and 9.0.0. If I replace the colon in the "source" field with something else, the behavior changes.
Hello All, I am new to Splunk SOAR. Is there a function, or has someone developed one, to export JSON (the output of a Splunk query) to CSV? The intent is to add the attachment to our incident management system.
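SOAR custom functions are plain Python, so one route (a sketch using only the standard library, not tied to any particular SOAR API) is to flatten the JSON results into CSV text and then attach the resulting file with your platform's vault/attachment action:

```python
import csv
import io
import json

def json_results_to_csv(json_text: str) -> str:
    """Convert a JSON array of flat result objects (as returned by a
    Splunk search) into CSV text. Column order follows the first row."""
    rows = json.loads(json_text)
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Example with hypothetical search output:
sample = '[{"host": "web01", "count": "5"}, {"host": "web02", "count": "3"}]'
print(json_results_to_csv(sample))
```

This assumes each result object is flat; nested JSON would need flattening before DictWriter can handle it.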
I wanted to read up on which roles need the KV store, which might, and which do not. In the Admin Manual, under "About the app key value store", "Disable the KV store", it is stated: "... You can disable the KV store on indexers and forwarders, and on any installation that does not have any local apps or local lookups that use the KV store."

I understand it this way: "You can disable the KV store on indexers and forwarders" with no exception, or is there a list of exceptions that is not mentioned? The "and on any installation that does not have any local apps or local lookups that use the KV store" part would mean that I only need it on instances other than indexers and forwarders if they run apps that make use of the KV store. But I was assuming that a search head cluster would need a KV store, and that the SHC deployer and the deployment server would make use of the KV store themselves. Is there more detailed documentation on which instances utilize the KV store, and for what?
Hi, I have the below scenario. Image_Name and Name_Space are being ingested with the below variations in Table A. Image_Name is a multivalue field as shown (multiple values listed per cell). I tried using makemv delim, but it doesn't work because there is no delimiter (e.g. a space) between the two values. I need them separated out as in Table B. Thanks in advance!

Table A:

Image_Name                              Name_Space
<none>                                  c-ecm-dev
c-ecm-dev/das-dynamic-filter-services
<none>                                  cs-webapps-sat
NULL                                    NULL
NULL                                    c-aoic-dev
c-ecm-dev/das-dynamic-filter-services   c-ecm-sat
c-ecm-sat/irtf-das-service
c-ecm-dev/das-dynamic-filter-services   openshift-marketplace
cpopen/ibm-watson-speech-catalog
c-ecm-sbx/das-pay-gov-services          NULL
iam-essar-aqt1/iam-essar-aqt1
c-ecm-sbx/das-rendering-service         sysdig
cs-webapps-sbx/baldue-bwas              c-ecm-dev
c-ecm-dev/das-rendering-service

Table B:

Image_Name                              Name_Space
<none>                                  c-ecm-dev
c-ecm-dev/das-dynamic-filter-services   c-ecm-dev
<none>                                  cs-webapps-sat
NULL                                    NULL
NULL                                    c-aoic-dev
c-ecm-dev/das-dynamic-filter-services   c-ecm-sat
c-ecm-sat/irtf-das-service              c-ecm-sat
c-ecm-dev/das-dynamic-filter-services   openshift-marketplace
cpopen/ibm-watson-speech-catalog        openshift-marketplace
c-ecm-sbx/das-pay-gov-services          NULL
iam-essar-aqt1/iam-essar-aqt1           NULL
c-ecm-sbx/das-rendering-service         sysdig
cs-webapps-sbx/baldue-bwas              c-ecm-dev
c-ecm-dev/das-rendering-service         c-ecm-dev
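If Image_Name is already a genuine multivalue field (as the post says), mvexpand alone may produce Table B, since it duplicates each event once per value while keeping the single-valued Name_Space; shown as a sketch because the actual field structure cannot be verified from the post:

```spl
| mvexpand Image_Name
| table Image_Name, Name_Space
```

If the values are instead one concatenated string, a delimiter-less makemv will not help; a rex with max_match=0 capturing each namespace/name token would be needed, and that requires knowing the exact value pattern.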