All Posts



The last line is the one that determines if the event happened in the last 24 hours or not.  If the results fail that test then there's no need for the alert to trigger.  Sounds like it's working as intended, but maybe the intentions aren't clear?
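Since the original search isn't shown, a typical last-line check of this sort looks like the following sketch (field and window are illustrative):

```spl
| where _time >= relative_time(now(), "-24h")
```

If no events pass that filter, the alert has nothing to trigger on, which matches the behaviour described.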
Hi Community, We are in the process of sending alerts from Splunk to another application via REST API, but the response from the REST API is in XML format, while our other application expects JSON. We tried the Postman application as well, and the response is the same XML. Can anyone suggest a REST API that returns the alert details as JSON? Regards, Eshwar
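For reference, most Splunk REST endpoints default to XML but accept an output_mode=json parameter. A minimal sketch against the fired-alerts endpoint, with hostname and credentials as placeholders:

```
curl -sk -u admin:changeme \
  "https://splunk.example.com:8089/services/alerts/fired_alerts?output_mode=json"
```

The same output_mode=json parameter works on the search-job results endpoints as well.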
Hi @splunkreal unfortunately it is not available yet in 9.2. Please stay tuned for future updates!  You may also keep your eye out for any announcements about its availability in Cloud, which would indicate that it would be in the following on-prem release. 
I have an index that contains all the hits for our WAF, and an index that contains the subsequent API call details for any of those hits where an application is calling one of our APIs behind the WAF. There is a shared identifier that the WAF passes to the API call, so we can link them together and see what IP, user agent string, etc. made that API call. I am trying to pull data from both indexes together into a nice table so that our devs and our security folks can see what API calls are being made, who/what is calling them, and the payloads.

API search:
index=api source=api_call | rename id as sessionID | fields apiName, payload, sessionID

WAF search:
index=waf | fields src_ip, requestHost, requestPath, requestUserAgent, sessionID

My attempt to join them on sessionID, which is not working; it returns no results:
index=api source=api_call | rename message.id as sessionID | fields apiName, message.payload, sessionID | join sessionID [search index=waf | fields src_ip, requestHost, requestPath, requestUserAgent, sessionID] | table apiName, message.payload, sessionID, src_ip, requestHost, requestPath, requestUserAgent

I know joins are not very performant, so I'm open to alternatives that don't use join, but I'm not sure what those would be.
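A stats-based sketch that avoids join entirely, assuming the field names above are accurate. Note that the two attempts rename different source fields (id vs. message.id), which may itself be why the join returns nothing; adjust the coalesce to match your actual data:

```spl
(index=api source=api_call) OR index=waf
| eval sessionID = coalesce(sessionID, 'message.id', id)
| stats values(apiName) as apiName values(payload) as payload
        values(src_ip) as src_ip values(requestHost) as requestHost
        values(requestPath) as requestPath values(requestUserAgent) as requestUserAgent
        by sessionID
| where isnotnull(apiName) AND isnotnull(src_ip)
```

The final where keeps only sessionIDs seen in both indexes, which is the inner-join behaviour the original search was after.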
Hello, Is using makeresults the same as using an index, if my goal is only to obtain info_min_time and info_max_time from addinfo? If I only use | addinfo without makeresults, why did it give me a lot of results? Thank you so much
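They are not equivalent: makeresults generates a single synthetic event, so addinfo has exactly one row to annotate, whereas addinfo appended to an index search annotates every event that search returns, which is why you saw many results. A minimal sketch:

```spl
| makeresults
| addinfo
| table info_min_time info_max_time
```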
Hi @leobsksd, you have to copy the apps and the data, not the whole etc folder, and especially not the license! I suppose that you want to install on a different computer or a different VM, not on the same instance that is in violation; that will not work: on Windows you cannot reinstall Splunk on the same server, whereas on Linux you can install it in a different folder and, if you like, delete the old one without problems. Ciao. Giuseppe
I will try standing up a new instance of Splunk and copying the $SPLUNK_HOME\var\lib\splunk folder from the instance of Splunk that I have backed up over to the new installation. Will this also copy over the configuration from the old server? Part of the issue is that the old server had a free license which is in violation, which is why I am working to set up a new Splunk server without that problem. Good point about Linux. I may try installing Splunk on that platform and see how it does. Leobsksd
Hello, The Splunk forwarder PowerShell script is causing resource exhaustion, and the script runs for 8-9 hours. Any ideas on this issue? Thanks
I am trying to update the DNSTwist Add-on for Splunk to its latest version, 1.0.4, but every time it still shows me that the version update to 1.0.4 is available. I have tried it through the UI as well as by manually installing and replacing it with the latest version. Is anyone else facing a similar situation? Kindly suggest what needs to be done here.
Hello guys, I have the below query, which uses join. I have seen lots of examples of how to replace that with stats, but I am not able to. I need to join on _time and another field called snat. The output should at least show client_ip and Account_Name. Thanks

index=_ad (EventCode=4625 OR (EventCode=4771 Failure_Code=0x18)) Account_Name=JohnDoe Source_Network_Address IN (10.10.10.10 20.20.20.20)
| bucket span=1m _time
| eval Source_Network_Address1 = case(EventCode==4771, trim(Client_Address, "::ffff:"))
| eval SourceIP = Source_Network_Address
| eval Account_Name4625 = case(EventCode=4625, mvindex(Account_Name,1))
| eval Account_Name4771 = case(EventCode=4771, Account_Name)
| eval Account_Name = coalesce(Account_Name4771, Account_Name4625)
| eval Source_Network_Address_Port = SourceIP+":"+Source_Port
| rex field=ComputerName "(?<DCName>^([^.]+))"
| rename Source_Network_Address_Port as snat
| stats count by _time snat Account_Name EventCode DCName
| join type=inner _time snat
    [search index=_network snat IN (10.10.10.10*,20.20.20.20*)
    | bucket span=1m _time
    | rex field=client "^(?<client_ip>.*?)\:(?<client_port>.*)"
    | stats count by _time snat client_ip]

@woodcock @MuS
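One join-free pattern is to append the second search and re-aggregate on the shared keys, with a final where acting as the inner-join filter. A sketch reusing the fields above; one probable fix is folded in (SourceIP coalesces the 4771-derived address, which the original computed but never used):

```spl
index=_ad (EventCode=4625 OR (EventCode=4771 Failure_Code=0x18)) Account_Name=JohnDoe
    Source_Network_Address IN (10.10.10.10 20.20.20.20)
| bucket span=1m _time
| eval Source_Network_Address1 = case(EventCode==4771, trim(Client_Address, "::ffff:"))
| eval SourceIP = coalesce(Source_Network_Address1, Source_Network_Address)
| eval Account_Name = coalesce(case(EventCode==4771, Account_Name),
                               case(EventCode==4625, mvindex(Account_Name, 1)))
| rex field=ComputerName "^(?<DCName>[^.]+)"
| eval snat = SourceIP . ":" . Source_Port
| stats count by _time snat Account_Name EventCode DCName
| append
    [ search index=_network snat IN (10.10.10.10*, 20.20.20.20*)
      | bucket span=1m _time
      | rex field=client "^(?<client_ip>.*?):(?<client_port>.*)"
      | stats count by _time snat client_ip ]
| stats values(Account_Name) as Account_Name values(EventCode) as EventCode
        values(DCName) as DCName values(client_ip) as client_ip by _time snat
| where isnotnull(Account_Name) AND isnotnull(client_ip)
| table _time snat client_ip Account_Name EventCode DCName
```

The second stats collapses the rows from both searches that share the same _time bucket and snat, and the where drops any bucket that did not appear in both sources.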
| rex "(?ms)Here is the list of packages installed on the remote Red Hat Linux system : (?<software>.*?)\nplugin_id"
Hello, I am attempting to write some regex with a lookahead. My event is pluginText:

<plugin_output> Here is the list of packages installed on the remote Red Hat Linux system :

libkadm5-1.18.2-26.el8_9|(none) Wed 17 Jan 2024 10:21:40 AM CST
sssd-client-2.9.1-4.el8_9|(none) Wed 03 Jan 2024 06:05:06 AM CST

plugin_id: 22869

I would like to capture everything after "Here is the list of packages installed on the remote Red Hat Linux system :\n\n" and before the plugin_id, i.e. all of the software data. My plan is to first extract everything into one big field and then pipe it to another rex command with max_match=0 to extract the software into a multivalue field. I am having some trouble implementing this. Help is appreciated. Thank you, Nate
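A two-step sketch along those lines, assuming the raw text matches the sample above. The (?ms) flags let . span newlines while ^ anchors at each line start, and max_match=0 captures every match into a multivalue field; the trailing \| anchors on the pipe that follows each package name:

```spl
| rex "(?ms)Here is the list of packages installed on the remote Red Hat Linux system :\s+(?<software>.*?)\s*plugin_id"
| rex field=software max_match=0 "(?m)^(?<package>[^|\n]+)\|"
```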
Hi experts, We are in the process of sending alerts from Splunk to another application via REST API, but the response from the REST API is in XML format, while our other application expects JSON. We tried the Postman application as well, and the response is the same XML. Can anyone suggest a REST API that returns the alert details as JSON? Thank you in advance. Regards, Eshwar
There is no syntax for this. With spath you have to either provide a precise path or no path at all, in which case the whole source field (_raw by default) is parsed. You can later do some magic, most probably based on foreach and matching field names to a pattern, but that's kinda ugly and not very efficient. Anyway, it seems like a badly designed data schema, because it looks as if you should have an array of objects instead of differently named objects. Conceptually, different objects are different types of entity, so why would you want to treat them the same?
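For reference, the foreach-based pattern mentioned above might be sketched like this (the item_*.price field pattern is purely illustrative, standing in for the differently named objects):

```spl
| spath
| foreach item_*.price
    [ eval all_prices = mvappend(all_prices, '<<FIELD>>') ]
```

This gathers the matching subfields into one multivalue field, but as noted it is both ugly and inefficient compared to fixing the schema to use an array.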
Hi @shakti, there are many python scripts in the bin folder. Anyway, upgrade the app and check. Ciao. Giuseppe
Hello @elizabethl_splu  any news? Is it available in V9.2? Thanks.  
More words please. "after migration" can mean anything: an upgrade from version to version, an attempt to move the app between different environments, an upgrade of the underlying Splunk version... We have no idea what happened in the first place. Then there's the issue of the "page not loading". What does that mean? Does it show any errors? Do other pages of the same app load properly while only the config page doesn't? Did you check your _internal index for errors?
Hello Everyone, I have created an alert which uses the sendresults command to format the email notification. The problem I have with this is that it does not have a "View Splunk Results" link. So I added addinfo to the search to grab the search id and appended it to the Splunk URL:

https://<hostname>:8000/en-US/app/search/search?sid=scheduler__user__search__RMD5fa2e7e4e362d_at_".$info_sid$."

| eval application_name = "<a href=https://<hostname>:8000/en-US/app/search/security_events_dashboard?form.field2=&form.application_name=" . application_name . ">" . application_name . "</a>"
| eval email_subj="Security Events Alert", email_body="<p>Hello Everyone,</p><p>You are receiving this notification because the application has one or more security events reported in the last 24 hours.<br></p><p>Please click on the link available in the table to fetch events for a specific application.</p><p>To view Splunk results <a href=https://<hostname>:8000/en-US/app/search/search?sid=scheduler__user__search__RMD5fa2e7e4e362d_at_".$info_sid$.">Click here</a></p>"

I am able to receive the link, but this link is not loading. Could someone please assist me with this? I want to receive a link similar to the one I receive when an alert is triggered. Regards, Sai
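A sketch of building the results link entirely from addinfo rather than hard-coding part of the sid (hostname is a placeholder; the sid of a scheduled search changes on every run, so it has to be read from info_sid at alert time rather than pasted in):

```spl
| addinfo
| eval results_link = "https://<hostname>:8000/en-US/app/search/search?sid=" . info_sid
| eval email_body = "<p>To view Splunk results, <a href=\"" . results_link . "\">click here</a></p>"
```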
map is slow and limited - try something like this:

| timechart span=10m aligntime=latest count by host
| addcoltotals label="Total" labelfield=_time
| tail 2
| eval _time=if(_time=="Total", _time, "last_count_of_events")
| fields - _span
| transpose 0 column_name=host header_field=_time
| eval avg_count_of_events=round(Total/6)
| eval percent_of_increase = round((last_count_of_events/avg_count_of_events)*100)-100
| table host avg_count_of_events last_count_of_events percent_of_increase
Thanks for the post, it helped me solve a very similar issue that I've encountered.