All Posts


How are you doing procedures for Notable Events? The description field doesn't support paragraph breaks, so I'd been using Next Steps as my space for procedures. With the upgrade to 7.3.0, my Next Steps all have {"version":1,"data":" prepended at the start. If I try to update them, Splunk appears to upgrade the text to the new version, line breaks are no longer supported, and my procedures turn into giant blobs of text.
You can also try this regex: "Key":\s*"Owner",\s*"ValueString":\s*"(?<Team_Name>[^"]*)"
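A quick way to sanity-check the capture logic is to run it in Python (the sample event below is made up; note that SPL's rex spells a named group (?<name>...) while Python's re module needs (?P<name>...)):

```python
import re

# Same pattern as the SPL rex, with the named group converted to Python syntax
pattern = r'"Key":\s*"Owner",\s*"ValueString":\s*"(?P<Team_Name>[^"]*)"'

# Hypothetical sample resembling the JSON being parsed
sample = '{"Key": "Owner", "ValueString": "Network Ops"}'

m = re.search(pattern, sample)
print(m.group("Team_Name"))  # -> Network Ops
```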
Hi Splunkers, I have to calculate the daily ingested volume in a Splunk Enterprise environment. Here on the community I found a lot of posts, with related answers, to a similar question (daily license consumption), but I don't know if that is what I need. I mean: we know that, once data is ingested by Splunk, a compression factor is applied and, in a non-clustered environment, it is more or less 50%. So, for example, if I have 100 GB of data ingested per day, the final size on disk will be 50 GB. Well, I have to calculate the total GB BEFORE compression is applied. So, in my example above, the search/method I need should NOT return 50 GB as the final result, but 100 GB. Moreover, in my current environment I have an indexer cluster. So, what is not clear is: is daily consumed license what I need? I mean: when I look at the daily license consumed by my environment, is the GB figure the ingested volume BEFORE compression, or the compressed one?
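For what it's worth, license usage is metered on raw ingested bytes, i.e. before compression, so the daily license figure corresponds to the 100 GB number, not the 50 GB on-disk size. A tiny sketch of the arithmetic, using the rough 50% ratio from the post (the actual ratio varies by data type):

```python
def pre_compression_gb(on_disk_gb, compression_ratio=0.5):
    """Back out raw ingested volume from on-disk usage.

    compression_ratio is a rough rule-of-thumb default, not a measured value.
    """
    return on_disk_gb / compression_ratio

print(pre_compression_gb(50.0))  # -> 100.0
```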
I am trying to create a transaction where my starting and ending events have exactly the same time. In _raw the time is "Wed Feb 21 08:15:01 CST 2024". My current SPL is:
| transaction keeporphans=true host aJobName startswith=("START of script") endswith=("COMPLETED OK" OR "ABORTED, exiting with status")
But my transaction only has the starting event. So I added the following, which made no difference:
| eval _time = case( match(_raw, "COMPLETED OK"), _time + 5, match(_raw, "ABORTED"), _time + 5, true(), _time)
| sort _time
| transaction keeporphans=true host aJobName startswith=("START of script") endswith=("COMPLETED OK" OR "ABORTED, exiting with status")
With the above changes, when I look at the events, the 'Time' columns are 5 seconds apart, yet transaction still does not associate them:
2/21/24 8:15:01.000 AM (starting event)
2/21/24 8:15:06.000 AM (ending event)
When dealing with historical data in Splunk, there are a few factors to consider. First, check whether your Splunk deployment has custom retention policies configured; you can adjust these policies to retain data for a longer period of time. I think you should read:
https://docs.splunk.com/Documentation/Splunk/latest/Indexer/Setaretirementandarchivingpolicy
https://docs.splunk.com/Documentation/Splunk/latest/Admin/Indexesconf
Does anyone have experience with automated testing of Splunk dashboards? I'm looking for something to test whether all drilldowns and dropdowns work, and preferably a data check to verify that the numbers add up.
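One lightweight approach, short of full browser automation: pull each dashboard's SimpleXML (it's available via the REST API) and statically check that every <drilldown> element is actually wired to something. A minimal sketch; the embedded XML here is a stand-in for what you'd fetch, and this only catches structural problems, not runtime ones:

```python
import xml.etree.ElementTree as ET

# Stand-in for SimpleXML fetched from a real dashboard
dashboard_xml = """
<dashboard>
  <row>
    <panel>
      <chart>
        <drilldown>
          <set token="selected_host">$click.value$</set>
        </drilldown>
      </chart>
    </panel>
    <panel>
      <table>
        <drilldown/>
      </table>
    </panel>
  </row>
</dashboard>
"""

def empty_drilldowns(xml_text):
    """Return <drilldown> elements that set no token/link/eval at all."""
    root = ET.fromstring(xml_text)
    return [dd for dd in root.iter("drilldown") if len(dd) == 0]

print(len(empty_drilldowns(dashboard_xml)))  # -> 1 (the empty <drilldown/>)
```

The same idea extends to dropdowns: iterate over <input> elements and assert each has a token attribute and either a populating search or static choices.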
I just realized that the NIX TA is being deployed to our forwarders via the deployment apps, to the indexers via the master apps, and to the SHs via the SH apps. It was a surprise to realize that the TA is not being deployed to the deployment server, the deployer, the license master, or the cluster master. So how can the TA be deployed to all Splunk servers?
Understand what security monitoring means and learn more about it. Remember, the best way to learn is by doing: start with some basic use cases and gradually progress to more advanced ones. You can also join online communities and forums to connect with other Splunk users and ask questions.
https://www.splunk.com/en_us/blog/security/introducing-splunk-security-use-cases.html
https://lantern.splunk.com/Security/Getting_Started/Identifying_Splunk_Enterprise_Security_use_cases_and_data_sources
https://www.splunk.com/en_us/resources/videos/splunk-enterprise-security-use-case-library.html
Splunk keeps a seven-day backup of data and config files, but that probably does not include dashboards. The https://splunkbase.splunk.com/app/5061 app purports to support dashboards, but I've never used it. In the worst case, you can edit the dashboard and copy-paste the source into git.
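If you want to automate that copy-paste step, dashboard XML is exposed over the REST management port. A rough sketch assuming token auth; the host, port, app, and dashboard names are placeholders, and I haven't run this against Splunk Cloud specifically:

```python
import urllib.parse
import urllib.request

def view_xml_url(base, app, view, owner="-"):
    """Build the REST URL for a dashboard's source (data/ui/views endpoint)."""
    path = f"/servicesNS/{owner}/{app}/data/ui/views/{urllib.parse.quote(view)}"
    return base.rstrip("/") + path + "?output_mode=json"

def fetch_view(base, app, view, token):
    """Fetch one dashboard's definition; commit the result to git as you like."""
    req = urllib.request.Request(
        view_xml_url(base, app, view),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # network call; not exercised here
        return resp.read()

print(view_xml_url("https://splunk.example.com:8089", "search", "my_dashboard"))
```

Run on a schedule, this gives you a poor man's version history even if nobody remembers to save the source by hand.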
I almost forgot: one of the best resources to learn is https://bots.splunk.com/login. You will find prebuilt detections in there that you can copy out and use. It also has games and challenges. I believe your login for this site will work there. You will find it very useful.
Splunk Add-on for MySQL Database: what role/permissions are required from the MySQL DBA to use this add-on? What role should be assigned to the user created on the MySQL server to communicate with Splunk DB Connect?
Try it like this (I don't think you can change the time range from what was used in the base search, there should only be one level of <search></search>, and the token is set with <eval token="abc">):
<chart depends="$abc$">
  <title>Chart1</title>
  <search base="basesearch">
    <query>| search host="INFO" OR host="ERROR" panel=$panel1$ | timechart span=$TimeSpan$m count by panel usenull=f useother=f | eventstats sum("host") as _host</query>
    <done>
      <eval token="abc">"computer1"</eval>
    </done>
  </search>
  <option name="charting.axisTitleY.visibility">collapsed</option>
  <option name="charting.chart">column</option>
  <option name="charting.drilldown">all</option>
  <option name="charting.fieldColors">{"host":0xFFFF00}</option>
  <option name="charting.legend.placement">bottom</option>
  <option name="refresh.display">progressbar</option>
</chart>
I'm trying to run a base search but it is throwing an error, because I have two search tags inside a panel.
E.g. base search:
<search id="basesearch">
  <query>index=main source=xyz</query>
  <earliest>$EarliestTime$</earliest>
  <latest>$LatestsTime$</latest>
</search>
Panel search:
<chart depends="$abc$">
  <title>Chart1</title>
  <search>
    <done>
      <eval token="abc">"computer1"</eval>
    </done>
    <search base="basesearch">
      <query>| search host="INFO" OR host="ERROR" panel=$panel1$ | timechart span=$TimeSpan$m count by panel usenull=f useother=f | eventstats sum("host") as _host</query>
    </search>
    <earliest>$InputTimeRange.earliest$</earliest>
    <latest>$InputTimeRange.latest$</latest>
  </search>
  <option name="charting.axisTitleY.visibility">collapsed</option>
  <option name="charting.chart">column</option>
  <option name="charting.drilldown">all</option>
  <option name="charting.fieldColors">{"host":0xFFFF00}</option>
  <option name="charting.legend.placement">bottom</option>
  <option name="refresh.display">progressbar</option>
</chart>
Warning msg: Node <search> is not allowed here
The done section is required in the panel, so I cannot remove it. Is there a way to use a base search this way?
Check the retention settings on the _introspection index.  By default, it's 14 days.  Change the frozenTimePeriodInSecs setting in indexes.conf to retain data longer.
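A minimal indexes.conf sketch of that change (30 days here is just an example value; 2592000 seconds = 30 days):

```ini
[_introspection]
# Keep introspection data for 30 days instead of the 14-day default
frozenTimePeriodInSecs = 2592000
```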
By default, reports in your app are shown in your app.  To have them appear in the S&R app the reports must be shared globally and the Reports dashboard must be set to show reports from all apps.  Or your report must be shared within your app and the S&R Reports dashboard must be set to show reports from your app.
Still struggling to get our 9.1.3 forwarders working on RHEL7 and RHEL8 on DISA STIGed machines after the upgrade. Nothing I can find online, even here, has helped yet. We get 'tcp_conn-open-afux ossocket_connect failed with no such file or directory' messages and SplunkForwarder.service just vanishes. Really? Tried yum erase and rm -R /opt/splunkforwarder and a fresh install, and still no-go. It worked before the upgrade running as the splunk user. <aargh!> Going back to the older version for now since the Cyber Team is really miffed.
Update 1: Well, I added the splunkfwd account to the root group and made progress, but not 100%. Will try root:root as an experiment. It does appear to be permission issues on the STIG locked-down machine, even though splunkfwd:splunkfwd owns all /opt/splunkforwarder/ files and directories.
Update 2: Running as root has not fixed the issue. 'netstat -an | grep 9997' on the forwarder and indexer machines shows connections, and the 'Forwarder: Deployment' screen shows the non-working forwarders, but the 'Forwarder Management' screen does not show them. The 9.1.2 and 8.2.2.1 forwarders (yeah, old, but there are reasons) still work fine forwarding to the 9.1.3 indexer. Hoping 9.2.0.1 fixes this, or I must roll back.
Hello everyone, I was wondering if there are any ways to back up / version control dashboards that were created directly on Splunk Cloud, either to a local git repository or at least so that they can be recovered/rolled back if a user/administrator edits or deletes one. So far I found this app: https://splunkbase.splunk.com/app/5061, but I think this app is more like a dashboard itself and doesn't really provide any of the use cases I just described. BR, Andreas
You are correct. Thanks for the solution: the names must be in quotes AND they are case sensitive.
The second rex command probably needs additional escaping, but since the first works for you we'll leave it at that.
Absolutely perfect, thank you!