All Posts


@ITWhisperer I think this is the most suitable answer to my question, as you posted earlier: "it looks like the reports that were run on Feb 29th were done manually / ad hoc to back-fill the summary index for the earlier weeks before the schedule was set up and running correctly."
1. Run curl with -v to see its operation verbosely. Most probably you're trying to read cryptographic material from a directory you don't have access to.
2. To use client certificates from Python, you can do it as described here: https://requests.readthedocs.io/en/latest/user/advanced/#client-side-certificates
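To make point 2 concrete, here is a minimal sketch of the curl call in Python using only the standard library (the requests approach linked above, `cert=(crt, key)`, is equivalent). The URL, token, and file paths are placeholders, not values from this thread:

```python
import ssl
import urllib.request


def build_request(url: str, token: str) -> urllib.request.Request:
    """Build a GET request carrying the Authorization header, like curl --header."""
    return urllib.request.Request(
        url, headers={"Authorization": token}, method="GET"
    )


def make_client_context(certfile: str, keyfile: str) -> ssl.SSLContext:
    """TLS context that presents a client certificate, like curl --cert/--key.

    This is where a permissions problem would surface: the process must be
    able to read both files.
    """
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx


def call_api(url: str, token: str, certfile: str, keyfile: str) -> str:
    req = build_request(url, token)
    ctx = make_client_context(certfile, keyfile)
    with urllib.request.urlopen(req, context=ctx) as resp:
        return resp.read().decode()
```

Running `call_api(...)` as the Splunk service account (not via sudo) reproduces the same read-permission failure curl hits, which makes it easier to pin down which file the process cannot open.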
Thanks, that made me dig in the right place, leading to https://splunk.my.site.com/customer/s/article/User-is-getting-an-error-message-when

Essentially, it was found that all the lookups present in the app "Splunk_Security_Essentials" are added to the denylist by default. The resolution to the error is to add local=true at the end of the SPL command, as below:

... | lookup isWindowsSystemFile_lookup filename local=true

The indexers need a read-only copy of the knowledge bundle in order to run searches, and Splunk Security Essentials brings a significant amount of data that does not need to be copied to the search peers. Adding "local=true" forces the lookup to run on the search head and not on any remote peer. That's OK for my purposes, I think.
Are you sure you wanted the old value of get stored as old_put? Also, you can express your condition as a | where command to keep only the matching results. Then you'd trigger the alert only if there were any results at all.
These are consistent with the info_search_time graphic you shared earlier - is that what you are asking?
I am trying to call a third-party API which supports certificate- and key-based authentication. I have an on-prem instance of Splunk (version 9.0.2) running on a VM. I have verified the API response on the VM via curl (command used: curl --cert <"path to .crt file"> --key <"path to .key file"> --header "Authorization: <token>" --request GET <"url">), which returns a response for a normal user. However, when running the same curl command using shell in Splunk Add-on Builder's Modular Data Inputs, the command only works with "sudo"; otherwise it gives Error 403. When checked with "whoami", the user is reported as root.

Question 1: Why does the curl command not work without sudo, even when the user is root? Is there any configuration I need to modify to make it work without sudo?

Question 2: How do I make the same API call using Python code in the Modular Data Inputs of Splunk Add-on Builder?
Let me know if anything else is needed
Hi guys, I don't know if you have already done this, but could you please help? I'm trying to create a new, simple datepicker where you just choose a date, click a "Submit" button, and it shows you the results. I already created the datepicker, but it doesn't do anything. I tried to follow a similar example here, but it isn't the same.
@ITWhisperer Manual runs of the search that collect into the summary index create those stash files; they are unrelated to the occurrence of duplicate events. The allocation across all sources is equal (25%), as you can see below. Is that correct?
The stash files are usually created by the collect command. Depending on your retention settings, you may be able to find out who ran the report from your _audit index.
Finally, the key piece of information! You are expecting this to be an Excel date value.

| makeresults | eval date=45123 | eval _time=(date-25567-2)*24*60*60

Excel counts dates in days from 1900-01-01, whereas Splunk uses Unix-style times counted in seconds since 1970-01-01, so you need to subtract the number of days between these two baseline dates and multiply by the number of seconds in a day. Note that Excel may not be calculating the date correctly, since it indexes the first day as 1 (instead of 0) and incorrectly assumes that 1900 was a leap year (it wasn't), hence the extra -2 days in the calculation. Having said that, you will have to decide whether the _time value returned is correct based on the source of your data, i.e. it could be a couple of days out.
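The same arithmetic can be checked outside Splunk. A small Python sketch of the eval above, using the serial number 45123 from this thread:

```python
from datetime import datetime, timezone

# 25567 = whole days between 1900-01-01 and 1970-01-01.
# The extra 2 covers Excel's 1-based first day plus its phantom 1900-02-29.
DAYS_1900_TO_1970 = 25567
EXCEL_OFFSET = 2
SECONDS_PER_DAY = 24 * 60 * 60


def excel_serial_to_unix(serial: float) -> float:
    """Convert an Excel date serial to Unix epoch seconds (UTC midnight)."""
    return (serial - DAYS_1900_TO_1970 - EXCEL_OFFSET) * SECONDS_PER_DAY


ts = excel_serial_to_unix(45123)
print(datetime.fromtimestamp(ts, tz=timezone.utc).date())  # 2023-07-16
```

This matches the 07/16/23 value Excel displays for 45123, confirming the (date-25567-2)*24*60*60 eval.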
@ITWhisperer What caused the creation of these "D:\Splunk\var\spool\splunk\99ec742c0c976c35_events.stash_new" files? Instead of spool files, shouldn't that be the name of the report? Do stash spool files get created when a saved search is run ad hoc or for backfill, when there are no spool files being created by scheduled runs?
@ITWhisperer Is it possible to check, from the _audit index, who ran the ad hoc backfill search of the summary index?
I tried this query but it's showing something like this. But when I checked this number, 45123, in Excel, it shows as 07/16/23. @ITWhisperer
It is not clear whether there is an issue - to me it looks like the reports that were run on Feb 29th were done manually / ad hoc to back-fill the summary index for the earlier weeks before the schedule was set up and running correctly.
@ITWhisperer Both of the saved searches are running at the same time. In your view, is this causing the issue?
| stats count by index | stats count
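The two-stage SPL above first produces one row per index, then counts those rows, which yields the number of distinct indexes. The same logic sketched in Python over a toy event list (the index names here are illustrative, not from real data):

```python
from collections import Counter

# Toy events, each tagged with the index it came from.
events = [
    {"index": "ep_winevt_ms_app"},
    {"index": "ep_winevt_ms_sec"},
    {"index": "ep_winevt_ms_app"},
]

# "| stats count by index" -> one row per index with its event count.
per_index = Counter(e["index"] for e in events)

# "| stats count" -> number of rows, i.e. number of distinct indexes.
unique_indexes = len(per_index)
print(unique_indexes)  # 2
```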
ep_winevt_ms* - this index pattern is mapped in the Data Model macros. I want to exclude all indexes other than (ep_winevt_ms*) and take the count per index as 1, to know the unique indexes. @ITWhisperer
If I give you a conversion, how will you know whether it is correct or not?
I'm not sure. Give me an example so that I can try it. @ITWhisperer