Splunk Enterprise

Multiple stash files from summary index

uagraw01
Motivator

Hello Splunkers!!

Every week, my report runs and writes its results to the summary index=analyst. As you can see in the screenshot below, several stash files are being created for this particular report, whereas other reports do not create multiple stash files.

 

Report with multiple stash files.

[screenshot: uagraw01_2-1711950956151.png]

Report with no duplicate stash files.

[screenshot: uagraw01_3-1711951009773.png]

Please provide some assistance with this.


uagraw01
Motivator

I have investigated further and found that the info_search_time for all the stash files is the same. Is there any significance behind this?

[screenshot: uagraw01_1-1711954696809.png]
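For reference, this is roughly how the values can be compared per source (just a sketch - info_search_time is the epoch field that the summary-indexing/collect process attaches to each event):

index=analyst
| convert ctime(info_search_time) AS search_time
| stats count BY source, search_time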

 


ITWhisperer
SplunkTrust

What is the issue?

Stash files are used by Splunk to serialise the events so that they can be indexed.

The source can be overridden in the collect command.
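For example, something along these lines (just a sketch - the source value is a placeholder, not one of your actual report names):

<your base search>
| collect index=analyst source="My Weekly Report"

If I remember the defaults correctly, when source is not specified an ad hoc collect uses the name of the spool file it generates (hence sources like <hash>_events.stash_new), while a scheduled summary-index action uses the saved search name.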

These are from two different reports - if you are interested, you should look at the settings for those reports to see the differences in how they are sent to the summary index.
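For instance, a report that summary-indexes on a schedule typically has savedsearches.conf settings along these lines (a sketch - the stanza name and schedule below are made up):

[My Weekly Report]
enableSched = 1
cron_schedule = 0 6 * * 1
action.summary_index = 1
action.summary_index._name = analyst

Comparing these settings (and whether the search string itself ends in a collect command) between the two reports should show why their summary events end up with different sources.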

As for the times: where they are almost identical, the runs were most likely driven by a cron-style schedule, probably on a Unix-based system, whereas the sources with more varied times look like they come from a Windows-based system, which does not usually have cron.


uagraw01
Motivator

@ITWhisperer 

Both saved searches are running at the same time. In your view, is this causing the issue?

[screenshot: uagraw01_0-1711957923786.png]

[screenshot: uagraw01_1-1711957962806.png]
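For what it's worth, the actual run times can also be checked against the scheduler logs with something like this (a sketch - "Report A" and "Report B" are placeholders for the two saved search names):

index=_internal sourcetype=scheduler status=success (savedsearch_name="Report A" OR savedsearch_name="Report B")
| convert ctime(scheduled_time) AS scheduled
| table _time savedsearch_name scheduled run_time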

 


ITWhisperer
SplunkTrust

It is not clear whether there is an issue - to me it looks like the reports that were run on Feb 29th were done manually / ad hoc to back-fill the summary index for the earlier weeks before the schedule was set up and running correctly.

uagraw01
Motivator

@ITWhisperer Is it possible to check from the _audit index who ran the ad hoc search to back-fill the summary index?


uagraw01
Motivator

@ITWhisperer

What caused the creation of these "D:\Splunk\var\spool\splunk\99ec742c0c976c35_events.stash_new" files? Shouldn't the source be the name of the report rather than these spool files?

Are stash/spool files like this created when a saved search is run ad hoc or for back-fill, while scheduled runs create no spool files?


ITWhisperer
SplunkTrust

The stash files are usually created by the collect command.

Depending on your retention settings, you may be able to find out who ran the report from your _audit index.
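Something like this could be a starting point (just a sketch - replace the quoted report name and narrow the time range as appropriate):

index=_audit action=search info=granted "your report name"
| table _time user savedsearch_name search

If I remember correctly, savedsearch_name is populated for scheduled runs and empty for ad hoc runs, which helps separate the two.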


uagraw01
Motivator

@ITWhisperer

So manual runs of the search that collect into the summary index create those stash files, and this is unrelated to duplicate events. The allocation across all sources is equal (25% each), as you can see below. Is that correct?

[screenshot: uagraw01_0-1711961995247.png]
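For reference, the split shown in the panel above can be reproduced with something like this (just a sketch):

index=analyst
| stats count BY source
| eventstats sum(count) AS total
| eval percent=round(100*count/total, 1)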

 


ITWhisperer
SplunkTrust

These are consistent with the info_search_time graphic you shared earlier - is that what you are asking?


uagraw01
Motivator

@ITWhisperer I think the answer you posted earlier is the most suitable one for my question:

"it looks like the reports that were run on Feb 29th were done manually / ad hoc to back-fill the summary index for the earlier weeks before the schedule was set up and running correctly."

 

 
