Splunk Search

Backfilling Summary Index: not writing to my summary index

samsplunkd
Path Finder

Hi,

I have a scheduled search, say "foo", that summary indexes into an index named "bar". As a scheduled search it writes its results to index bar fine, but when I try to backfill the data using the backfill Python script, no results show up in that index. Any idea what may be going on? Any help is much appreciated.

Below is what I am using:
/splunk cmd python fill_summary_index.py -app -name "foo" -et 1355990400 -lt now -index bar -showprogress true -dedup true -owner admin -auth admin:
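As a sanity check on the epoch value passed to -et, it can be converted to a human-readable time with GNU date (a quick sketch; on macOS/BSD, `date -r 1355990400` would be used instead of `-d @`):

```shell
# Convert the -et epoch value to human-readable UTC (GNU date):
date -u -d @1355990400 '+%Y-%m-%d %H:%M:%S'
# → 2012-12-20 08:00:00  (i.e. midnight Dec 20 2012 in PST, matching the -0800 log offset)

# Going the other way: produce an epoch value for a given UTC start time.
date -u -d '2012-12-20 08:00:00' +%s
# → 1355990400
```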

Here is the scheduler log entry, in case it helps. I am seeing status=continued rather than success, and I am not sure what a status of continued means.
12-25-2012 19:50:43.819 -0800 INFO SavedSplunker - savedsearch_id=";;foo", user="", app="", savedsearch_name="foo", status=continued, scheduled_time=1356493800

Thanks


yannK
Splunk Employee

Take a look in the spooler folder, $SPLUNK_HOME/var/spool/splunk; the summary files look like *.stash_new.

There may be an issue there: if the files do not clear, it usually means they are not getting indexed.
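A quick way to watch for stuck summary files is to count the *.stash_new files in the spool directory (a minimal sketch; /opt/splunk is an assumed default install path, adjust SPLUNK_HOME to yours):

```shell
# Count summary stash files waiting in the spool directory.
# If this number stays non-zero over several minutes, the files
# are sitting there without being indexed.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
ls "$SPLUNK_HOME/var/spool/splunk"/*.stash_new 2>/dev/null | wc -l
```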



yannK
Splunk Employee

This sounds like this issue:
http://splunk-base.splunk.com/answers/70072/summary-indexing-blocked-and-binary-file-warning

It happens when too many summary files are generated at the same time.
Please clean the spooler and the indexed results, then run a new backfill with a concurrency of 1 job (to avoid having too many files created at the same time).
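The rerun with concurrency limited to one search could look like the following (a sketch only: `search` as the app name and `admin:changeme` as the credentials are placeholders, and -j sets the maximum number of concurrent searches the script dispatches):

```shell
# Rerun the backfill with at most one concurrent search (-j 1)
# so the spool directory is not flooded with stash files at once.
# "search" and "admin:changeme" below are placeholder values.
cd "$SPLUNK_HOME/bin"
./splunk cmd python fill_summary_index.py -app search -name "foo" \
    -et 1355990400 -lt now -index bar -dedup true -j 1 \
    -owner admin -auth admin:changeme
```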


samsplunkd
Path Finder

I see some binary files in the spool folder that reference my results.
Also, results from scheduled searches are getting indexed properly; it only fails when I try to backfill. Do I need to do anything to fix the files in the spool folder?
