
Backfilling Summary Index: Not writing to my summary index

samsplunkd
Path Finder

Hi,

I have a search, call it "foo", that is scheduled to summary index into an index named "bar". As a scheduled search it writes its results to index bar just fine, but when I try to backfill the data using the backfill Python script, it doesn't populate results in that index. Any idea what may be going on? Any help is much appreciated.
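(For what it's worth, one quick way to confirm that the saved search really has summary indexing enabled and is pointed at index bar is to dump its effective settings with btool. This is only a sketch; it assumes the standard action.summary_index settings and a default $SPLUNK_HOME.)

  # Sketch: show the effective summary-indexing settings for saved search "foo"
  $SPLUNK_HOME/bin/splunk btool savedsearches list "foo" --debug | grep "action.summary_index"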

Below is what I am using:
/splunk cmd python fill_summary_index.py -app -name "foo" -et 1355990400 -lt now -index bar -showprogress true -dedup true -owner admin -auth admin:
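(For comparison, the documented form of the backfill command looks roughly like the sketch below. Note that -app appears above without a value, which may just be how it was pasted; the app name and credentials in the sketch are placeholders, not values from this post, and flag support can vary by Splunk version.)

  # Sketch only: the app name "search" and the credentials are placeholders.
  cd $SPLUNK_HOME/bin
  ./splunk cmd python fill_summary_index.py -app search -name "foo" -et 1355990400 -lt now -dedup true -owner admin -auth admin:<password>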

Here is a scheduler log entry, in case it helps. I am seeing status=continued rather than success, and I'm not sure what a status of continued means.
12-25-2012 19:50:43.819 -0800 INFO SavedSplunker - savedsearch_id=";;foo", user="", app="", savedsearch_name="foo", status=continued, scheduled_time=1356493800

Thanks

1 Solution

yannK
Splunk Employee

Take a look in the spool folder, $SPLUNK_HOME/var/spool/splunk; the summary files look like *.stash_new.

There may be an issue there: if the files do not clear, it usually means they are not getting indexed.
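A minimal way to check, assuming a default install path, is to list any pending stash files and watch whether they drain after a scheduled run:

  # Sketch: list pending summary stash files; on a healthy system this
  # directory should empty out shortly after each run.
  ls -l $SPLUNK_HOME/var/spool/splunk/*.stash_new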

yannK
Splunk Employee

This sounds like this issue:
http://splunk-base.splunk.com/answers/70072/summary-indexing-blocked-and-binary-file-warning

It happens when too many summary files are generated at the same time.
Please clean out the spooler and the already-indexed results, then run a new backfill with the number of concurrent jobs set to 1, to avoid having too many files created at the same time.
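As a sketch (the app name and credentials are placeholders, and flag support can vary by version), limiting the backfill to a single concurrent job would look something like:

  # Sketch: -j 1 runs one backfill search at a time, so only one stash
  # file is produced at once; app name and credentials are placeholders.
  ./splunk cmd python fill_summary_index.py -app search -name "foo" -et 1355990400 -lt now -j 1 -dedup true -auth admin:<password>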

samsplunkd
Path Finder

I see some binary files in the spool folder that reference my results.
The other thing is that results from the scheduled searches are getting indexed properly; it only fails when I try to backfill. Do I need to do anything to fix the files in the spool folder?
