I'm trying to debug issues with a scheduled search that writes to the summary index and the backfill script. My assumption was that the following happens in sequence:
1) Scheduled search runs (the search is designed to populate a summary index, summary indexing is enabled, etc.)
2) Files are added/modified in /var/lib/splunk/summarydb
3) A search of index="summary" will show those results
I'm finding that when 1 happens, 2 happens immediately, but 3...not so much.
What's going on? Is there some mysterious other process that puts delays between something getting written to the summary index and something being available for search from the summary index?
I had the same problem and found that if I restart the search head (SH), the index data becomes visible again.
Don't know why though or if it will happen again 😞
I also have this problem. What is the solution? Thank you.
More precisely, the steps are:
1) The scheduled search runs the collect command, either implicitly (via the "Enable summary indexing" checkbox) or explicitly in the search string.
2) The collect command (with default settings) takes the search output, transforms it, and writes it to $SPLUNK_HOME/var/spool/splunk as an intermediate file.
3) Splunk's batch monitor notices the new file in that spool directory and indexes its events into the summary index.
When you see the index files being modified, that is not done directly by the summary-indexing search job, only indirectly. How long a delay are you seeing? The longest delay would normally be the pause for the batch monitor to notice and index the new output file generated by the search.
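For illustration (a hedged sketch; the index name, base search, and fields are placeholders), an explicit populating search might end with collect:
index=main sourcetype=access_combined | stats count by status | collect index=summary
With collect's default spool=true, the intermediate file lands in $SPLUNK_HOME/var/spool/splunk, so one way to gauge the delay is to watch that directory and see how long files sit there before the batch monitor consumes them:
ls -l $SPLUNK_HOME/var/spool/splunk/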
Ah! So helpful! I was seeing a significant pause, often resolved by a Splunk restart. If I backfill the summary index using the backfill script, the data sometimes just doesn't show up until I restart. However, sometimes it does. It's zen that way. 🙂
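For context, the backfill invocation looks roughly like this (the app, saved-search name, time range, and credentials here are placeholders):
cd $SPLUNK_HOME/bin
./splunk cmd python fill_summary_index.py -app search -name "my_summary_search" -et -7d -lt now -j 4 -dedup true -auth admin:changeme
Since the script just re-runs the populating search over the gap, its output goes through the same spool-and-batch-monitor path described above, so the same indexing lag applies.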
I'm assuming you're doing this, but just to make sure... When you search against a summary index, the syntax should be:
index="summary" search_name="savedSearchName" | stats count ....
Everything after the first pipe must mirror your populating search, with the 'si' prefix dropped from the reporting command. So, if your populating search is:
...| sistats count by fieldName
your search against the index must be:
...| stats count by fieldName | more stuff...
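To make that concrete with a hedged example (the saved search name "errors_by_host" and the field are made up): if the scheduled populating search is
index=main log_level=ERROR | sistats count by host
then the search against the summary index would be
index="summary" search_name="errors_by_host" | stats count by host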
Yes. In fact, right now the summary index is totally clean so I'm just doing:
index="summary"
I've found that if I restart Splunk, the index data is visible again. I also see this error in the log:
11-29-2010 10:00:05.226 ERROR databasePartitionPolicy - unable to open file: /usr/local/splunk/var/lib/splunk/summarydb/db/.metaManifest (No such file or directory)
Thanks!
-S.