
What is the best way to re-run a summary index to collect missed events after an outage?

Glasses
Builder

Hi - 

Let's say you have a scheduled query / report that runs daily (at midnight) over a time range of Last 24 hours, and you summarize the results to index=summary_index_foo.

There was a "foo" data source outage for a couple of days, but you were able to backfill the data to index=foo.

What is the best way to re-run the query without creating a lot of duplicates? I am pretty sure that using "collect" will create duplicates.

But will scheduling a one-time clone of the report over the outage days and summarizing the results create duplicates if the time range overlaps the data before and after the outage?

In other words, the outage time frame did not fall exactly on a minute, hour, or day boundary. When you re-schedule and re-summarize the query, will that create duplicates if the same events already exist in the summary index for that time range?

Or will Splunk drop duplicates when writing to the summary index? I am guessing duplicates will still be created, but I need a sanity check.
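To see why the overlap matters, here is a minimal sketch in plain Python (not Splunk; the field names and counts are illustrative). A summary index does no deduplication on write, so an unconditional append, like "collect", duplicates whatever part of the re-run window was already summarized, while a writer that checks for existing (time, key) pairs, which is roughly what fill_summary_index.py's -dedup option does, does not:

```python
# Illustration only: a summary index accepts whatever is written to it,
# so re-running an overlapping window appends duplicate rows unless the
# writer skips rows that are already present.

def collect(summary, rows):
    """Append rows unconditionally -- analogous to | collect."""
    summary.extend(rows)

def collect_dedup(summary, rows):
    """Skip rows already present, keyed on (_time, user) --
    analogous in spirit to fill_summary_index.py -dedup true."""
    seen = {(r["_time"], r["user"]) for r in summary}
    for r in rows:
        key = (r["_time"], r["user"])
        if key not in seen:
            summary.append(r)
            seen.add(key)

# First run summarized hours 0-23; the re-run covers hours 20-30,
# overlapping the four hours already in the summary index.
existing = [{"_time": h, "user": "a", "count": 1} for h in range(24)]
rerun = [{"_time": h, "user": "a", "count": 1} for h in range(20, 31)]

plain = list(existing)
collect(plain, rerun)  # 24 + 11 rows: hours 20-23 are now duplicated

deduped = list(existing)
collect_dedup(deduped, rerun)  # 24 + 7 rows: only hours 24-30 are added

print(len(plain), len(deduped))  # 35 31
```

The takeaway is that the overlap itself is harmless only if the re-run path deduplicates; a plain re-run of the report with collect will not.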

Thank you

1 Solution

gbansode
Explorer

@Glasses Do you want to backfill the summary index? Here you go: https://docs.splunk.com/Documentation/SplunkCloud/latest/Knowledge/Managesummaryindexgapsandoverlaps...

Run the fill_summary_index.py script from $SPLUNK_HOME/bin. Example below:

./splunk cmd python fill_summary_index.py -app is_app_one \
    -name "summary - count by user" -et -30d -lt now -j 8 \
    -dedup true -auth admin:changeme



Glasses
Builder

TY for the reply, I will try it.
