Knowledge Management

Question about fixing data in a summary search

burwell
SplunkTrust

We have a summary search that runs every hour, and I have read about the fill_summary_index.py script.

What I want to know is how to fix data from several weeks ago. Things changed a few weeks back and I had to adjust the summary search. Going forward things are good, but how can I run fill_summary_index.py and have it replace the data that is already there?

My understanding is that with the dedup setting it would either skip those time spans or add to what I already have. How do I replace the data for a time period, or can I?

Thanks.

1 Solution

somesoni2
Revered Legend

I generally delete the corrupted/incomplete data from summary index and then backfill.
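As a sketch of what that looks like (the index name, saved-search name, time window, and credentials below are hypothetical placeholders, not from this thread):

```
# 1. Make the bad summary rows unsearchable with the delete command
#    (run as a user with the can_delete role):
index=summary_hourly source="My Hourly Summary" earliest=-21d@d latest=-14d@d
| delete

# 2. Backfill the same window using the script shipped in $SPLUNK_HOME/bin:
splunk cmd python fill_summary_index.py -app search -name "My Hourly Summary" \
    -et -21d@d -lt -14d@d -dedup true -auth admin:changeme
```

Since the old rows are gone before the backfill runs, the dedup option no longer has anything to skip over for that window.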


burwell
SplunkTrust

How do you delete the corrupt/incomplete data?


somesoni2
Revered Legend

We run the delete command, which technically just makes the data unsearchable rather than removing it from disk. Run a search with the appropriate data source (index/sourcetype/source/host) and time range, check that the search returns exactly the data you want to delete, and then append "| delete" to the end to make those events unsearchable. This link has more details on the delete command.
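For example (the index and source names here are hypothetical), first run and inspect:

```
index=summary_hourly source="My Hourly Summary" earliest=-21d@d latest=-14d@d
```

and once you are sure it returns only the bad events, re-run it with the delete command appended:

```
index=summary_hourly source="My Hourly Summary" earliest=-21d@d latest=-14d@d
| delete
```

Note that running delete requires the delete_by_keyword capability, normally granted through the can_delete role; even the admin user does not have it by default.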

http://docs.splunk.com/Documentation/Splunk/4.3.2/Admin/RemovedatafromSplunk#Delete_data_from_future...


burwell
SplunkTrust

Thanks. If you post this as an answer, I will mark the question as answered.


cmerriman
Super Champion

We do the same.
