Knowledge Management

summary index non-live traffic issue

fguillot
New Member

Hi,

I can't figure out how to configure summary indexing in my situation. Let me describe my setup:

I do not index "live" logs. Every day at 6 AM, I index compressed data that is one day old. For instance, on January 11th at 6 AM, I index data from January 9th 6 AM to January 10th 6 AM. This process cannot be changed, for several reasons.

I have a large amount of data, so searches over a long period (typically a month) take a long time to run. I therefore use summary indexing to speed up searches and the resulting dashboards.

My problem is that when I configure a summary-indexing search (running at midnight) to process the previous 24 hours, there are no logs yet. If I set the window to 48 hours, it processes only part of the logs. If I set it to 72 hours, it processes the new logs added during the morning (good), but also data that has already been summary indexed.
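For reference, my scheduled search is configured roughly like this in savedsearches.conf (the stanza name, index, sourcetype, and summary index name below are placeholders for my real ones):

  [summary - daily volume]
  enableSched = 1
  cron_schedule = 0 0 * * *
  dispatch.earliest_time = -72h
  dispatch.latest_time = now
  action.summary_index = 1
  action.summary_index._name = my_summary
  search = index=main sourcetype=my_sourcetype | sistats count by host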

Is this a problem? Can the process detect that the data has already been summary indexed, or will it summarize it again and skew my results?

Other suggestions are welcome 🙂

rgds,

/fabien


MickSheppard
Path Finder

You could use the backfill script (fill_summary_index.py) to fill in the missing summary data. It works out the time slices for which summary data already exists and only generates the missing summaries.

You would have to schedule this outside Splunk, but it would work.
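For example, something along these lines from a cron job (the app name, saved search name, credentials, and time range below are placeholders; adjust them to cover your gap):

  $SPLUNK_HOME/bin/splunk cmd python $SPLUNK_HOME/bin/fill_summary_index.py -app search -name "my summary search" -et -3d@d -lt now -j 4 -dedup true -auth admin:changeme

The -dedup true option is what tells the script to skip time slices that already have summary data, so re-running it should not double-count events.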
