Knowledge Management

Summary Index and Backfill - Doing reports on the original time

ge90115b
New Member

Hi,

I have a whole bunch of Bluecoat logs for which I will need to create summary indexes due to the log volume. Some of the common searches would be things like Top Blocked URLs or Top 10 IP Addresses.

So I have the backfill script running, and I noticed that _time is actually set to the time when the backfill script ran. As a result, I can't generate a report on June 2009, for example.
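For reference, backfill runs of this kind are typically driven with Splunk's fill_summary_index.py script. A sketch of an invocation (the app name, saved-search name, time range, and credentials here are placeholders, not taken from the thread):

```
splunk cmd python fill_summary_index.py -app search -name "bluecoat - top blocked urls" -et -30d@d -lt @d -j 4 -dedup true -auth admin:changeme
```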

Following the information from the earlier thread at http://answers.splunk.com/questions/3223/summary-index-event-date-and-sourcetype, I have now added "| last(_time) as original_time" to my summary-indexing search, so original_time is stored as one of the summary fields.

Everything looks good, and I have original_time as one of the fields. The question is: is there an easier way for me to generate a report by treating original_time as _time, so that I can use the TimeRangePicker, for example?

If not, I will have to resort to search conditions like original_time > 1234567890 AND original_time < 133333333, for example, to restrict the time range to June 2009 and so forth.
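One common way to avoid those raw epoch comparisons (a sketch; the summary index and search names below are assumptions) is to overwrite _time with the stored original time at search time, after which timechart and other time-based commands operate on it:

```
index=summary search_name="bluecoat - top blocked urls" earliest=0
| eval _time=original_time
| timechart span=1d count by clientip
```

Note that the TimeRangePicker still filters on the indexed _time before the eval runs, so a wide range (here earliest=0) is needed up front.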

Thanks for any suggestions!


gkanapathy
Splunk Employee

Sounds like some kind of misconfiguration. It would be helpful to see what your summary search looks like. I'm also assuming that the original logs look basically like standard web access logs, and that timestamps are correctly extracted in the first place, and that other fields are also correctly extracted.

gkanapathy
Splunk Employee

Usually you would add a "by _time" split-by clause (or add _time to the split-by fields if you already have one) to your stats or sistats query. However, that's not necessary if your job is scheduled to run over exactly the period that the query covers; in that case the time is added automatically, and it will be the end of the job's scheduled span. Basically, you shouldn't have needed to add that field.
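A sketch of what that split-by looks like in a summary-indexing search, assuming typical Bluecoat field names such as clientip (not confirmed by the thread):

```
sourcetype=bluecoat
| bin _time span=1h
| sistats count by _time, clientip
```

With the events bucketed by _time before sistats, each summarized row keeps the original event time rather than the time the backfill or scheduled search ran.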

ge90115b
New Member

Yes, you are right. Once I added "| last(_time) as original_time" to my search query, the timestamp was correct, based on the time returned by last(_time). I probably overlooked it while I was dealing with current Bluecoat events, but the problem became obvious once I pumped in old logs and could see that the time was being indexed correctly.

Thanks for your help!
