Splunk Search

How can I optimize the performance of a search that reports how much data was indexed for arbitrary fields?

knielsen
Contributor

Hello,

I know it's easy and straightforward to get ingestion metrics (how much data was ingested) by sourcetype or index by searching index=_internal source=*metrics.log.
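For example, something along these lines (a rough sketch; per_sourcetype_thruput is the metrics.log group for per-sourcetype volume, with the sourcetype name in the series field):

index=_internal source=*metrics.log group=per_sourcetype_thruput | stats sum(kb) AS kb_indexed BY series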

Unfortunately, we have a number of different services that log to the same indexes and sourcetypes, and now we want to calculate their ingestion based on a specific field, let's call it service. So something like this would do the trick:

index=foo earliest=-1d@d latest=@d | eval bytes=len(_raw) | stats sum(bytes) by service

This is very slow, though (we ingest more than 1 TB per day). Is there a more elegant and faster way to achieve this?

Regards,
Kai.


woodcock
Esteemed Legend

You could set this up as an hourly summary index.
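For instance, a minimal sketch of a populating search scheduled to run hourly over the previous hour (the destination index name ingest_summary is an assumption here; the summary index has to exist before collect can write to it):

index=foo earliest=-1h@h latest=@h | eval bytes=len(_raw) | stats sum(bytes) AS bytes BY service | collect index=ingest_summary

Your daily report then runs against the small summary instead of the raw 1 TB/day of events:

index=ingest_summary earliest=-1d@d latest=@d | stats sum(bytes) BY service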


knielsen
Contributor

Well, thanks. If summary indexing is the only or best solution, then summary indexing it will be. 🙂

I'll go with a 5-minute summary as the lowest-granularity source and may build a 1-day summary on top of that. That is fast enough that I can also do the stats sum by some additional fields that may help us with future analysis.
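Roughly like this, as a sketch (the index names ingest_summary_5m and ingest_summary_1d are hypothetical, and host stands in for whatever additional fields we end up keeping):

Every 5 minutes: index=foo earliest=-5m@m latest=@m | eval bytes=len(_raw) | stats sum(bytes) AS bytes BY service host | collect index=ingest_summary_5m

Daily rollup: index=ingest_summary_5m earliest=-1d@d latest=@d | stats sum(bytes) AS bytes BY service | collect index=ingest_summary_1d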
