Splunk Search

Searching different sources or indexes depending on the timerange of the search

steveyz
Splunk Employee

This problem generally comes up when you want to build a dashboard with a time range picker and populate it from summary data, and that summary data is available at multiple granularities. For example, you may have 5-minute summaries and daily summaries. If the user selects a time range of a month, you'd want to use the daily summaries, but if they select only a few hours or a day, you'd want to use the 5-minute summaries. There isn't a straightforward way of switching between them.
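For context, summaries like these are usually produced by scheduled searches that write their results into a summary index with collect. The searches below are only a rough sketch of that setup: the base search (index=web) and the spans are assumptions, while the source names match the ones used in the answer below.

index=web | bin _time span=5m | stats count by _time, host | collect index=summary source=5m_summaries

index=web | bin _time span=1d | stats count by _time, host | collect index=summary source=daily_summaries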

1 Solution

steveyz
Splunk Employee

The easiest way of doing this involves using addinfo in a subsearch. Subsearches inherit the time range of the outer search, and the addinfo command adds fields (info_min_time and info_max_time) containing the epoch-time bounds of that range. The subsearch can then expand to a different search expression depending on how large the difference between the latest and earliest bounds is.

Example:

index=summary [ stats count | addinfo | eval range=info_max_time - info_min_time | eval search=if(range>((86400*7)-3600),"source=daily_summaries","source=5m_summaries") ] | timechart count by host
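To read the condition: range is the length of the selected time window in seconds, so range>((86400*7)-3600) means roughly a week or more (the 3600-second allowance presumably keeps a window that snaps to just under a week from falling back to the 5-minute summaries). In that case the subsearch returns source=daily_summaries, and because the field is named search, its value is spliced into the outer search, which effectively runs as:

index=summary source=daily_summaries | timechart count by host

For shorter windows it runs against source=5m_summaries instead. Adjust the threshold and source names to whatever matches your own summary searches.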


steveyz
Splunk Employee

Note that the stats count in the subsearch just serves to create a single dummy row so that addinfo has something to populate with the info_* fields.
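If you want to see those fields for yourself, you can run the body of the subsearch on its own over the time range in question; makeresults is a modern equivalent of the stats count trick for generating that single dummy row. This is only an inspection sketch; info_min_time and info_max_time come back as epoch seconds.

| makeresults | addinfo | eval range=info_max_time - info_min_time | table info_min_time info_max_time range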
