Splunk Search

dashboards with similar data, but different timeframes

a212830
Champion

Hi,

I have a customer who created a dashboard with 28 unique searches (using Splunk 6.1.1). It's some cool stuff, but that's nuts. A lot of the searches are simply the same search over different timeframes - one day in one chart and one month in another. Any suggestions on how to avoid running 28 unique searches?


somesoni2
Revered Legend

The easiest option would be to create a saved search for each unique search, then call that saved search from the dashboard with different time ranges.
e.g.

<dashboard>
  <label>Splunk Home</label>
  <description/>
  <row>
    <single>
      <title>Last 15 Minute</title>
      <searchString>|savedsearch splunk_internal_count</searchString>
      <earliestTime>-15m</earliestTime>
      <latestTime>now</latestTime>
    </single>
  </row>
  <row>
    <single>
      <title>Last 30 Minutes</title>
      <searchString>|savedsearch splunk_internal_count</searchString>
      <earliestTime>-30m@m</earliestTime>
      <latestTime>now</latestTime>
    </single>
  </row>
</dashboard>

martin_mueller
SplunkTrust

As an alternative or addition to post-processing, you could accelerate, summarize, or schedule the historic searches, like the thirty-day one, since those re-run the same work over and over without the underlying data changing.

Take scheduling as an example: you could have the search run over -30d@d to @d every night at 1 AM, and everyone loading the dashboard that day would get the canned results from that 1 AM run.
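For illustration, here's a rough savedsearches.conf sketch of such a nightly scheduled search - the stanza name, index, and search string are made up, so adjust them to your own data:

# savedsearches.conf (illustrative names)
[thirty_day_summary]
search = index=_internal | bin _time span=1d | stats count by _time
dispatch.earliest_time = -30d@d
dispatch.latest_time = @d
enableSched = 1
cron_schedule = 0 1 * * *

A dashboard panel can then reuse the cached results of the most recent scheduled run with something like | loadjob savedsearch="owner:app:thirty_day_summary" (owner and app are placeholders), instead of re-running the 30-day search for every viewer.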


somesoni2
Revered Legend

That would be the best solution for this kind of situation. It may require structuring the base search and post-process searches according to your requirements. Some reading is available here:

http://docs.splunk.com/Documentation/Splunk/6.1.1/Viz/PanelreferenceforSimplifiedXML
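For illustration, a minimal post-process sketch in the same legacy Simple XML style as the example above - the index and panel titles are placeholders, and the exact element names should be checked against the Panel reference linked above:

<dashboard>
  <label>Post-process example</label>
  <!-- Base search runs once over 30 days; both panels post-process its results -->
  <searchTemplate>index=_internal earliest=-30d@d latest=now | bin _time span=1d | stats count by _time</searchTemplate>
  <row>
    <chart>
      <title>Events per day - last 30 days</title>
      <!-- Charts the base results as-is -->
      <searchPostProcess>timechart span=1d sum(count) as count</searchPostProcess>
    </chart>
    <single>
      <title>Events today</title>
      <!-- Filters the same base results down to today -->
      <searchPostProcess>where _time &gt;= relative_time(now(), "@d") | stats sum(count) as count</searchPostProcess>
    </single>
  </row>
</dashboard>

Keeping the base search transforming (stats, timechart, etc.) matters here, since post-process searches on a non-transforming base are subject to event-count limits.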


a212830
Champion

Thanks. That still seems like a lot of unnecessary overlap. I recall the term "post-processor" for dashboards. Would that be appropriate? If so, can someone point me to links on how to do it?
