Splunk Search

How to Search/Visualization Time Chart - Days/Months Filter

hj9b7Cn
Engager

Hello everyone,

I'm pretty new to Splunk and mostly learning as I go, so please bear with me if this is a common question or an easy answer as I'm still figuring out a lot of things 🙂

I'm building a specific search string that splits a single field into its 5 unique values, counts each of them, and maps that data to build a trending chart. Our data is pulled in on a daily basis. My search query works so far (although it's probably not optimized), and I'm now moving on to the formatting stage.

What I want is for my chart to work off our main dashboard's time picker, so that we can see the trending of our data by day, month, year, etc. My query is working, but the chart plots the data day by day no matter what filter is set. This is fine on a daily or weekly filter, but when I want to view larger sets of data such as monthly or yearly, it comes out a bit messy.

Is it possible to tweak the search string so that when the data is viewed with a monthly filter, it will take the values from the month and put the highest amount on the chart instead of every day of the month?

If not, I think the other solution may just be to make a separate chart for a monthly view. That's fine too, but just thought I would ask!

Thank you in advance. A screenshot is below showing what I see when changing to a "monthly" view, along with a snippet of the search string.

| stats count(eval(severity=="Low")) AS Low by _time
| chart values(Low) over _time

[Screenshot: hj9b7Cn_1-1645069023479.png]


ITWhisperer
SplunkTrust

You could bin _time into the appropriate size buckets depending on how wide your timeframe is.

Use addinfo to retrieve the start and end times being used for the search, and work out how many days the timeframe spans. Then bin _time into different-sized buckets (daily, weekly, monthly). Finally, reset _time to the appropriate bucket value depending on the timeframe. Here I have used 14 and 28 days as breakpoints, but you can use whatever values you like. Insert this code before your stats command.

| addinfo
| eval _timeframe=info_max_time-info_min_time
| eval _days=floor(_timeframe/(60*60*24))
| bin _time as _daily span=1d
| bin _time as _weekly span=1w
| bin _time as _monthly span=1mon
| eval _time=case(_days<14,_daily,_days<28,_weekly,1==1,_monthly)
| fields - info_* _timeframe _days _daily _weekly _monthly
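
Putting that together with the snippet from the question (a sketch, assuming your events carry the severity field shown above), the full search would look something like:

```spl
| addinfo
| eval _timeframe=info_max_time-info_min_time
| eval _days=floor(_timeframe/(60*60*24))
| bin _time as _daily span=1d
| bin _time as _weekly span=1w
| bin _time as _monthly span=1mon
| eval _time=case(_days<14,_daily,_days<28,_weekly,1==1,_monthly)
| fields - info_* _timeframe _days _daily _weekly _monthly
| stats count(eval(severity=="Low")) AS Low by _time
| chart values(Low) over _time
```

Note that with wider buckets, stats will give the total count per week or month. If you instead want the highest daily count within each month (as the question describes), you could bin daily and run the stats first, then bin to the wider bucket and take max(Low) by _time.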