Splunk Search

How to group by month if we have data from the last 18 months?

rajhemant26
New Member

Hello everyone.

I want to display the output only for a time range that spans 18 months (earliest time).
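In SPL, an 18-month window can be expressed directly with relative time modifiers in the search itself; the index name below is just a placeholder:

index=my_index earliest=-18mon@mon latest=now
| timechart span=1mon count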


msivill_splunk
Splunk Employee

Looking at this some more, I think the crux of the problem is grouping by month. As a starting point I've put together some SPL that shows how to obtain the month from a timestamp and then count by month.

The value generated in _time will be a random time in the year 2018, as 1514764800 is the epoch time in seconds for the beginning of 2018.

| makeresults count=100
| eval seconds_into_year = random() % ( 365 * 24 * 60 * 60 ) 
| eval epoch_start_of_2018 = 1514764800
| eval _time = epoch_start_of_2018 + seconds_into_year
| eval month_number = strftime(_time,"%m") 
| eval month_name = strftime(_time,"%b") 
| stats count by month_number, month_name
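Since 18 months spans more than one calendar year, the same month number will appear in two different years. One way to keep them separate, sketched along the same lines as above (548 days is an approximation of 18 months), is to group by a year-month string instead:

| makeresults count=100
| eval _time = 1514764800 + ( random() % ( 548 * 24 * 60 * 60 ) )
| eval year_month = strftime(_time,"%Y-%m")
| stats count by year_month
| sort year_month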

I'm hoping this tip gives you enough to solve the question.


msivill_splunk
Splunk Employee

Any chance of sharing the raw data, wrapping it in a makeresults SPL query, and/or simplifying it? That makes it easier for people to pick the question up and try different things with it.

The above query pulls back the last 4 hours' worth of data but seems to return data from earlier in the year. Is there another time field in the data that would account for this, i.e. are the events not using the default _time field?
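If the real timestamp does live in a separate field, one option is to re-derive _time from it before any time-based grouping; the field name event_time and its format string are assumptions here and would need to match the actual data:

| eval _time = strptime(event_time, "%Y-%m-%d %H:%M:%S")
| eval month_name = strftime(_time,"%b")
| stats count by month_name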
