Splunk Search

How to show data from different regions as today's?

Wheresmydata
Explorer

Hi Splunkers,

I have different queries that get the age of a ticket, counting only business hours. I need separate queries because business hours are not the same for different regions. For example, this is the one for the Americas:

| eval start_time_epoch = strptime(reported_time,"%b %d %Y %H:%M:%S") 
| eval start_time_second = strftime(start_time_epoch,"%S")
| eval start_time_epoch_rounded = start_time_epoch - start_time_second
| fields - start_time_epoch, start_time_second

| eval close_time_epoch = strptime(processed_time,"%b %d %Y %H:%M:%S") 
| eval close_time_second = strftime(close_time_epoch,"%S")
| eval close_time_epoch_rounded = close_time_epoch - close_time_second
| fields - close_time_epoch, close_time_second

| eval minute = mvrange(0, (close_time_epoch_rounded - start_time_epoch_rounded), 60)
| mvexpand minute

| eval _time = start_time_epoch_rounded + minute
| eval myHour = strftime(_time,"%H")
| eval myMinute = strftime(_time,"%M")
| eval myDay = strftime(_time,"%A")

| where myDay != "Saturday" AND myDay != "Sunday" AND (myHour >= 13 OR myHour < 1)
| stats count as durationInMinutes by reported_time, processed_time
| eval duration = tostring(durationInMinutes*60, "duration")
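
If you want to try this without real tickets, here is a minimal standalone version that runs one made-up ticket through the same minute expansion with makeresults (the timestamps are just sample values):

| makeresults
| eval reported_time="Oct 06 2025 14:30:27", processed_time="Oct 06 2025 16:05:02"
| eval start_time_epoch_rounded = strptime(reported_time,"%b %d %Y %H:%M:%S")
| eval start_time_epoch_rounded = start_time_epoch_rounded - strftime(start_time_epoch_rounded,"%S")
| eval close_time_epoch_rounded = strptime(processed_time,"%b %d %Y %H:%M:%S")
| eval close_time_epoch_rounded = close_time_epoch_rounded - strftime(close_time_epoch_rounded,"%S")
| eval minute = mvrange(0, (close_time_epoch_rounded - start_time_epoch_rounded), 60)
| mvexpand minute
| eval _time = start_time_epoch_rounded + minute
| eval myHour = strftime(_time,"%H"), myDay = strftime(_time,"%A")
| where myDay != "Saturday" AND myDay != "Sunday" AND (myHour >= 13 OR myHour < 1)
| stats count as durationInMinutes by reported_time, processed_time
| eval duration = tostring(durationInMinutes*60, "duration")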

For Europe, only this line is different:

| where myDay != "Saturday" AND myDay != "Sunday" AND (myHour >= 7 AND myHour < 17)
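
In case someone wants to handle both regions in one search, a rough sketch like this could work, assuming each event carried a region field (my events don't currently have one, so this is hypothetical):

| where myDay != "Saturday" AND myDay != "Sunday"
    AND ((region=="Americas" AND (myHour >= 13 OR myHour < 1))
      OR (region=="Europe" AND (myHour >= 7 AND myHour < 17)))

The region field would also have to be added to the later stats by-clause so the counts stay separated per region.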

The output for all of them shows whether the 1-hour SLO was missed today, using the following:

| eval SLO=if(durationInMinutes>60,"SLO Fail","SLO Achieved") 
| chart count by SLO
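
To see which calendar day the misses actually fall on, the last chart line could be replaced with something like this (the day is derived from reported_time, which survives the stats, and it is still UTC-based at this point):

| eval day = strftime(strptime(reported_time,"%b %d %Y %H:%M:%S"), "%Y-%m-%d")
| chart count by day, SLO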

I'm working with UTC time, and it works great. The problem is that when I get my charts, some events from the Americas show up as today's (because the window runs from 13 UTC to 1 UTC), but I'd like to see those as yesterday's. Any idea how to get some sort of offset into my query?

Thank you all!

Wheresmydata

1 Solution

DalJeanis
Legend

There are several ways.  

If the offset will be a consistent number of hours for each query (for example, 5 hours), then just subtract the corresponding number of seconds:

| eval _time = _time - 5*3600
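
As a quick illustration with a made-up timestamp, an event at 02:00 UTC drops back to the previous calendar day once the 5-hour shift is applied:

| makeresults
| eval _time = strptime("Oct 07 2025 02:00:00","%b %d %Y %H:%M:%S")
| eval original_day = strftime(_time,"%A %b %d")
| eval _time = _time - 5*3600
| eval shifted_day = strftime(_time,"%A %b %d")
| table original_day, shifted_day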


You could also use a method such as the one in this answer 

https://community.splunk.com/t5/Getting-Data-In/need-help-in-time-conversion/m-p/471116

Or you could look at my answer and koshyk's answers on this one and see if they can be adapted.

https://community.splunk.com/t5/Getting-Data-In/Timezone-and-Timestamp-modification-at-search-report...

Or here 

https://community.splunk.com/t5/Splunk-Search/Is-there-a-way-to-show-local-time-of-the-device-of-tha...


Wheresmydata
Explorer

Thank you, @DalJeanis! Didn't think it was that simple 🙂

By the way, this is what I did, and it works great:

| eval start_time_epoch = strptime(reported_time,"%b %d %Y %H:%M:%S") 
| eval start_time_second = strftime(start_time_epoch,"%S") 
| eval start_time_epoch_rounded = start_time_epoch - start_time_second - 5*3600 
| fields - start_time_epoch, start_time_second 
| eval close_time_epoch = strptime(processed_time,"%b %d %Y %H:%M:%S") 
| eval close_time_second = strftime(close_time_epoch,"%S") 
| eval close_time_epoch_rounded = close_time_epoch - close_time_second - 5*3600 
| fields - close_time_epoch, close_time_second 
| eval minute = mvrange(0, (close_time_epoch_rounded - start_time_epoch_rounded), 60) 
| mvexpand minute 
| eval _time = start_time_epoch_rounded + minute 
| eval myHour = strftime(_time,"%H") 
| eval myMinute = strftime(_time,"%M") 
| eval myDay = strftime(_time,"%A") 
| where myDay != "Saturday" AND myDay != "Sunday" AND (myHour >= 8 AND myHour < 20) 
| eval ReportedTime = strftime(start_time_epoch_rounded,"%b %d %Y %H:%M:%S") 
| eval ProcessedTime = strftime(close_time_epoch_rounded,"%b %d %Y %H:%M:%S") 
| stats count as durationInMinutes by ticket,ReportedTime,ProcessedTime 
| eval duration = tostring(durationInMinutes*60, "duration") 
| eval SLO=if(durationInMinutes>60,"SLO Fail","SLO Achieved") 
| table ticket,ReportedTime,ProcessedTime,duration,SLO 
| chart count by SLO
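
If I later need one search to cover both regions, a rough variation (assuming a region field existed on the tickets, which mine don't have today) could swap the hard-coded 5*3600 for a per-region offset:

| eval tz_offset = case(region=="Americas", 5*3600, region=="Europe", 0, true(), 0)
| eval start_time_epoch_rounded = start_time_epoch - start_time_second - tz_offset
| eval close_time_epoch_rounded = close_time_epoch - close_time_second - tz_offset

The business-hours bounds in the where clause would need the same per-region treatment.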

Thank you!!

DalJeanis
Legend

You're quite welcome.  We love to help.
