Splunk Search

How to create a condition on one event based on the values of another event

tommasoscarpa
Explorer

Hi all,

 

I have a situation like the following:

I have some events with a start and end time that tell me when there has been an outage inside a building compartment. 

  • Building 1 compartment 1 - start: 07:00:00, end: 07:30:00
  • Building 1 compartment 2 - start: 06:50:00, end: 07:20:00
  • Building 1 compartment 1 - start: 09:00:00, end: 09:10:00
  • Building 2 compartment 3 - start: 07:00:00, end: 08:00:00
  • Building 2 compartment 4 - start: 07:15:00, end: 07:45:00

What I need to do is calculate, for each building, the total amount of time that building was under outage. This means that I do not care which compartment the outage was in, just which building.

The problem, however, is that in this example I can't just sum the event durations for each building, because the outages overlap (building 1 compartments 1/2 and building 2 compartments 3/4). Instead, the correct result would be the following:

  • Building 1: 50 mins
  • Building 2: 60 mins

 

The first bulleted list above roughly mirrors the events list I am collecting from my search. I have tried a few things but I can't figure out how to handle this kind of overlapping event.
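Conceptually, what is being asked for is the length of the union of overlapping time intervals per building. A minimal Python sketch of that computation (not SPL; the function name and sample data below are illustrative, taken from the bullets above):

```python
from datetime import datetime

def merged_minutes(intervals):
    """Total covered minutes after merging overlapping (start, end) intervals."""
    # Sort by start time, then sweep, extending the current merged span.
    spans = sorted(intervals)
    total, cur_start, cur_end = 0, None, None
    for start, end in spans:
        if cur_end is None or start > cur_end:
            # Gap before this interval: close out the previous merged span.
            if cur_end is not None:
                total += (cur_end - cur_start).total_seconds() / 60
            cur_start, cur_end = start, end
        else:
            # Overlap: extend the current merged span if needed.
            cur_end = max(cur_end, end)
    if cur_end is not None:
        total += (cur_end - cur_start).total_seconds() / 60
    return total

t = lambda s: datetime.strptime(s, "%H:%M:%S")
building1 = [(t("07:00:00"), t("07:30:00")),
             (t("06:50:00"), t("07:20:00")),
             (t("09:00:00"), t("09:10:00"))]
building2 = [(t("07:00:00"), t("08:00:00")),
             (t("07:15:00"), t("07:45:00"))]
print(merged_minutes(building1))  # 50.0
print(merged_minutes(building2))  # 60.0
```

The two overlapping intervals for building 1 merge into 06:50-07:30 (40 minutes), and the disjoint 09:00-09:10 adds another 10, giving the expected 50.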

 

Any clues?

0 Karma
1 Solution

ITWhisperer
SplunkTrust

The first part creates some dummy data based on your post.

I convert the times to epoch times, assuming today's date. Hopefully your real events already have epoch times; otherwise you will need to do something similar so that calculations can be performed on them.

Next, duplicate the events and reset the _time field to be either the start or end of the outage.

Sort by _time and work out when overlapping outages actually finish.

Calculate the period of each building outage and sum these by building.

| makeresults 
| eval _raw="Building 1 compartment 1 - start: 07:00:00, end: 07:30:00
Building 1 compartment 2 - start: 06:50:00, end: 07:20:00
Building 1 compartment  1 - start: 09:00:00, end: 09:10:00
Building 2 compartment 3 - start: 07:00:00, end: 08:00:00
Building 2 compartment 4 - start: 07:15:00, end: 07:45:00"
| multikv noheader=t
| table _raw
| rex "Building\s+(?<building>\d+)\s+compartment\s+(?<compartment>\d+).+start:\s+(?<start>[^,]+),\s+end:\s+(?<end>[\d:]+)"
| eval start=strptime(strftime(now(),"%Y/%m/%d")." ".start,"%Y/%m/%d %H:%M:%S")
| eval end=strptime(strftime(now(),"%Y/%m/%d")." ".end,"%Y/%m/%d %H:%M:%S")
| eval times=mvappend(start,end)
| mvexpand times
| eval outage=if(times=start,1,-1)
| eval _time=times
| sort _time
| streamstats sum(outage) as outage by building
| eval buildingoutagestart=if(outage=1 AND start=times,start,null)
| eval buildingoutageend=if(outage=0,end,null)
| streamstats max(buildingoutagestart) as buildingoutagestart by building
| where buildingoutageend>0
| eval totaloutage=buildingoutageend-buildingoutagestart
| stats sum(totaloutage) as totaloutage by building
| fieldformat totaloutage=tostring(totaloutage,"duration")
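The core of the pipeline above is a sweep-line over start/end events: each start contributes +1, each end -1, and a building-level outage span closes when the running count returns to zero. A Python sketch of that same idea (the function and variable names are illustrative, and times are expressed as seconds since midnight rather than real epoch times):

```python
def building_outage_seconds(events):
    """Union length of overlapping (start, end) spans, via a sweep-line counter."""
    points = []
    for start, end in events:
        points.append((start, 1))    # an outage begins
        points.append((end, -1))     # an outage ends
    points.sort()
    total = depth = 0
    span_start = None
    for t, delta in points:
        if depth == 0 and delta == 1:
            span_start = t           # first outage after an idle period
        depth += delta
        if depth == 0:
            total += t - span_start  # all overlapping outages have finished
    return total

# Sample events from the post, as seconds since midnight.
hms = lambda h, m: h * 3600 + m * 60
b1 = [(hms(7, 0), hms(7, 30)), (hms(6, 50), hms(7, 20)), (hms(9, 0), hms(9, 10))]
b2 = [(hms(7, 0), hms(8, 0)), (hms(7, 15), hms(7, 45))]
print(building_outage_seconds(b1) / 60)  # 50.0
print(building_outage_seconds(b2) / 60)  # 60.0
```

This mirrors what the `mvexpand`/`streamstats sum(outage)` steps do per building in the SPL.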


tommasoscarpa
Explorer

Thank you!

This works perfectly!

0 Karma

kamlesh_vaghela
SplunkTrust

@tommasoscarpa 

Can you please try this?

YOUR_SEARCH
| rex field=_raw "^(?<Building>[^\s]+\s\d+).*start:\s(?<start_time>\d{2}:\d{2}:\d{2}), end:\s(?<end_time>\d{2}:\d{2}:\d{2})" 
| eval start_time=strptime(start_time,"%H:%M:%S")
| eval end_time=strptime(end_time,"%H:%M:%S")
| eval outage=end_time - start_time
| stats sum(outage) as outage by Building
| eval outage = tostring(outage,"duration")

 

My Sample Search :

| makeresults | eval data="Building 1 compartment 1 - start: 07:00:00, end: 07:30:00|Building 1 compartment 2 - start: 06:50:00, end: 07:20:00|Building 1 compartment 1 - start: 09:00:00, end: 09:10:00|Building 2 compartment 3 - start: 07:00:00, end: 08:00:00|Building 2 compartment 4 - start: 07:15:00, end: 07:45:00", data=split(data,"|")|mvexpand data | rename data as _raw
| rex field=_raw "^(?<Building>[^\s]+\s\d+).*start:\s(?<start_time>\d{2}:\d{2}:\d{2}), end:\s(?<end_time>\d{2}:\d{2}:\d{2})" 
| eval start_time=strptime(start_time,"%H:%M:%S")
| eval end_time=strptime(end_time,"%H:%M:%S")
| eval outage=end_time - start_time
| stats sum(outage) as outage by Building
| eval outage = tostring(outage,"duration")

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you to solve the problem or gain knowledge, an upvote would be appreciated.

0 Karma