Splunk Search

Splitting up Transaction

Jesterhead
Engager

Hey all,

I'm trying to set up a transaction to track uptime vs. downtime for our locations. One field holds either true or false for each hour, and I'm looking to get the total time for each 'group' of events. I'd like each 'group' of false readings (which could last from several hours to a week or more) to be its own transaction. Over the course of a month or more we could have many false groups for each site, so I'd like to see how long each incident kept the site down and how often it goes down over time.

Is this possible to do with transaction? Or would I be better off using a different command for this type of thing?

Here's what I'm using right now (prereg would mean that location is down):

*| eval hourly_status=if(prereg_on>prereg_off, "true","false") | transaction fields=hourly_status | eval human_duration=(duration)/86400


alacercogitatus
SplunkTrust

My first guess would be the slightly off use of transaction — pass it the field name directly rather than a fields= option:

*|eval hourly_status=if(prereg_on>prereg_off,"true","false")|transaction hourly_status|eval human_duration= duration/86400
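If transaction still groups things oddly, a streamstats-based sketch is another route. This assumes a site field that identifies each location (rename it to whatever your data uses) and that events are sorted into ascending time order first:

* | sort 0 _time
| eval hourly_status=if(prereg_on>prereg_off, "true", "false")
| streamstats current=f last(hourly_status) as prev_status by site
| eval run_start=if(hourly_status!=prev_status OR isnull(prev_status), 1, 0)
| streamstats sum(run_start) as run_id by site
| stats min(_time) as start_time max(_time) as end_time by site run_id hourly_status
| eval duration_days=(end_time-start_time)/86400
| where hourly_status="false"

Each run of consecutive false readings per site gets its own run_id, so every row of the final stats output is one downtime incident with its length in days.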

I'll run this down some more in my head and see what I come up with.


Jesterhead
Engager

I think my problem might lie in the fact that I'm using transaction on just a field and not on _raw. The same event logs both states (it just has a true or false in the message text), so I'll try to build a transaction that pulls the status straight out of _raw.
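Something like this is what I have in mind — the rex pattern and the maxpause value are guesses until I check the real message text:

* | rex field=_raw "(?<hourly_status>true|false)"
| transaction hourly_status maxpause=2h
| eval duration_days=duration/86400

maxpause=2h should keep hourly readings in the same group while splitting runs separated by a longer gap; duration comes back in seconds, hence the /86400 to get days.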

Many thanks!
