
Query JSON for most recent 3 consecutive fails



I have a JSON log in the following format; each line is one event.

{"receivedDate":"2013-11-08 13:13:20.236", "macAddress": "11e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:16:20.236", "macAddress": "11e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "passed" }
{"receivedDate":"2013-11-08 13:19:20.236", "macAddress": "12e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:21:20.236", "macAddress": "14e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:24:20.236", "macAddress": "12e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:27:20.236", "macAddress": "12e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:30:20.236", "macAddress": "11e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }
{"receivedDate":"2013-11-08 13:33:20.236", "macAddress": "11e4c90ca", "ssid": "Target Guest Wi-Fi", "status": "failed" }

I am interested in "status", "macAddress", and "receivedDate".
I want to retrieve events where status=failed for the most recent 3 consecutive times, grouped by macAddress.
In my case that would match "macAddress": "12e4c90ca" but NOT "macAddress": "11e4c90ca", because one of its most recent 3 events was "passed".

Can you please point me in the right direction on how to achieve this?

Thanks a lot.


Splunk Employee

Hi there...

It's not clear from your description whether you've already got the data into Splunk, so I'm not completely sure where to start... but let's assume you do.

First, let's assume you're "polling" the last 15 minutes or so, since that's what you've got here, so we have a finite chunk of time. Splunk will, by default, show you the most recent events first, so you don't have to do anything on that front.

And what you've said is: if a macAddress succeeds in the time frame you're looking at, then you don't care about it.

So first you need to group your events; then you disqualify the ones you don't want with status!="passed"; then, to be sure how many times the failure occurred, you need to be able to count the lines... and the place that happens is within a transaction.

This is a super simplified example:

index=blabla sourcetype=blabla {some kind of time restriction}
| transaction macAddress
| where status!="passed" AND linecount>=3
| table macAddress

Your example is also super simple... before I put the linecount "where" test in there, there was only one other macAddress showing. (You should try this out one pipe section at a time and look at the results.) If the data is more complex, you might want to insert

just after the transaction section and stop. Take a look at the fields that are created.

Both transaction and streamstats will create calculated fields for you to use... read up on them in the docs.
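For example, to peek at the fields that transaction creates (duration and eventcount, alongside the indexed linecount), you might stop right after it, something like this sketch (same placeholder index/sourcetype as above):

```
index=blabla sourcetype=blabla {some kind of time restriction}
| transaction macAddress
| table macAddress status linecount duration eventcount
```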

If this does what you want... great. If it just opens up more questions... that's good too.

With Splunk... the answer is always "YES!". It just might require more regex than you're prepared for!


Is it possible to do the same without using transaction, maybe using stats or streamstats? For large data sets, transaction might not be a good approach!
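Something like this, perhaps? An untested sketch (index name is a placeholder, and it assumes _time is parsed from receivedDate): number the events per macAddress from most recent to oldest, keep only the latest 3, and then require all 3 to be failures.

```
index=tms
| sort 0 macAddress -_time
| streamstats count AS recency BY macAddress
| where recency<=3
| stats count(eval(status="failed")) AS fails count AS total BY macAddress
| where fails=3 AND total=3
| table macAddress
```

The `total=3` check guards against a macAddress that only has one or two events in the window.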


Splunk Employee

Yeap, 'zactly. : ) Glad it helped.

With Splunk... the answer is always "YES!". It just might require more regex than you're prepared for!


Sorry, my bad...
I was doing status="failed" instead of what you suggested, i.e. status!="passed" 🙂



Thanks for the reply, rsennett_splunk. 🙂

I tried it and I get something different:

index=tms | transaction macAddress | where status="failed" and linecount>=3

"macAddress": "12e4c90ca"
"macAddress": "11e4c90ca"

It should only give me macAddress 12e4c90ca.
