Splunk Search

Calculate the number of events that occur between two other time values in each event.

ALXWBR
Path Finder

Really struggling with this one, so looking for a hero to come along with a solution!

I have an index of flight data. Each departing flight has a timestamp for when the pilot calls the control tower to request pushback; this field is called ASRT (Actual Start Request Time). Each flight also has a time at which it uses the runway, called ATOT_ALDT (Actual Take Off Time/Actual Landing Time). What I need to calculate is, for each departing flight, how many other flights used the runway (had an ATOT_ALDT) between when that flight called up (ASRT) and when it used the runway itself (ATOT_ALDT). This is to work out what the runway queue was like for each departing aircraft.

I have tried using the concurrency command; however, this doesn't return the desired results, as it only counts the flights that started before and not the ones that started after. We may have a situation where an aircraft calls up after another flight but still departs before it, and concurrency doesn't capture that.

So I've found an approach that in theory should work. I ran an eventstats that lists the take off/landing time of every flight, so I can then mvexpand that and run an eval across each line. However, the list() function caps the multivalue field at 100 values, and there can be up to 275 flights in the time period I need to check.
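(For what it's worth, the 100-value cap on list() is a limits.conf setting, not a hard limit on multivalue fields in general. A minimal sketch of raising it, assuming the `[stats]` stanza's `list_maxsize` setting applies to your Splunk version; verify against the limits.conf reference before relying on it, and note that large lists plus mvexpand can be very memory-hungry:)

```ini
# limits.conf (assumed setting name - check your version's docs)
[stats]
# maximum number of items the list() function will emit (default 100)
list_maxsize = 300
```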

Can anyone else think of a way of achieving this? My code is below:

REC_UPD_TM = the time the record was updated (this index uses the flight's scheduled departure time as _time, so we need to find the latest record for each flight)

displayed_flyt_no = the flight number, e.g. EZY1234

DepOrArr = whether the flight was a departure or an arrival.

 

index=flights
| eval _time = strptime(REC_UPD_TM."Z","%Y-%m-%d %H:%M:%S%Z") 
| dedup AODBUniqueField sortby - _time 
| fields AODBUniqueField DepOrArr displayed_flyt_no ASRT ATOT_ALDT
| sort ATOT_ALDT 
| where isnotnull(ATOT_ALDT) 
| eval 
    asrt_epoch = strptime(ASRT,"%Y-%m-%d %H:%M:%S"),
    runway_epoch = strptime(ATOT_ALDT,"%Y-%m-%d %H:%M:%S")
| table DepOrArr displayed_flyt_no ASRT asrt_epoch ATOT_ALDT runway_epoch  
| eventstats list(runway_epoch) as runway_usage 
| search DepOrArr="D" 
| mvexpand runway_usage 
| eval queue = if(runway_usage>asrt_epoch AND runway_usage<runway_epoch,1,0) 
| stats sum(queue) as queue by displayed_flyt_no 

 

 


ALXWBR
Path Finder

Let's take flight16 in my example. It had an ASRT timestamp of 06:19 and an ATOT_ALDT of 06:32. Flight13 had an ATOT_ALDT of 06:19, flight14 06:29 and flight15 06:31, so that's 3 flights that used the runway between the ASRT and ATOT of flight16.


PickleRick
SplunkTrust

I think I'd approach it from a completely different side. I wouldn't try to track down single ATOT_ALDT occurrences. Just count how many ATOT_ALDTs occurred before "your" ASRT, and then check how many had occurred by the time of your ATOT_ALDT. Depending on whether you include your own ATOT_ALDT in that count, you might need to correct the result by one, but the answer to your question would simply be count@ATOT_ALDT - count@ASRT.

So use streamstats to count ATOT_ALDTs, then use stats/eventstats (or even transaction) to match the ASRT with the ATOT_ALDT for a single flight and calculate the difference in counts.


ALXWBR
Path Finder

I'm not sure that approach would work? Are you able to share some SPL with me so I can fully understand your approach? Here's some sample data. I've worked out what the values should be using Excel with a good ol' fashioned COUNTIF.

DepOrArr | displayed_flyt_no | ASRT | asrt_epoch | ATOT_ALDT | runway_epoch | queue
A | flight1 | | | 29/02/2024 00:52 | 1709167953 |
A | flight2 | | | 29/02/2024 01:41 | 1709170889 |
A | flight3 | | | 29/02/2024 05:08 | 1709183310 |
A | flight4 | | | 29/02/2024 05:33 | 1709184834 |
A | flight5 | | | 29/02/2024 05:36 | 1709185003 |
A | flight6 | | | 29/02/2024 05:40 | 1709185247 |
A | flight7 | | | 29/02/2024 05:51 | 1709185889 |
A | flight8 | | | 29/02/2024 06:01 | 1709186519 |
A | flight9 | | | 29/02/2024 06:04 | 1709186679 |
D | flight10 | 29/02/2024 05:49 | 1709185740 | 29/02/2024 06:08 | 1709186895 | 3
D | flight11 | 29/02/2024 05:57 | 1709186220 | 29/02/2024 06:10 | 1709187000 | 3
D | flight12 | 29/02/2024 06:03 | 1709186580 | 29/02/2024 06:14 | 1709187280 | 3
A | flight13 | | | 29/02/2024 06:19 | 1709187540 |
D | flight14 | 29/02/2024 06:15 | 1709187300 | 29/02/2024 06:29 | 1709188186 | 1
D | flight15 | 29/02/2024 06:16 | 1709187360 | 29/02/2024 06:31 | 1709188261 | 2
D | flight16 | 29/02/2024 06:19 | 1709187540 | 29/02/2024 06:32 | 1709188338 | 3
D | flight17 | 29/02/2024 06:22 | 1709187720 | 29/02/2024 06:34 | 1709188485 | 3
D | flight18 | 29/02/2024 06:31 | 1709188260 | 29/02/2024 06:42 | 1709188973 | 3
A | flight19 | | | 29/02/2024 06:44 | 1709189074 |
D | flight20 | 29/02/2024 06:32 | 1709188320 | 29/02/2024 06:46 | 1709189180 | 4
D | flight21 | 29/02/2024 06:38 | 1709188680 | 29/02/2024 06:47 | 1709189278 | 3
A | flight22 | | | 29/02/2024 06:49 | 1709189378 |
D | flight23 | 29/02/2024 06:27 | 1709188020 | 29/02/2024 06:51 | 1709189475 | 9
A | flight24 | | | 29/02/2024 06:52 | 1709189531 |
D | flight25 | 29/02/2024 06:36 | 1709188560 | 29/02/2024 06:54 | 1709189648 | 7
A | flight26 | | | 29/02/2024 06:55 | 1709189707 |
D | flight27 | 29/02/2024 06:43 | 1709188980 | 29/02/2024 06:56 | 1709189807 | 8
A | flight28 | | | 29/02/2024 06:57 | 1709189868 |
D | flight29 | 29/02/2024 06:45 | 1709189100 | 29/02/2024 06:59 | 1709189970 | 9
A | flight30 | | | 29/02/2024 07:01 | 1709190080 |
D | flight31 | 29/02/2024 06:46 | 1709189160 | 29/02/2024 07:03 | 1709190229 | 11
D | flight32 | 29/02/2024 06:49 | 1709189340 | 29/02/2024 07:04 | 1709190292 | 10
D | flight33 | 29/02/2024 06:47 | 1709189220 | 29/02/2024 07:06 | 1709190373 | 12
D | flight34 | 29/02/2024 06:53 | 1709189580 | 29/02/2024 07:07 | 1709190447 | 9
D | flight35 | 29/02/2024 06:50 | 1709189400 | 29/02/2024 07:09 | 1709190577 | 12
A | flight36 | | | 29/02/2024 07:10 | 1709190630 |
D | flight37 | 29/02/2024 06:56 | 1709189760 | 29/02/2024 07:12 | 1709190720 | 10
A | flight38 | | | 29/02/2024 07:13 | 1709190798 |
D | flight39 | 29/02/2024 06:55 | 1709189700 | 29/02/2024 07:14 | 1709190892 | 13
A | flight40 | | | 29/02/2024 07:15 | 1709190939 |
D | flight41 | 29/02/2024 06:57 | 1709189820 | 29/02/2024 07:16 | 1709191019 | 13
D | flight42 | 29/02/2024 06:45 | 1709189100 | 29/02/2024 07:18 | 1709191096 | 22
A | flight43 | | | 29/02/2024 07:18 | 1709191123 |
D | flight44 | 29/02/2024 07:04 | 1709190240 | 29/02/2024 07:20 | 1709191225 | 12
D | flight45 | 29/02/2024 07:06 | 1709190360 | 29/02/2024 07:21 | 1709191299 | 12
A | flight46 | | | 29/02/2024 07:22 | 1709191364 |
D | flight47 | 29/02/2024 07:07 | 1709190420 | 29/02/2024 07:24 | 1709191474 | 13
A | flight48 | | | 29/02/2024 07:25 | 1709191548 |
D | flight49 | 29/02/2024 06:59 | 1709189940 | 29/02/2024 07:27 | 1709191640 | 20
A | flight50 | | | 29/02/2024 07:28 | 1709191701 |
D | flight51 | 29/02/2024 06:58 | 1709189880 | 29/02/2024 07:29 | 1709191786 | 22
A | flight52 | | | 29/02/2024 07:31 | 1709191881 |
D | flight53 | 29/02/2024 07:10 | 1709190600 | 29/02/2024 07:34 | 1709192073 | 17
D | flight54 | 29/02/2024 07:11 | 1709190660 | 29/02/2024 07:35 | 1709192137 | 17
A | flight55 | | | 29/02/2024 07:36 | 1709192194 |
D | flight56 | 29/02/2024 07:12 | 1709190720 | 29/02/2024 07:38 | 1709192299 | 19
A | flight57 | | | 29/02/2024 07:38 | 1709192339 |
D | flight58 | 29/02/2024 07:16 | 1709190960 | 29/02/2024 07:40 | 1709192441 | 17
A | flight59 | | | 29/02/2024 07:41 | 1709192511 |
D | flight60 | 29/02/2024 07:04 | 1709190240 | 29/02/2024 07:43 | 1709192613 | 28
A | flight61 | | | 29/02/2024 07:44 | 1709192692 |
D | flight62 | 29/02/2024 07:17 | 1709191020 | 29/02/2024 07:46 | 1709192788 | 20
D | flight63 | 29/02/2024 07:19 | 1709191140 | 29/02/2024 07:47 | 1709192845 | 19
A | flight64 | | | 29/02/2024 07:48 | 1709192888 |
D | flight65 | 29/02/2024 07:23 | 1709191380 | 29/02/2024 07:49 | 1709192986 | 18
D | flight66 | 29/02/2024 07:27 | 1709191620 | 29/02/2024 07:50 | 1709193048 | 17
A | flight67 | | | 29/02/2024 07:51 | 1709193089 |

PickleRick
SplunkTrust

OK. Assuming you have _time and "event" (being either ASRT or ATOT_ALDT).

<your base search>
| sort _time
| streamstats count(eval(if(event="ATOT_ALDT",1,null()))) as times_busy

This will give you an additional column called "times_busy" saying how many times, up to this point in time, the runway has been busy due to an ATOT_ALDT.

So now you can do (assuming you're interested only in that parameter and we can just use stats and forget about everything else)

| stats values(event) as states values(times_busy) as busy by displayed_flyt_no

Now assuming you're interested only in those for which you had both states reported

| where states="ASRT" AND states="ATOT_ALDT"

And you can calculate your queue with

| eval queue=max(busy)-min(busy)-1
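Assembled into one pipeline, the sketch looks like this (field names as assumed above; the streamstats eval uses null() for the non-matching case so that count() only counts ATOT_ALDT events):

```spl
<your base search>
| sort _time
| streamstats count(eval(if(event="ATOT_ALDT",1,null()))) as times_busy
| stats values(event) as states values(times_busy) as busy by displayed_flyt_no
| where states="ASRT" AND states="ATOT_ALDT"
| eval queue=max(busy)-min(busy)-1
```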

 


ALXWBR
Path Finder

Thanks @PickleRick, however, that search doesn't work for me. 

Not sure if I've made it clear enough. 

So for every single departing flight in the table (DepOrArr=D), I need to count the total number of other flights whose ATOT_ALDT time was between the ASRT timestamp and the ATOT_ALDT timestamp of that flight.

So if Flight1234 has an ASRT of 09:00 and an ATOT_ALDT of 09:15, how many other flights in the list had an ATOT_ALDT timestamp between those two times? And then the same for the next flight, and so on.

What makes this more complicated is a flight can have an ASRT timestamp after another flight but still have an ATOT_ALDT timestamp before.


PickleRick
SplunkTrust

I made some assumptions about the format of your data because I don't know your raw data but the mechanism is relatively sound.

My search should count precisely what you're saying.

Let's take an excerpt from your table and translate it into single events in the format I talked about (if you don't have them as separate ASRT and ATOT_ALDT events, you'd have to split them in your base search). This is already sorted by time:

displayed_flyt_no | _time | event
flight10 | 29/02/2024 05:49 | ASRT
flight7 | 29/02/2024 05:51 | ATOT_ALDT
flight11 | 29/02/2024 05:57 | ASRT
flight8 | 29/02/2024 06:01 | ATOT_ALDT
flight12 | 29/02/2024 06:03 | ASRT
flight9 | 29/02/2024 06:04 | ATOT_ALDT
flight10 | 29/02/2024 06:08 | ATOT_ALDT
flight11 | 29/02/2024 06:10 | ATOT_ALDT
flight12 | 29/02/2024 06:14 | ATOT_ALDT
flight13 | 29/02/2024 06:19 | ATOT_ALDT

So if you now add that streamstatsed count, you'd get this:

displayed_flyt_no | _time | event | times_busy
flight10 | 29/02/2024 05:49 | ASRT | 0
flight7 | 29/02/2024 05:51 | ATOT_ALDT | 1
flight11 | 29/02/2024 05:57 | ASRT | 1
flight8 | 29/02/2024 06:01 | ATOT_ALDT | 2
flight12 | 29/02/2024 06:03 | ASRT | 2
flight9 | 29/02/2024 06:04 | ATOT_ALDT | 3
flight10 | 29/02/2024 06:08 | ATOT_ALDT | 4
flight11 | 29/02/2024 06:10 | ATOT_ALDT | 5
flight12 | 29/02/2024 06:14 | ATOT_ALDT | 6
flight13 | 29/02/2024 06:19 | ATOT_ALDT | 7

So now you group the flights with the stats and get

displayed_flyt_no | states | busy
flight7 | ATOT_ALDT | 1
flight8 | ATOT_ALDT | 2
flight9 | ATOT_ALDT | 3
flight10 | ASRT,ATOT_ALDT | 0,4
flight11 | ASRT,ATOT_ALDT | 1,5
flight12 | ASRT,ATOT_ALDT | 2,6
flight13 | ATOT_ALDT | 7

So we now know that flights 7, 8, 9 and 13 only landed (I assume: they have no ASRT, so they only occupied the runway because of ATOT_ALDT). And we're not interested in those because they didn't wait in the queue.

So we're filtering them out by our "where" command and we're left with just

displayed_flyt_no | states | busy
flight10 | ASRT,ATOT_ALDT | 0,4
flight11 | ASRT,ATOT_ALDT | 1,5
flight12 | ASRT,ATOT_ALDT | 2,6

Now if we calculate max(busy)-min(busy)-1 (for flight10: 4 - 0 - 1 = 3), we'll see that all those flights waited in a queue of length 3.

And it's the same values as you have in your table.


ALXWBR
Path Finder

Ahhh ok, so the reason it wasn't working is that the ASRT and ATOT_ALDT were part of the same event; my example table was effectively the events tabled. Now that I have split the two timestamps into two separate events, your code works (couple of typos in the streamstats line, but I sorted those). This is what I did. Thanks for all your help!

| eval 
    asrt_epoch = strptime(ASRT,"%Y-%m-%d %H:%M:%S"),
    runway_epoch = strptime(ATOT_ALDT,"%Y-%m-%d %H:%M:%S"),
    event="ASRT_".asrt_epoch.","."ATOT_ALDT_".runway_epoch,
    event=if(isnull(event),"ATOT_ALDT_".runway_epoch,event)
| makemv event delim=","
| mvexpand event
| rex field=event "^(?P<event>(ATOT_ALDT|ASRT))_(?P<_time>.+)$"
| sort _time
| streamstats count(eval(if(event="ATOT_ALDT",1,null()))) as times_busy
| stats values(event) as states values(times_busy) as busy values(ATOT_ALDT) as ATOT_ALDT by displayed_flyt_no
| sort ATOT_ALDT
| where states="ASRT" AND states="ATOT_ALDT"
| eval queue=(max(busy)-1)-(min(busy))



