Evaluate multiple events as one

insatiableavi
Observer

Hi everyone,

I have a data set such as:

Log1: EventId + EventType1
Log2: EventId + EventType2
Log3: EventId + EventType1
Log4: EventId + EventType2
Log5: EventId + EventType1

The outcome I am trying to get is something like:
EventId + (evaluatedEvent3) + counts of evaluatedEvents3 per day

Where evaluatedEvent3 is one occurrence of EventType1 followed by EventType2 in sequence for the same EventId. E.g. if one EventType1 and one EventType2 are present for an ID, that counts as one. Likewise, if there is one more EventType1 than EventType2, the unmatched EventType1 should not be kept in the Event3 bucket for this ID.

 

TIA


bowesmana
SplunkTrust

Can you give an example of the output you would expect from your data example? Is Event1 the same as Log1, or the same as EventType1?

It sounds like stats values(X) by eventId might be a solution for your question, but your last statement suggests streamstats may be needed.
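For the stats route, a minimal sketch, assuming extracted fields named eventId and EventType (both names assumed here, not taken from your post):

| stats count(eval(EventType="EventType1")) as type1Count count(eval(EventType="EventType2")) as type2Count by eventId
| eval evaluatedEvent3=min(type1Count,type2Count)

min() drops any unmatched surplus of one type, which matches your "should not be kept in the bucket" rule, but it cannot check ordering; that is where streamstats would come in.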


insatiableavi
Observer

It is something like this:

3/12:
Log1: Processing decryptedRecord:{"account":"mainAccount1","EventType":"start"}
Log2: Processing decryptedRecord:{"account":"mainAccount1","EventType":"stop"}
Log3: Processing decryptedRecord:{"account":"mainAccount1","EventType":"start"}
Log4: Processing decryptedRecord:{"account":"mainAccount1","EventType":"stop"}
Log5: Processing decryptedRecord:{"account":"mainAccount1","EventType":"stop"}

4/12:
Log1: Processing decryptedRecord:{"account":"mainAccount1","EventType":"start"}
Log2: Processing decryptedRecord:{"account":"mainAccount1","EventType":"stop"}
Log3: Processing decryptedRecord:{"account":"mainAccount1","EventType":"start"}

my expected output:

| Date | account      | Event | count |
| 3/12 | mainAccount1 | flap  | 2     |
| 4/12 | mainAccount1 | flap  | 1     |

where 1 flap is one jump between a consecutive start-stop pair


bowesmana
SplunkTrust

This example sets up your data as in your reply and then uses bin _time and stats to give you a solution. However, it is quite crude in that it does not look for consecutive flaps: e.g. in your example you have two stop events on 3/12 with no intervening start, so if a start arrived after that, this search would not handle that case. You would probably need streamstats for that; see the sketch after the search below.

Anyway, see where this gets you.

| makeresults
| eval event="2020-12-03 10:00:00 Log1: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"start\"}
2020-12-03 10:00:01 Log2: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"stop\"} 
2020-12-03 10:00:02 Log3: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"start\"} 
2020-12-03 10:00:03 Log4: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"stop\"} 
2020-12-03 10:00:04 Log5: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"stop\"}
2020-12-04 10:00:00 Log1: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"start\"}
2020-12-04 10:00:01 Log2: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"stop\"} 
2020-12-04 10:00:02 Log3: Processing decryptedRecord:{\"account\":\"mainAccount1\",\"EventType\":\"start\"}"
| makemv tokenizer="(2.*})" event ``` split the sample string into one value per log line ```
| mvexpand event
| eval _time=strptime(event,"%F %T") ``` parse the leading timestamp ```
| rex field=event "Log\d: Processing decryptedRecord:(?<item>\{[^\}]*\})" ``` pull out the JSON payload ```
| table _time item
| spath input=item ``` extract account and EventType from the JSON ```
| bin _time span=1d ``` bucket events by day ```
| stats sum(eval(if(EventType="start",1,0))) as Starts sum(eval(if(EventType="stop",1,0))) as Stops by _time account
| eval flaps=min(Starts,Stops) ``` a flap needs one start and one stop, so take the smaller count ```
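To handle the consecutive-stop case, here is a hedged sketch of the streamstats variant; it would replace everything from bin _time onward in the search above, and assumes events are sorted oldest first (hence the sort):

| sort 0 _time ``` streamstats needs events in time order ```
| streamstats window=1 current=f last(EventType) as prevType by account ``` carry the previous event's type per account ```
| eval flap=if(EventType="stop" AND prevType="start",1,0) ``` count a flap only on a start->stop transition ```
| bin _time span=1d
| stats sum(flap) as count by _time account
| eval Event="flap"
| table _time account Event count

Note that prevType carries across days, so a start just before midnight still pairs with its stop just after; bin only affects the daily grouping. On your sample data this gives 2 flaps on 3/12 and 1 on 4/12, matching your expected output.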