Dedup consecutive values and count removed

christophe_clem
Explorer

Hello,

I want to count consecutive events that share common values of multiple fields.
I can do part of this with dedup X Y consecutive=true, but it does not count the removed events.
I can do the other part with stats count by X Y, but it does not take the "consecutive" part into account.

In practice, my data is RADIUS authentication logs, where X is the authentication result and Y is the user.
I need to count, for each user, the number of failed authentications that are not followed by a successful authentication.

I thought I could do this with a subsearch which gives the _time of the latest successful authentication for each user, and then count the failed authentications that follow. But I can't find how to make a subsearch that produces a query like ( ( user="bob" AND _time>xxx ) OR ( user="alice" AND _time>yyy ) ).
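
In other words, the search I would like the subsearch to generate looks something like this (the epoch times and the "failed" value are made up, just to show the shape I'm after):

<your search> X="failed" ( ( user="bob" AND _time>1451489929 ) OR ( user="alice" AND _time>1451468324 ) )
| stats count AS failures by user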

0 Karma
1 Solution

christophe_clem
Explorer

Problem solved!

| eval lastmc=MESSAGE_CLASS
| streamstats current=f count last(MESSAGE_CLASS) as lastmc by Calling_Station_ID
| eval consecutive=if(lastmc=MESSAGE_CLASS,1,0)
| search consecutive=0
| stats first(MESSAGE_CLASS) AS prevmc first(count) AS count by Calling_Station_ID
| search prevmc="Passed-Authentication"

The first eval is only there for the second eval; in other words, the first event is treated as consecutive to itself.
I compute a "consecutive" field as you've done, and then I only keep the lines where consecutive=0.
After that, I use stats to select the first (newest) line per Calling_Station_ID and get the count of the previous events.
Finally, I keep the lines where prevmc="Passed-Authentication", meaning the previous events were "Failed-Attempt".
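
For reference, an equivalent way to get the same numbers (just a sketch using the same field names; stations that never had a success are dropped here too, like above) is to mark the _time of the latest Passed-Authentication per station and count the newer Failed-Attempt events:

| eventstats max(eval(if(MESSAGE_CLASS="Passed-Authentication", _time, null()))) AS last_success by Calling_Station_ID
| where MESSAGE_CLASS="Failed-Attempt" AND _time>last_success
| stats count AS "Last Failures" by Calling_Station_ID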

Thank you guys for your help 🙂

woodcock
Esteemed Legend

Click "Accept" on your answer.

0 Karma

jplumsdaine22
Influencer

What have I done...

<your search>
| sort _time
| eval series=0
| streamstats current=f window=1 last(MESSAGE_CLASS) as lastmc last(Calling_Station_ID) as lastcs
| eval consec=if(lastcs=Calling_Station_ID,1,0)
| eval series=if(((MESSAGE_CLASS="Passed-Authentication") OR (lastmc="Passed-Authentication" AND consec=1)),1,0)
| streamstats current=f window=1 last(series) as lastseries
| eval series=if((lastseries=1 AND consec=1),1,0)
| stats sum(series) by Calling_Station_ID

Will give you

Calling_Station_ID     Last Failures
 00-11-22-33-44-55     2
 66-77-88-99-AA-BB     1

Check the spellings here against your actual data. If you want to see what's going on, replace the final stats command with | table _time Calling_Station_ID MESSAGE_CLASS lastmc lastcs lastseries consec series.
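
That is, the same pipeline as above with the final stats swapped out (nothing new here, just for copy-paste):

<your search>
| sort _time
| eval series=0
| streamstats current=f window=1 last(MESSAGE_CLASS) as lastmc last(Calling_Station_ID) as lastcs
| eval consec=if(lastcs=Calling_Station_ID,1,0)
| eval series=if(((MESSAGE_CLASS="Passed-Authentication") OR (lastmc="Passed-Authentication" AND consec=1)),1,0)
| streamstats current=f window=1 last(series) as lastseries
| eval series=if((lastseries=1 AND consec=1),1,0)
| table _time Calling_Station_ID MESSAGE_CLASS lastmc lastcs lastseries consec series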

My previous answer is still valid if you're not concerned about a series starting with an "authentication success" message.

That is, use streamstats to flag each event as being part of a series.

 ...| streamstats current=f window=1 count as consecutive by X,Y | stats count(consecutive) by X,Y

Note that this will drop any values that do not have any consecutive sequences. Try running just

| streamstats current=f window=1 count as consecutive by X,Y |table X Y consecutive 

to see what I mean

See: http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/streamstats

0 Karma

sundareshr
Legend

For the sample provided, this will do it

 ... | streamstats window=2 current=f last(MESSAGE_CLASS) as l first(MESSAGE_CLASS) as f by Calling_Station_ID | eval c=if(l=f, 1, 0) | stats sum(c) as c by Calling_Station_ID
0 Karma

jplumsdaine22
Influencer

That would pick up the opposite case as well though, wouldn't it? I.e. a series of successful authentications that follow a failure.
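
For example, a quick synthetic test (makeresults only, with a made-up Calling_Station_ID, so no real data needed) gives a non-zero count even though there are no consecutive failures:

| makeresults
| eval MESSAGE_CLASS=split("Passed-Authentication,Passed-Authentication,Failed-Attempt", ",")
| mvexpand MESSAGE_CLASS
| streamstats count AS n
| eval _time=_time-n, Calling_Station_ID="AA-BB-CC-DD-EE-FF"
| streamstats window=2 current=f last(MESSAGE_CLASS) as l first(MESSAGE_CLASS) as f by Calling_Station_ID
| eval c=if(l=f, 1, 0)
| stats sum(c) as c by Calling_Station_ID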

jkat54
SplunkTrust

This might work... or it might need an eval in each search to extract a new X field after the transaction commands.

| transaction Y maxspan=5m
| search "Successful Authentication"  NOT "Failed Authentication"
| stats dc(Y) AS "Successes" by Y 
| appendcols [ 
 |transaction Y maxspan=5m 
 | search "Failed Authentication" NOT "Successful Authentication"
 | stats dc(Y) AS "Failures" by Y
 ] 
|table Successes, Failures, Y
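
The eval I have in mind would be something along these lines after each transaction command (the X field name and the match strings are guesses at your data):

| eval X=if(searchmatch("Failed Authentication"), "Failed", "Passed")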
0 Karma

christophe_clem
Explorer

I can't use a transaction because events can be separated by a long time.

Here is an example of the data:

_time                   MESSAGE_CLASS       Calling_Station_ID
2015-12-30 16:38:44.948     Failed-Attempt      00-11-22-33-44-55
2015-12-30 16:38:44.920     Failed-Attempt      00-11-22-33-44-55
2015-12-30 16:18:49.794     Passed-Authentication   00-11-22-33-44-55
2015-12-30 16:18:44.715     Failed-Attempt      00-11-22-33-44-55
2015-12-30 16:18:44.686     Failed-Attempt      00-11-22-33-44-55
2015-12-30 15:58:49.494     Failed-Attempt      66-77-88-99-AA-BB
2015-12-30 15:58:44.458     Passed-Authentication   66-77-88-99-AA-BB
2015-12-30 15:58:44.430     Failed-Attempt      66-77-88-99-AA-BB

And what I want as output:

Calling_Station_ID  Last Failures
00-11-22-33-44-55   2
66-77-88-99-AA-BB   1
0 Karma

jkat54
SplunkTrust

Change the maxspan to whatever amount of time you want: 1d, 1mon, etc. I gave you an example with 5 minutes.
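
For example, to allow up to one day between events in the same transaction:

| transaction Y maxspan=1d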

0 Karma