Hey Thambisetty, I can definitely see how this could get me the concurrent count I'm looking for. I do have two problems going this route, though:

1: The dataset I'm searching over is large enough that I'd need to convert the |stats to a |tstats over an accelerated data model. With tstats, _time can't be spanned or separated at less than one second, so some of these events get grouped together, skewing the results. I'll have to unaccelerate our DM and add the unique field "id" so I can include it in the query and make every event unique. (Attached photo shows events being grouped.)

2: Although this could give the concurrent failed-login count prior to the first successful logon, I'm not sure it resets its count on logon success. (What if there are another 5 failed logins in a row before another successful login?) I've tested adding reset_before and reset_after to the first streamstats, but I'm still having trouble confirming accuracy because of the tstats issue above.

My version of your query, where results are being grouped:

| tstats count from datamodel=Authentication where sourcetype="graylogwindows:Security" (Authentication.EventCode=4624 OR Authentication.EventCode=4625) Authentication.Account_Name="Sample_Account" by _time span=1s Authentication.Account_Name Authentication.Keywords Authentication.signature_id Authentication.EventCode
| sort _time
| streamstats global=true count as "sscount" current=false by Authentication.Account_Name Authentication.Keywords
| streamstats global=true current=false last(sscount) as last_count last(Authentication.Keywords) as last_action by Authentication.Account_Name
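For what it's worth, this is roughly the shape I've been testing for problem 2: resetting the failure counter whenever a 4624 (success) event appears. This is an unconfirmed sketch, not a working search — the reset_after eval syntax is my reading of the streamstats docs, and Authentication.id is the unique field I'd still have to add to the data model first:

| tstats count from datamodel=Authentication where sourcetype="graylogwindows:Security" (Authentication.EventCode=4624 OR Authentication.EventCode=4625) Authentication.Account_Name="Sample_Account" by _time span=1s Authentication.Account_Name Authentication.EventCode Authentication.id
| sort 0 _time
| streamstats count as fail_count reset_after="("'Authentication.EventCode'==4624")" by Authentication.Account_Name

If reset_after fires on the success event itself, then each 4624 row's fail_count should be the run of consecutive failures immediately before it, and the counter would start over for the next run of failures, which is the behavior I'm after. I just can't verify the numbers yet while the sub-second events are still being merged by span=1s.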