Splunk Search

summarize transaction search

smudge797
Path Finder

What's the best way to summarize this data and subsequently search the results? I ask because the docs mention that the transaction command may need to be swapped for an si* command.

Here is the full search:

index=win sourcetype="WinEventLog:Security" source="WinEventLog:Security" (EventCode=4624 OR EventCode=4634) Account_Name=* action=success  [inputlookup users.csv  | stats count by user  | table user  | rename user as Account_Name]
| eval User=if(mvcount(Account_Name)>1, mvindex(Account_Name,1), mvindex(Account_Name, 0)) 
| eval User=lower(User) 
| search NOT User=*$ 
| transaction User maxevents=2 startswith="EventCode=4624"  endswith="EventCode=4634" maxspan=-1 
| convert timeformat="%Y-%m-%d" ctime(_time) AS date 
| stats sum(duration) As Duration by date, User, ComputerName 
| eval "Duration(M)"=round((Duration/60), 0) 
| table  date,User,Duration(M),ComputerName
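
For context on the si* question: the summary-indexing commands are sistats, sichart, sitimechart, sitop, and sirare; there is no si* equivalent of transaction, which is why the docs suggest swapping transaction out before summary indexing. A hedged sketch of what the scheduled summary search could end with once the transaction step is replaced by a stats-based rewrite (sistats takes the same syntax as stats; the leading line is a placeholder, not runnable as-is):

```
... base event search, with transaction replaced by a stats-based rewrite ...
| sistats sum(Duration) AS Duration BY date User ComputerName
```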

Thanks

1 Solution

woodcock
Esteemed Legend

Avoid transaction like this:

index=win sourcetype="WinEventLog:Security" source="WinEventLog:Security" (EventCode=4624 OR EventCode=4634) Account_Name=* action=success [inputlookup users.csv | stats count by user | table user | rename user as Account_Name]
| eval User=if(mvcount(Account_Name)>1, mvindex(Account_Name,1), mvindex(Account_Name, 0)) 
| eval User=lower(User) 
| search NOT User=*$
| streamstats current=t count(eval(EventCode=4634)) AS SessionID BY User ComputerName
| stats earliest(_time) AS date latest(_time) AS latest BY User ComputerName SessionID
| eval Duration=latest - date
| bucket date span=1d
| stats sum(Duration) AS Duration BY date User ComputerName 
| eval "Duration(M)"=round((Duration/60), 0) 
| table date User Duration(M) ComputerName
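
To see how the streamstats line pairs logons with logoffs, here is a standalone sketch using makeresults with invented values (User, ComputerName, and the timestamps are made up for illustration). The key point is that search results stream latest-first: with current=t, each logoff (4634) increments SessionID for itself, and the logon (4624) that follows it in the stream, i.e. precedes it in time, inherits the same SessionID:

```
| makeresults count=4
| streamstats count AS n
| eval _time=now() - (n * 3600)
| eval EventCode=if(n % 2 == 1, 4634, 4624), User="jdoe", ComputerName="HOST01"
| streamstats current=t count(eval(EventCode=4634)) AS SessionID BY User ComputerName
| stats earliest(_time) AS start latest(_time) AS stop BY User ComputerName SessionID
| eval Duration=stop - start
```

This yields two sessions for jdoe, each one hour long, matching what the answer's stats/eval steps compute per SessionID.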



smudge797
Path Finder

That's cool, and it runs quicker.
Can the date be converted the way it was before?

| convert timeformat="%Y-%m-%d" ctime(_time) AS date

Also, is there a way to aggregate the number of events per day, per user?

Thanks!


woodcock
Esteemed Legend

Right now it computes duration per day, per user, per ComputerName. To change it to duration per day, per user, just remove every occurrence of "ComputerName" from the search. Or, if you need both breakdowns, add this to your search:

| appendpipe [ stats sum(Duration) AS Duration BY date User ]

To change the date, just add this to the end of the search:

| fieldformat date=strftime(date, "%Y-%m-%d")
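
Putting both suggestions together with the accepted search, the tail would read something like this sketch (the eval lvalue is quoted because of the parentheses in the field name):

```
| stats sum(Duration) AS Duration BY date User ComputerName
| appendpipe [ stats sum(Duration) AS Duration BY date User ]
| fieldformat date=strftime(date, "%Y-%m-%d")
| eval "Duration(M)"=round(Duration/60, 0)
| table date User "Duration(M)" ComputerName
```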