Splunk Search

Two row details into one row

srajanbabu
Explorer

I have the following search:

index=main sourcetype=summa 
| rex "::\s(?<timestamp>\S+)\s"
| rex "^\S+\s(?<userid>\S+)\." 
| sort userid timestamp 
|transaction login startswith="Logged in" endswith="Processing complete" 
| rex "^\S+\s(?<userid>\S+)\." 

This produces the result shown in Table-1 below, but I want to fix a few things in it.
1. The result should display something like the following, whereas at present these details are spread across two rows, one after the other:
USER-id, Login time, Logout time, Number of login sessions

Thanks
Rajan

Table-1

4/3/13 11:46:12.000 AM
SNM4 rajan.#### :: 04/03/13 11:46:12 :: User rajan logged in
SNM4 rajan.857F :: 04/03/13 11:46:13 :: Processing complete
host=hostname | sourcetype=summa | source=C:\Data\splunk\fxr\snm4-logger.log
4/3/13 11:46:08.000 AM
SNM4 verify.#### :: 04/03/13 11:46:08 :: User verify logged in
SNM4 different.855F :: 04/03/13 11:46:12 :: Processing complete
4/3/13 11:45:58.000 AM
SNM4 suman.#### :: 04/03/13 11:45:58 :: User suman logged in
SNM4 suman.853F :: 04/03/13 11:45:59 :: Processing complete
host=hostname | sourcetype=summa | source=C:\Data\splunk\fxr\snm4-logger.log


kristian_kolb
Ultra Champion

Is login an extracted field? You can't make a transaction on a field that does not exist. Why is userid extracted twice? Also, there is no point in sorting before the transaction.

I believe you want to do something like:

index=main sourcetype=summa 
| rex "^\S+\s(?<userid>\S+)\." 
| transaction userid startswith="Logged in" endswith="Processing complete"
| eval logoutTime = strftime(_time + duration, "%F %T")
| eval loginTime = strftime(_time, "%F %T")
| stats list(loginTime) as Login list(logoutTime) as Logout count as "Number of Sessions" by userid

Alternatively, the last line can be substituted with:

| table userid, loginTime, logoutTime, duration

It's a bit unclear if you want to do it the first or the second way.

/K


srajanbabu
Explorer

Kristian,
There is no suppression key associated with these events. Every "Logged in" should have a corresponding "Processing complete" for that user-id. For now, if there is no "Processing complete" after the "Logged in" event, I will assume the user is yet to log out.


dishasaxena
Path Finder

Hi Rajan,

For this requirement, do you have any 'suppression key' in these events that can identify a single session (it needs to be common to both the 'Logged in' and 'Processing complete' events)?

Regards,
Disha


srajanbabu
Explorer

Kristian,
Your rex gave me exactly what I wanted, thanks a lot. Just one more query on the same item: I want to list out incomplete transactions, i.e. users who are currently logged in and not yet logged out. Will you be able to help me?

Thanks
Rajan
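
One possible way to surface those incomplete sessions (a sketch only, not confirmed in this thread): the transaction command's keepevicted=true option keeps transactions that never saw the "Processing complete" end event, and those open transactions are marked with closed_txn=0, so they can be filtered on that field.

index=main sourcetype=summa 
| rex "^\S+\s(?<userid>\S+)\." 
| transaction userid startswith="Logged in" endswith="Processing complete" keepevicted=true
| where closed_txn=0
| eval loginTime = strftime(_time, "%F %T")
| table userid, loginTime

This builds on Kristian's search above; keepevicted and closed_txn are standard transaction options and fields, but treating closed_txn=0 as "still logged in" is an assumption about these particular logs.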


lguinn2
Legend

Try this

index=main sourcetype=summa 
| rex "^\S+\s(?<userid>\S+)\." 
| sort userid _time 
| transaction userid startswith="Logged in" endswith="Processing complete" 
| eval logoutTime = strftime(_time + duration, "%x %X")
| eval loginTime = strftime(_time, "%x %X")
| table userid loginTime logoutTime

I don't know how you want to show both the individual sessions and the overall count. To calculate the count, you could simply do

index=main sourcetype=summa 
| rex "^\S+\s(?<userid>\S+)\." 
| sort userid _time 
| transaction userid startswith="Logged in" endswith="Processing complete" 
| stats count by userid

srajanbabu
Explorer

Thanks for taking the time to answer this; the other answer worked for me.
