Splunk Search

time delay

chaitu99
Explorer

Hi,

10:27:xx.xxx Message 1
10:31:xx.xxx Message 1
10:35:xx.xxx Message 1
10:38:xx.xxx conf msg
10:82:xx.xxx Message 2
10:85:xx.xxx req xyz
10:87:xx.xxx Message 2
10:89:xx.xxx Message 2

I have a sample log like the one above. I need to find the delay (time difference) between the "Message 1" immediately before "conf msg" and the "Message 2" immediately after "req xyz", in a single event.

I used a query like this, but I am not getting the expected result:

transaction startswith="Message 1" endswith="Message 2" | search "conf msg" | stats count, perc95(duration) as VALUE

Is there any logic to get the exact result?
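One possible approach is to avoid transaction entirely and carry marker timestamps forward with filldown. This is only a sketch: my_index is a placeholder, the like() patterns assume each message appears as literal text in _raw, one message per event, and the sort 0 _time is required so filldown carries values forward in time order:

```
index=my_index ("Message 1" OR "conf msg" OR "req xyz" OR "Message 2")
| sort 0 _time
| eval m1_time = if(like(_raw, "%Message 1%"), _time, null())
| filldown m1_time
| eval m1_before_conf = if(like(_raw, "%conf msg%"), m1_time, null())
| filldown m1_before_conf
| eval req_seen = if(like(_raw, "%req xyz%"), 1, null())
| filldown req_seen
| where like(_raw, "%Message 2%") AND req_seen=1 AND isnotnull(m1_before_conf)
| head 1
| eval delay = _time - m1_before_conf
| table delay
```

Against the sample above, this should carry the _time of the last "Message 1" before "conf msg" into m1_before_conf, keep only the first "Message 2" after "req xyz", and report the difference in seconds as delay.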


ShaneNewman
Motivator

It would help to see the rest of the event to know what fields are available to create an mvlist. What I have done, similar to what you are wanting to do, is break the entire event out into 5 or 6 fields, then group them by the field that is common to that transaction, such as ip_address. This is much easier when you set up a transactiontypes.conf stanza for the transaction you are looking to create.

Example of transaction from transactiontypes.conf:

[event_collection]
fields = ip_address
startswith = "Login"
endswith = "Submit"
mvlist = event_type, event_timestamp, ip_address, user_id
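Once a stanza like that is in place, the transaction can be invoked by name and the built-in duration field gives the elapsed time; a sketch only, with my_index as a placeholder index:

```
index=my_index
| transaction event_collection
| search "conf msg"
| stats count, perc95(duration) as VALUE
```

The duration field is computed automatically by transaction as the time between the first and last event in each group.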

Hope this helps!
