Splunk Search

Regarding calculating response time from log

vaishnavi07
Explorer

I have a log which has entries with a transaction ID and START_TRANSACTION or END_TRANSACTION. For example:

INFO , createLabel, VF4546, START_TRANSACTION, 2015:04:16:07:11:46.925
INFO , createLabel, VF4546, END_TRANSACTION, 2015:04:16:07:11:47.596
INFO , retrieveB2CClients, 12345, START_TRANSACTION, 2015:04:16:07:15:00.730
INFO , retrieveB2CClients, 12345, END_TRANSACTION, 2015:04:16:07:15:00.777

I have to find the response time for each transaction type, for which I have to get the start time and end time from the entries. How can I match the corresponding start and end times from the logs? Sometimes the START_TRANSACTION or END_TRANSACTION entry may be missing, in which case the corresponding start or end time should be filled with zero, and I will neglect those entries when calculating response time. Can anyone suggest how to do this?


alacercogitatus
SplunkTrust

So let's look at the first pair of events there, and I'll make an assumption.

INFO , createLabel, VF4546, START_TRANSACTION, 2015:04:16:07:11:46.925
INFO , createLabel, VF4546, END_TRANSACTION, 2015:04:16:07:11:47.596

So, my assumption: "VF4546" is the transaction ID, and it is extracted into a field named "txnid". If that's not the case, substitute whatever field holds the transaction ID in your data. Now we can write this:

<your_search_for_transactions> | transaction txnid startswith="START_TRANSACTION" endswith="END_TRANSACTION" maxspan=4h | stats values(duration) as "Duration in Seconds" by txnid

This should return a table with the number of seconds for each transaction.
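
To also cover the missing START_TRANSACTION / END_TRANSACTION case from the question (fill the missing side with zero, then skip those entries), a stats-based sketch could work instead of transaction. The field names op, txnid, and txn_event below are assumptions based on the sample lines, not fields Splunk extracts automatically; substitute your actual extracted fields:

<your_search_for_transactions>
| rex "^INFO , (?<op>\w+), (?<txnid>\w+), (?<txn_event>START_TRANSACTION|END_TRANSACTION)"
| stats min(eval(if(txn_event="START_TRANSACTION", _time, null()))) as start_time
        max(eval(if(txn_event="END_TRANSACTION", _time, null()))) as end_time
        by op txnid
| eval start_time=coalesce(start_time, 0), end_time=coalesce(end_time, 0)
| where start_time>0 AND end_time>0
| eval response_time=end_time-start_time
| stats avg(response_time) as "Avg Response (s)" by op

This subtracts epoch _time values, so it assumes Splunk is parsing the trailing timestamp on each line as the event time.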


satishsdange
Builder

I see the field next to INFO (createLabel, retrieveB2CClients) is common between the two corresponding events. You can use it for grouping when creating reports, as shown in the sketch below.
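
For example, assuming that field is extracted as op (a hypothetical name; adjust to your actual field) and that transactions of the same type don't overlap in time, a per-type report could look like this:

<your_search_for_transactions>
| transaction op startswith="START_TRANSACTION" endswith="END_TRANSACTION" maxspan=4h
| stats avg(duration) as "Avg Response (s)" count by op

If same-type transactions can overlap, group by the transaction ID as in the answer above and roll the results up by type afterwards.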
