Splunk Search

Help displaying duration of all orders of a single orderType

_gkollias
Builder

Hi All,

I'm trying to create a table that shows the duration of a transaction by the hour.

I'm trying to use something similar to timechart. Here is my search:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name=ordertype
| stats earliest(_time) as start_time, latest(_time) as end_time, earliest(status) as start_status, latest(status) as exit_status, first(other) as exit_message, last(total) as total by svc_context_name
| eval duration = (end_time - start_time)
| convert ctime(start_time)
| convert ctime(end_time)
| table svc_context_name, start_time, end_time, start_status, exit_status, exit_message, duration

This search is only giving me one row for the ordertype, with what looks like a sum of the total duration.

How can I modify this search so that it will display all orders with this ordertype, and show the duration of the order?

Thank You!


wpreston
Motivator

Oops, I left out the eval part of startswith and endswith, which you need if you're using field values to denote the beginning and end of a transaction. If you need to match a literal string instead of a field value, just enclose the string in double quotes instead of using eval. The search is updated in the post above, but I'll copy it here for simplicity:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name="$svc_context_name$"
| transaction svc_context_name 
    startswith=<the string or field value that indicates the beginning of a transaction, like: eval(status=="START")>
    endswith=<the string or field value that indicates the end of a transaction, like: eval(status=="SUCCESS")>
    keepevicted=true
| search closed_txn=1
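
To make the two forms concrete, here is a sketch of each, assuming (my guess from your data, not confirmed) that your status field uses the literal values START and SUCCESS. The eval form tests a field value:

| transaction svc_context_name
    startswith=eval(status=="START")
    endswith=eval(status=="SUCCESS")
    keepevicted=true

while the string form matches the raw event text instead of a field:

| transaction svc_context_name startswith="START" endswith="SUCCESS" keepevicted=true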

wpreston
Motivator

UPDATE - I can't comment or reply from my workplace for some reason, so I'll have to put my remarks here. The way the search is structured below, you should be getting the difference between the earliest time and the latest time for each svc_context_name per hour. This assumes that svc_context_name uniquely identifies a transaction and is not reused by another transaction later. Is svc_context_name reused at any point? If so, you can use transaction with the startswith and endswith options instead of stats. Transaction automatically calculates the duration for you and adds it as a field on the combined event. Also, use keepevicted=true so that Splunk doesn't discard any of your events due to memory constraints. I don't know your data, so my startswith and endswith may not be perfect for this, but here's an example:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name="$svc_context_name$"
| transaction svc_context_name 
    startswith=<the string or field value that indicates the beginning of a transaction, like: eval(status=="START")>
    endswith=<the string or field value that indicates the end of a transaction, like: eval(status=="SUCCESS")>
    keepevicted=true
| search closed_txn=1 

Take a look at the results of this search (after you've modified startswith and endswith) and the duration field and see if the data looks like what you need. If so, we can work on getting the other items you need in your report.

Original Answer below:

I think the issue is that you're grouping by _time in your stats command. This will not group the events of a transaction together unless they happen at exactly the same time. If you want to group by hour, eval an hour field before your stats command (date_hour can be unreliable if your timestamps have any weirdness in them) like so, then group by it instead of _time:

... eval myHour=strftime(_time, "%H")

Or, if you need to compare across multiple days, include the full date in your eval for myHour:

... eval myHour=strftime(_time, "%Y-%m-%d %H")

So your complete search command looks like this:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name="$svc_context_name$"
| eval myHour=strftime(_time, "%Y-%m-%d %H")
| stats min(_time) as start_time, max(_time) as end_time, earliest(status) as start_status, latest(status) as exit_status, first(other) as exit_message, last(total) as total by svc_context_name, myHour
| eval duration = (end_time - start_time)
| convert ctime(start_time)
| convert ctime(end_time)
| search exit_status="SUCCESS"
| table svc_context_name, start_time, end_time, start_status, exit_status, exit_message, duration

_gkollias
Builder

Yes, that is exactly what svc_context_name is: it's the actual name of the service/channel the transactions go through. I basically want to monitor those 7 or 8 order types to see how many orders are successes/failures/errors, and add some additional stats and drilldowns, etc. Here are some fields that I think are most interesting: bp_context_id (business process ID), svc_context_id (service ID), TPCode.


wpreston
Motivator

Transaction is an intensive operation and can be slow if it's not structured correctly. The structure I gave you was incorrect; I thought that svc_context_name was something that uniquely identified your transactions, but it sounds like it is a category of some kind. Is there a field that ties all of the events of an order together, like an order number or session ID? Can you post some (anonymized) sample events?


_gkollias
Builder

There's a ton of fields being extracted. I have data coming up with this:

index=contract_gateway sourcetype=esb_audit esb_top_level=1 status="SUCCESS" NOT bp_bp_name="Task"
    svc_context_name="SplseDiversity.channel.*" OR
    svc_context_name="StaplesOAG:receive" OR
    svc_context_name="ediversity.cXML:receive"
| transaction svc_context_name startswith=eval(status=="START") endswith=eval(status=="SUCCESS") keepevicted=true
| search closed_txn=1

I notice that the transaction command really slows down the search because of all of the data it pulls out.


somesoni2
Revered Legend

You need to have some field that uniquely identifies each order within the ordertype you selected, like an order ID or order name. Use that in the transaction command.
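
For example, assuming a hypothetical unique identifier field such as order_id (bp_context_id from your field list might play this role, but I don't know your data), the transaction would be keyed on that field rather than on the category:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name="$svc_context_name$"
| transaction order_id startswith=eval(status=="START") endswith=eval(status=="SUCCESS") keepevicted=true
| search closed_txn=1
| table order_id, svc_context_name, duration

This way each order becomes its own transaction with its own duration, instead of all orders of one type collapsing together.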


_gkollias
Builder

Thank you! Hmm, I'm not seeing any results using the above search. I'll keep playing around with it a bit. Really appreciate your help!


_gkollias
Builder

Here is a portion of what I'm seeing:
ordertype 02/26/2014 08:00:03.162 02/26/2014 08:59:06.393 START SUCCESS Sucessfully wrote PO


_gkollias
Builder

Thank you!! It looks like the duration is being summed up. For example, the duration for all orders is greater than 3300.000 seconds, but the times are also being spread out by the hour, so maybe that is the total duration for the entire hour?


_gkollias
Builder

I can now see individual orders within a specific ordertype, but the stats min(_time) as start_time, max(_time) as end_time portion seems to be screwy: both start_time and end_time are the same, leaving duration at 0.000. Would you know what is causing this?


_gkollias
Builder

Thanks for your response.

I edited my search to this:

index=cg sourcetype=esb_audit esb_top_level=1 NOT bp_bp_name="*Task" svc_context_name="$svc_context_name$"
| stats min(_time) as start_time, max(_time) as end_time, earliest(status) as start_status, latest(status) as exit_status, first(other) as exit_message, last(total) as total by svc_context_name, _time
| eval duration = (end_time - start_time)
| convert ctime(start_time)
| convert ctime(end_time)
| search exit_status="SUCCESS"
| table svc_context_name, start_time, end_time, start_status, exit_status, exit_message, duration


linu1988
Champion

Posting a sample table will help us get there.
