Splunk Search

Mean Time To Triage

ymalm188
Explorer

I have this SPL:

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| `get_correlations` 
| join rule_id 
    [| from inputlookup:incident_review_lookup 
     | eval _time=time 
     | stats earliest(_time) as review_time by rule_id] 
| eval ttt=review_time-_time 
| stats count,avg(ttt) as avg_ttt,max(ttt) as max_ttt by rule_name 
| sort - avg_ttt 
| `uptime2string(avg_ttt, avg_ttt)` 
| `uptime2string(max_ttt, max_ttt)` 
| rename *_ttt* as *(time_to_triage)* 
| fields - *_dec

It should display the mean time to triage over the last 14 days, but it returns no results for a 14-day time range while it works for 30 days.

Any advice?


ymalm188
Explorer

Yes, your description is exactly right. So why can't I find any results for the last 14 days, even though there is actual data in those 14 days? It was working before and suddenly stopped working.


bowesmana
SplunkTrust

So, if I understand you correctly, you get results from the first part of the search over 14 days without the join, but you are now saying that the full search over 14 days returns no results?

The search has 3 parts:

  1. The base tstats from datamodel
  2. The join statement
  3. Aggregations based on information from 1 and 2

So, run the second part of the search

| from inputlookup:incident_review_lookup 
| eval _time=time 
| stats earliest(_time) as review_time by rule_id

Then, if that gives you data and you KNOW that there is a rule_id that is common to both parts 1 and 2, then it is the 3rd part of the search that does not have the right fields available.

The way to diagnose this is to gradually build up the search, adding each PIPE section one at a time, to understand what is causing the data to disappear.
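
For example, one intermediate stage worth running on its own (an untested sketch, reusing the same macros and lookup as your full search but stopping just before the final stats) is

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| `get_correlations` 
| join rule_id 
    [| from inputlookup:incident_review_lookup 
     | eval _time=time 
     | stats earliest(_time) as review_time by rule_id] 
| eval ttt=review_time-_time 
| table rule_id rule_name _time review_time ttt

If review_time is already empty here, the join is where the rows disappear; if both times are populated but the final stats still returns nothing, the problem is in the aggregation part.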


ymalm188
Explorer

When I run the first part I get results, and the same for the second part, but when I run them together I get no data.

Like this:

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| `get_correlations` 
| join rule_id 
    [| from inputlookup:incident_review_lookup 
     | eval _time=time]

I think there is a problem with the join.


bowesmana
SplunkTrust

So if part 1 and part 2 are successful in their own right, then the issue is either

  1. the field rule_id is not in the first data set
  2. the field rule_id is not in the incident_review_lookup lookup
  3. there are no common instances of rule_id in both data sets
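
One way to check point 3 without a join (a rough diagnostic only, assuming the same datamodel and lookup names as above) is to pull rule_id from both sides and keep only the values that appear in both:

| tstats `summariesonly` count from datamodel=Incident_Management.Notable_Events_Meta by Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| eval side="datamodel" 
| append 
    [| from inputlookup:incident_review_lookup 
     | stats count by rule_id 
     | eval side="lookup"] 
| stats values(side) as side by rule_id 
| where mvcount(side)=2

If that returns no rows over your 14-day window, the join has nothing to match on.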

 


bowesmana
SplunkTrust

What is the time window of your search?


ymalm188
Explorer

14 days


bowesmana
SplunkTrust

The query seems to be getting its times from inside the join subsearch. If you run just this part of the query, what time range of data do you get back?

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| `get_correlations`
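
One quick way to see that (a small sketch bolted onto the same base query) is to summarise the _time values it returns:

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id 
| `drop_dm_object_name("Notable_Events_Meta")` 
| `get_correlations` 
| stats min(_time) as earliest_notable max(_time) as latest_notable 
| convert ctime(earliest_notable) ctime(latest_notable)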

ymalm188
Explorer

It returned data for any time range I specified, including 14 days, which is what I want. So I think the problem is with the join.


bowesmana
SplunkTrust

Yes - the join is doing calculations with times taken from the lookup for each rule_id. It calculates a review time for each notable found in the 14-day search and then works out how long that notable took to review (ttt=review_time-_time).

That seems to be the intention of the search: it looks for activity in the last 14 days and then works out how long each incident took to review, which of course has to look back to when the rule was originally triggered, so you will see data going back further than the search window.
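
If you want to see how far back the review data itself goes (a small sketch, using the same lookup and time field as your subsearch), you can check the range of review times in the lookup:

| from inputlookup:incident_review_lookup 
| eval _time=time 
| stats earliest(_time) as first_review latest(_time) as last_review 
| convert ctime(first_review) ctime(last_review)

If first_review is earlier than the start of your 14-day window, that confirms the search is reaching back beyond the window by design.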

So, is there actually a problem?

 
