Splunk Search

appendcols subsearch skewing timestamps in some circumstances

benhooper
Communicator

I'm working with a system where each event has its own creation timestamp (which never changes) and a modification timestamp.

Currently, I want to get the most recent resolved event to establish the latest modification time, and the earliest resolved event to establish the actual resolve time. I've achieved this with the following search query:

 

index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby -_time
| table incident_id, status, creation_time, resolve_time, modification_time
| appendcols [
    search index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby +_time
    | eval resolve_time = modification_time
    | reverse
    | table resolve_time ]
| eval modification_time = if(resolve_time == modification_time, "Unchanged", modification_time)
| table incident_id, status, creation_time, resolve_time, modification_time

 

It can be hard to visualise this, so I've illustrated it below:

[Screenshot: Splunk 8.1.0 search results, 2020/11/11 10:36:25]

From the above screenshot, you can see that the incident with ID 1735 was initially resolved on 2020/11/06 at 11:56:27 but was modified twice, once at 11:57:54 and again at 15:02:42.

When the search only includes one post-resolution modification, this is exactly what's reported, as can be seen in the following screenshot:

[Screenshot: Splunk 8.1.0 search results, 2020/11/11 10:52:49]

However, the strange thing is that, when there's a second post-resolution modification, the timestamps get skewed into the future - resolve_time is supposed to remain as 2020/11/06 11:56:27.351 but gets changed to 2020/11/06 14:52:13.822. This can be seen in the following screenshot:

[Screenshot: Splunk 8.1.0 search results, 2020/11/11 11:00:18]

Through testing, I have:

  1. Verified that the two searches work exactly as intended when copied and pasted into standalone searches.
  2. Found that all timestamps (_time, creation_time, and modification_time) within the appendcols subsearch are skewed.

Even more bizarrely, the timestamps that are output don't appear anywhere in the raw events. It's almost as if they're the result of some kind of search-time calculation.

Why is this happening?


richgalloway
SplunkTrust

Be careful when using appendcols.  The results of the subsearch are added to the results of the main search without regard to context.  In other words, the first result of the subsearch is appended to the first result of the main search no matter what.

In the query here, the main search results are sorted in descending order by time whereas the subsearch is sorted in ascending order by time.  That would appear to be inviting an erroneous appending of a field.
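To illustrate the positional pairing, here's a toy search (a rough sketch using makeresults rather than your data) where the subsearch's values are deliberately generated in the opposite order - each sub_N still lands on whichever main-search row happens to sit at the same position, regardless of content:

| makeresults count=3
| streamstats count as row
| eval main_id = "main_" . row
| table row, main_id
| appendcols [
    | makeresults count=3
    | streamstats count as row
    | eval sub_id = "sub_" . (4 - row)
    | table sub_id ]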

When the number and sequence of events is not guaranteed to be the same in both the main and sub searches, don't use appendcols.  Try append, instead, and use stats to merge the two result sets.

 

index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby -_time
| table incident_id, status, creation_time, resolve_time, modification_time
| append [
    search index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby +_time
    | eval resolve_time = modification_time
    | reverse
    | table resolve_time ]
| stats values(*) as * by incident_id
| eval modification_time = if(resolve_time == modification_time, "Unchanged", modification_time)
| table incident_id, status, creation_time, resolve_time, modification_time

 

---
If this reply helps you, Karma would be appreciated.

benhooper
Communicator

Hi @richgalloway ,

Thanks for the response.

Yes, but I believe I countered the differing sort orders with the reverse?

I tried that but the resolve_time column is simply blank, unfortunately.
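Could the blank column be because the subsearch's table only keeps resolve_time, so the appended rows have no incident_id for the stats to group on? If so, something like this (just a guess on my part, untested) might let the two sets merge:

index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby -_time
| table incident_id, status, creation_time, modification_time
| append [
    search index=<index name> | where match(status, "resolved") | dedup 1 incident_id sortby +_time
    | eval resolve_time = modification_time
    | table incident_id, resolve_time ]
| stats values(*) as * by incident_id
| eval modification_time = if(resolve_time == modification_time, "Unchanged", modification_time)
| table incident_id, status, creation_time, resolve_time, modification_time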


richgalloway
SplunkTrust

Yes, the reverse should account for the different sort orders.  Still, one must be very careful about the exact results of the subsearch when using appendcols.

I believe I see the error in my previous query.  Please try this revision.

index=<index name> | where match(status, "resolved") 
| stats earliest(resolve_time) as resolve_time, latest(resolve_time) as modification_time, first(status) as status, first(creation_time) as creation_time by incident_id
| eval modification_time = if(resolve_time == modification_time, "Unchanged", modification_time)
| table incident_id, status, creation_time, resolve_time, modification_time
---
If this reply helps you, Karma would be appreciated.

benhooper
Communicator

Hi @richgalloway ,

Thanks again but, unfortunately, a similar problem - the resolve_time and modification_time columns are blank.

Thanks.


richgalloway
SplunkTrust

I'm not seeing where my mistake is.  When you look at the Events tab, do you see the resolve_time field?

---
If this reply helps you, Karma would be appreciated.

benhooper
Communicator

Hi @richgalloway ,

My apologies. I had a lot going on this morning so I ran your query without checking it properly.

The field resolve_time doesn't exist in the original events, which explains what I was seeing. I've replaced both instances of (resolve_time) with (modification_time) and it appears to be working perfectly, and it's significantly simpler! Thank you very much!
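For anyone finding this later, the resulting query (your revision with both instances of (resolve_time) swapped for (modification_time)) should look like this:

index=<index name> | where match(status, "resolved")
| stats earliest(modification_time) as resolve_time, latest(modification_time) as modification_time, first(status) as status, first(creation_time) as creation_time by incident_id
| eval modification_time = if(resolve_time == modification_time, "Unchanged", modification_time)
| table incident_id, status, creation_time, resolve_time, modification_time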

[Screenshot: Splunk 8.1.0 search results, 2020/11/12 14:51:23]

I'm still confused as to why my original search was acting that way in those circumstances, but at least I have a solution now.

Thanks again!


richgalloway
SplunkTrust

If your problem is resolved, then please click the "Accept as Solution" button to help future readers.

---
If this reply helps you, Karma would be appreciated.