Splunk Search

xyseries drops result with duplicate timestamp

Splunk Employee

I have this data:

| _time | EventCode | Message |
|---|---|---|
| 2020-06-16T19:48:53+00:00 | 4136 | Too late now |
| 2020-06-16T19:49:53+00:00 | 1234 | I don't know |
| 2020-06-16T19:50:53+00:00 | 3456 | Too busy, ask again later |
| 2020-06-16T19:51:53+00:00 | 1256 | Everything is happening at once |
| 2020-06-16T19:51:53+00:00 | 1257 | And right now, as well |

 

Notice that the last 2 events have the same timestamp.

I untable the events using this syntax:

...| untable _time FieldName FieldValue

The results look like this:

| _time | FieldName | FieldValue |
|---|---|---|
| 2020-06-16 12:51:53 | EventCode | 1257 |
| 2020-06-16 12:51:53 | Message | And right now, as well |
| 2020-06-16 12:51:53 | EventCode | 1256 |
| 2020-06-16 12:51:53 | Message | Everything is happening at once |
| 2020-06-16 12:50:53 | EventCode | 3456 |
| 2020-06-16 12:50:53 | Message | Too busy, ask again later |
| 2020-06-16 12:49:53 | EventCode | 1234 |
| 2020-06-16 12:49:53 | Message | I don't know |
| 2020-06-16 12:48:53 | EventCode | 4136 |
| 2020-06-16 12:48:53 | Message | Too late now |

 

When I reverse the untable by using the xyseries command, one of the two events with the duplicate timestamp gets dropped:

...| xyseries _time FieldName FieldValue

| _time | EventCode | Message |
|---|---|---|
| 2020-06-16 12:48:53 | 4136 | Too late now |
| 2020-06-16 12:49:53 | 1234 | I don't know |
| 2020-06-16 12:50:53 | 3456 | Too busy, ask again later |
| 2020-06-16 12:51:53 | 1256 | Everything is happening at once |

 

The event with the EventCode 1257 and Message "And right now, as well" is missing.

It's like the xyseries command performs a dedup on the _time field.

Is this a known issue?

Some suggested that I avoid using _time in the untable and xyseries commands and instead do this:

...
| streamstats count as recno
| untable recno FieldName FieldValue
| xyseries recno FieldName FieldValue

This does return all 5 events.


SplunkTrust

That is how xyseries and untable are defined.

If you untable by a key field and there are duplicate values of that field, the duplicates will be combined by xyseries.

So, you can either create unique record numbers, the way you did, or, if you want to explicitly combine the duplicates and retain all the values in a multivalue field, you can do something a little more complicated, like this (the "!!!!" delimiter just needs to be a string that never appears in your data):

 

| untable _time FieldName FieldValue
| stats list(FieldValue) as FieldValue by _time FieldName
| eval FieldValue=mvjoin(FieldValue,"!!!!")
| xyseries _time FieldName FieldValue
| foreach * [| makemv delim="!!!!" <<FIELD>>]



Path Finder

Hi! You might be able to help with a similar problem. I have a table that looks like this (the dates are due dates):

| Assignee | 2020-Dec | 2020-Jan | 2020-Mar | 2020-Apr | 2020-May |
|---|---|---|---|---|---|
| john | Proj_1 | | | | |
| rose | | Proj_2 | Proj_3 | | |
| mike | | | | Proj_4 | |

 

What it doesn't show is additional projects that are assigned to john and have the same due date. So really the table should look like this:

| Assignee | 2020-Dec | 2020-Jan | 2020-Mar | 2020-Apr | 2020-May |
|---|---|---|---|---|---|
| john | Proj_1, Proj_5, Proj_7 | | | | |
| rose | | Proj_2 | Proj_3 | | |
| mike | | | | Proj_4 | |

 

The search is pretty simple:

| eval duedate = strftime(duedate,"%Y-%b")
| stats max(duedate) as duedate by Assignee, Proj
| xyseries Assignee duedate Proj
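
Maybe the mvjoin approach from the solution above could be adapted here; a rough, untested sketch (keeping my field names Assignee, duedate, and Proj):

```
| eval duedate = strftime(duedate,"%Y-%b")
| stats max(duedate) as duedate by Assignee, Proj
| stats list(Proj) as Proj by Assignee, duedate
| eval Proj=mvjoin(Proj,"!!!!")
| xyseries Assignee duedate Proj
| foreach * [| makemv delim="!!!!" <<FIELD>>]
```

The idea being that projects sharing an assignee and due date get collected into one multivalue cell instead of being collapsed to a single value.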

 

Thank you in advance!


Splunk Employee

Thanks @DalJeanis - I received feedback on the original example which showed the _time field instead of the recno field.  I appreciate the confirmation about duplicates. I've updated the docs (in the Usage section) for both untable and xyseries to explain about the duplicates.