
Problematic behavior with chart and last()

sideview
SplunkTrust

So I have a dashboard and I want to display the most recent value of fieldA, for each value of fieldB and fieldC, shown as a table where values of fieldB are down the left, and values of fieldC are across the top.

This is simple enough -- you just use the chart command.

* | chart last(fieldA) over fieldB by fieldC

and we should end up with a table like:

fieldB         |  fieldC_value1  |  fieldC_value2  |  fieldC_value3  |
fieldB_value1  |      128.3      |     active      |     0.412       |
fieldB_value2  |       99.3      |     active      |     0.31        |

Except my data has some non-numeric values of fieldA (note the 'active' values above), and the chart command doesn't like non-numeric values; it throws them away.

So what happens is that since there is no numeric value of fieldA anywhere in that column, my 'active' column disappears entirely. Likewise chart last(fieldA) comes back null for those values, whereas stats last(fieldA) is always correct.
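
To see the two behaviors side by side in isolation, here's a self-contained sketch (makeresults and all the field values here are invented purely for the demo, and it assumes a version where chart behaves as described). The chart version throws away the string values:

| makeresults count=2
  | streamstats count as n
  | eval fieldB="row".n, fieldC="status", fieldA="active"
  | chart last(fieldA) over fieldB by fieldC

while the equivalent stats version keeps them:

| makeresults count=2
  | streamstats count as n
  | eval fieldB="row".n, fieldC="status", fieldA="active"
  | stats last(fieldA) by fieldB fieldC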

Once you know to look for this spooky behavior it's pretty easy to reproduce. Here's an example:

* | head 10000 | chart last(date_hour) over date_second by date_minute

You may have to adjust the 10000, but this should show a result with 60 rows, one for each value of date_second.

Now sneak in the following eval clause, which sets the date_hour field to "mayhem" whenever the date_second field is equal to 0:

* | head 10000 | eval date_hour=if(searchmatch("date_second=0"),"mayhem",date_hour) | chart last(date_hour) over date_second by date_minute

Now the entire 0th row disappears and you only have 59 results: every cell in that row is the non-numeric "mayhem", chart throws them all away, and the resulting all-null row is dropped.

Question: Is there some search language trick that can get me the end result I want? I don't actually know any of the values ahead of time.

Is there some incantation I can use to simply turn off this behavior in chart?

NOTE: this is essentially the same issue that I brought up in http://answers.splunk.com/questions/2295/how-come-some-fields-disappear-when-they-go-into-timechart-..., except in that case I could work around the problem by using stats by _time, and here I don't see any workaround.

1 Solution

gkanapathy
Splunk Employee

You can still use stats with xyseries:

* | head 10000 
  | eval date_hour=if(searchmatch("date_second=0"),"mayhem",date_hour) 
  | chart last(date_hour) over date_second by date_minute

becomes

* | head 10000 
  | eval date_hour=if(searchmatch("date_second=0"),"mayhem",date_hour) 
  | stats last(date_hour) by date_second, date_minute 
  | xyseries date_second date_minute "last(date_hour)"
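
Translated back to the fields in the original question, the same pattern would be:

* | stats last(fieldA) by fieldB, fieldC 
  | xyseries fieldB fieldC "last(fieldA)"

stats last() keeps string values like 'active' intact, and xyseries then pivots the three flat columns into the same fieldB-down, fieldC-across grid that chart would have produced.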


gkanapathy
Splunk Employee

fixed typo/copy-paste error


sideview
SplunkTrust

Nice. Thanks. I'd forgotten about xyseries. I think there's a typo in your answer though -- either drop the 'as ldr' or replace "last(date_hour)" with "ldr". I did the former and it started working correctly.
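
For reference, the consistently renamed version that comment describes would be:

* | head 10000 
  | eval date_hour=if(searchmatch("date_second=0"),"mayhem",date_hour) 
  | stats last(date_hour) as ldr by date_second, date_minute 
  | xyseries date_second date_minute ldr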


steveyz
Splunk Employee

This should do the trick:

... | stats last(date_hour) by date_second date_minute | xyseries date_second date_minute last(date_hour)
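
This is the same approach as the accepted answer, just with the aggregate field name unquoted. Since the generated field name last(date_hour) contains parentheses, the quoted form "last(date_hour)" is the safer spelling.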