Splunk Search

Why are the values disappearing in timechart and sparklines?

tschrantz
New Member

I have an intermittent problem with some of my timecharts and sparklines, where the results will start to render correctly, but when the search finalizes, the graphs become very spiky and start showing very low values.

For example:

[Screenshot: the finalized chart, very spiky, with values of only 1-2]

Each one of those bumps has a value of 1 or 2, but I'm expecting values in the neighborhood of 10-20 for every data point. And when the graphs first start filling in, that's what I see. Here's what the same query looks like when it first starts showing the preview, before the search is finalized:

[Screenshot: the same search while still previewing, showing the expected values]

I've also noticed that the width of the sparklines changes when this happens, as if the time range is switching when the search finalizes and the end result has a different number of time bins.

The queries themselves are fairly simple:

index=frontend sourcetype=frontend_iis | stats count, sparkline(count) as trend by c_ip | sort -count

It's more prevalent with sparklines, but I've seen the same sort of thing happen with timecharts. When a timechart is affected, all of the individual entries disappear, and I'm left with only NULL or OTHER.
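For the timechart case it's essentially the split-by equivalent, something like this (reconstructed for illustration; the exact split-by field may differ):

index=frontend sourcetype=frontend_iis | timechart count by c_ip

(NULL and OTHER are the series timechart adds by default for events with no c_ip and for series beyond the default limit, so when the real series drop out those are all that's left.)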

This doesn't always happen, but it happens enough to be annoying. Rerunning the query doesn't seem to fix it.

Inspect Job and the search.log don't appear to have any errors.

Has anyone seen something like this?


mayurr98
Super Champion

Can you try this:

index=frontend sourcetype=frontend_iis | stats sparkline count by c_ip | sort -count

Or try specifying a span explicitly:

index=frontend sourcetype=frontend_iis | stats sparkline(count,1h) as trend count by c_ip | sort -count

where 1h is the span; set it according to your time range.
Also, are you using any acceleration, and what is the time range for the searches?
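For the timechart panels you could pin the span there as well, so the preview and finalized results bucket the same way (the 5m span and the limit are just example values):

index=frontend sourcetype=frontend_iis | timechart span=5m limit=10 count by c_ip

If the fixed-span version stays stable after finalizing, that points at the automatic span selection rather than the data itself.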


tschrantz
New Member

When I explicitly set the span, it seems to work. However, these graphs are on a dashboard with a time selector, so the time range is variable. The default time range is 4 hours, so something like a 5m span would be fine for that, but it would be terrible for a 15-minute view or a 7-day view.
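One idea I'm considering (just a sketch, the token name and span thresholds are made up): have a small base search in the dashboard derive a span from whatever the time picker is set to, push it into a token, and reference that token in the sparkline. The span-picking search itself is plain SPL:

| makeresults
| addinfo
| eval range_sec = info_max_time - info_min_time
| eval span = case(range_sec <= 3600, "1m", range_sec <= 86400, "5m", range_sec <= 604800, "1h", true(), "1d")

With the dashboard's earliest/latest applied to that search, addinfo exposes the selected range as info_min_time/info_max_time (a completely unbounded "All time" range would need special handling). A done/set-token handler in the Simple XML could then copy $result.span$ into a $span_tok$ token, and the panel query would become index=frontend sourcetype=frontend_iis | stats count, sparkline(count, $span_tok$) as trend by c_ip | sort -count.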
