Splunk Search

How do I plot crash rate over _time by app_name?

Shashank_87
Explorer

Hi, I am trying to plot the crash rate over _time on a graph, split by app_name.
At a high level, I want to know the crash rate of different apps over time.
Below is the search I am using, but it is not working. It is returning incorrect data from the subsearch.

For some time buckets, the "CrashCount" value comes out higher than "TotalCount", which makes crash_rate more than 100%.
I am not able to figure out what is wrong with the query.
Any help will be much appreciated.

index=test sourcetype=test:ping app_name="*_App" 
| bin _time span=1h
| stats distinct_count(uid) as "TotalCount" by app_name _time
| appendcols 
    [ search index=test (sourcetype=test:error AND value=false) app_name="*_App" 
    | bin _time span=1h
    | stats distinct_count(uid) as "CrashCount" by app_name _time ] 
| eval crash_rate=round(('CrashCount'/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) as crash_rate

uid - identifies a user accessing or crashing the app (the distinct count of uid gives the user count).
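Note on why the numbers can misalign: appendcols does not match the subsearch rows to the main search by app_name and _time; it simply pastes the columns on side by side in row order. If the subsearch has no crashes for some app in a given hour, the rows shift and a CrashCount lands next to a TotalCount for a different app or hour, which is how the rate can exceed 100%. A minimal sketch that keeps the two-search shape but aligns on the keys by using join instead of appendcols (an illustration only, assuming the same fields; the accepted answer below avoids the second search entirely):

index=test sourcetype=test:ping app_name="*_App"
| bin _time span=1h
| stats distinct_count(uid) as "TotalCount" by app_name _time
| join type=left app_name _time
    [ search index=test sourcetype=test:error value=false app_name="*_App"
    | bin _time span=1h
    | stats distinct_count(uid) as "CrashCount" by app_name _time ]
| eval crash_rate=round((coalesce('CrashCount',0)/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) as crash_rate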

1 Solution

Vijeta
Influencer

@Shashank_87 Try below-

index=test (sourcetype=test:ping OR (sourcetype=test:error AND value=false)) app_name="*_App"
| bin _time span=1h
| dedup uid app_name sourcetype _time
| stats count(eval(sourcetype="test:ping")) as "TotalCount", count(eval(sourcetype="test:error")) as "CrashCount" by app_name _time
| eval crash_rate=round(('CrashCount'/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) as crash_rate
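If you also want one series per app on the chart (the original ask was to split by app_name) rather than the single maximum across all apps, a small variation of the search above - same fields, only the final timechart changed - would be to split by app_name, e.g.:

index=test (sourcetype=test:ping OR (sourcetype=test:error AND value=false)) app_name="*_App"
| bin _time span=1h
| dedup uid app_name sourcetype _time
| stats count(eval(sourcetype="test:ping")) as "TotalCount", count(eval(sourcetype="test:error")) as "CrashCount" by app_name _time
| eval crash_rate=round(('CrashCount'/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) by app_name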

Shashank_87
Explorer

Hi Vijeta, that's bang on!
That's exactly what I needed. Thank you.
