Splunk Search

How do I plot crash rate over _time by app_name?

Shashank_87
Explorer

Hi, I am trying to plot the crash rate over _time on a graph, split by app_name.
At a high level, I want to know the crash rate of different apps over time.
Below is the search I am using, but it is not working: the subsearch is returning incorrect data.

For some time buckets, the "CrashCount" value comes out higher than "TotalCount", which makes crash_rate more than 100%.
I cannot figure out what is wrong in the query.
Any help will be much appreciated.

index=test sourcetype=test:ping app_name="*_App" 
| bin _time span=1h
| stats distinct_count(uid) as "TotalCount" by app_name _time
| appendcols 
    [ search index=test (sourcetype=test:error AND value=false) app_name="*_App" 
    | bin _time span=1h
    | stats distinct_count(uid) as "CrashCount" by app_name _time ] 
| eval crash_rate=round(('CrashCount'/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) as crash_rate

uid identifies a user accessing or crashing the app, so distinct_count(uid) gives the number of users.
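
A note on the failure mode, since it explains the over-100% rates: appendcols pastes the subsearch's columns onto the main results strictly by row order; it does not match rows on app_name or _time. Whenever the two stats outputs contain different sets of app/time rows, a CrashCount from one app or hour lands beside the TotalCount of another. A minimal sketch with synthetic makeresults data (my illustration, not from the thread; the triple-backtick comments need Splunk 9.0 or later) reproduces the misalignment:

| makeresults
| eval app_name="A", TotalCount=10
| append
    [| makeresults
    | eval app_name="B", TotalCount=20]
``` main results now have two rows: app A, then app B ```
| appendcols
    [| makeresults
    | eval app_name="B", CrashCount=5]
``` appendcols fills by position, so app B's CrashCount=5 lands on row 1, app A ```

The output shows CrashCount=5 on app A's row even though it was produced for app B, which is exactly how a bucket's CrashCount can exceed its TotalCount.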

1 Solution

Vijeta
Influencer

@Shashank_87 Try the search below:

index=test (sourcetype=test:ping OR (sourcetype=test:error AND value=false)) app_name="*_App"
| bin _time span=1h
| dedup uid app_name sourcetype _time
| stats count(eval(sourcetype="test:ping")) as "TotalCount", count(eval(sourcetype="test:error")) as "CrashCount" by app_name _time
| eval crash_rate=round(('CrashCount'/'TotalCount')*100,2)
| timechart span=1h max(crash_rate) as crash_rate
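
This works because both counts come from a single event set, binned and grouped by app_name and _time together, so there is nothing to misalign. One caveat: the final timechart takes max(crash_rate) across all apps, so the chart no longer splits by app_name as the question asked. A small variation (my assumption, not part of the accepted answer) keeps one series per app:

index=test (sourcetype="test:ping" OR (sourcetype="test:error" AND value=false)) app_name="*_App"
| bin _time span=1h
| dedup uid app_name sourcetype _time
| stats count(eval(sourcetype="test:ping")) as TotalCount, count(eval(sourcetype="test:error")) as CrashCount by app_name _time
| eval crash_rate=round((CrashCount/TotalCount)*100,2)
``` the by clause draws one crash_rate line per app instead of collapsing them with max() ```
| timechart span=1h max(crash_rate) by app_name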



Shashank_87
Explorer

Hi Vijeta, that's bang on!
That's exactly what I needed. Thank you.
