Splunk Search

Calculating throughput

ghildiya
Explorer

In my Splunk logs, I need to monitor some specific events. The identifier I use to target those events is the text 'EVENT_PROCESSED'. So my search query is:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*
It fetches all of my target events. Please note that EVENT_PROCESSED is not an extracted field; it is just text in the event logs.

Now my aim is to find the throughput for these events. So I do this:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | timechart span=1s count as throughput
Is this the correct way of determining the throughput rate? If I change span to some other value, say 1h, then I change it to:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | timechart span=1h count/3600 as throughput
Is this the correct way?
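As far as I know, timechart does not accept an inline arithmetic expression such as count/3600, so the second query is not valid SPL. A common pattern (a sketch, reusing the same placeholder index and host names from the question) is to let timechart produce the hourly count and then convert it with eval:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*
| timechart span=1h count
| eval throughput=round(count/3600,2)

Here count is events per hour, and dividing by 3600 converts it to events per second.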


spitchika
Path Finder

With your first query (span=1s), count already is the throughput in "per second" units. With span=1h, you can still use count alone and just report the throughput in "per hour" units. If you still want to convert, store the count in another field, e.g. | eventstats count as "TotalCount", then do the calculation using eval.
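Spelled out per time bucket (a sketch; bin and stats are standard SPL, but the field names are illustrative), that conversion could look like:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*
| bin _time span=1h
| stats count as TotalCount by _time
| eval throughput=TotalCount/3600

Note that eventstats counts events across the whole search window, so TotalCount/3600 only equals events per second if the search spans exactly one hour; the bin/stats form computes the rate for each hour separately.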


spitchika
Path Finder

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | eventstats count as "TotalCount" | eval throughput=TotalCount/3600 | timechart span=1h values(throughput)

Your query might look like this.


ghildiya
Explorer

This displays graphs with dots, even for the line chart, while a line chart is expected to show continuous curves.
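One likely cause (an assumption, since the chart itself is not shown here): values(throughput) leaves time buckets with no matching events as null, and null points render as disconnected dots. Filling those gaps with fillnull should restore a continuous line, e.g.:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*
| eventstats count as "TotalCount"
| eval throughput=TotalCount/3600
| timechart span=1h values(throughput) as throughput
| fillnull value=0 throughput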


spitchika
Path Finder
Let me check

spitchika
Path Finder

[screenshot: spitchika_0-1595869228904.png]

This works perfectly for your requirement.

index=abc host=* source=/var/opt/appworkr/logs/logname "item"
| timechart span=1h count
| eval Throughput=round(count/3600,0)
| timechart span=1h values(Throughput)
