Splunk Search

Calculating throughput

ghildiya
Explorer

In Splunk logs, I have to monitor some specific events. The identifier I use to target those events is the text 'EVENT_PROCESSED'. So my search query is:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*

It fetches all of my target events. Please note that EVENT_PROCESSED is not an extracted field; it is just text in the event logs.

Now my aim is to find the throughput for these events. So I do this:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | timechart span=1s count as throughput

Is this the correct way of determining the throughput rate? If I change span to some other value, say 1h, then I change the query to:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | timechart span=1h count/3600 as throughput

Is this the correct way?


spitchika
Path Finder

With your first query (span=1s), count is already the throughput, expressed in events per second. With span=1h, you can still use count alone; the throughput is then expressed in events per hour. If you still want to convert it to a per-second rate, store the count in another variable, e.g. | eventstats count as "TotalCount", then do the calculation using eval.
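One point worth noting about the question's second query: timechart does not accept an inline arithmetic expression such as count/3600, so the division has to happen in a separate eval after the counting. A minimal sketch of that approach, reusing the filters from the question:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED*
| timechart span=1h count as hourly_count
| eval throughput=round(hourly_count/3600, 2)

Here timechart buckets the events per hour, and eval converts each bucket's count into an events-per-second rate.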


spitchika
Path Finder

Your query might look like this:

index=testIndex namespace=testNameSpace host=*testHost* log=*EVENT_PROCESSED* | eventstats count as "TotalCount" | eval throughput=TotalCount/3600 | timechart span=1h values(throughput)


ghildiya
Explorer

This displays graphs with dots, even for a line chart, while a line chart is expected to show a continuous curve.


spitchika
Path Finder
Let me check

spitchika
Path Finder

[Screenshot: spitchika_0-1595869228904.png]

This works perfectly for your requirement.

index=abc host=* source=/var/opt/appworkr/logs/logname "item"
| timechart span=1h count
| eval Throughput=round(count/3600,0)
| timechart span=1h values(Throughput)
