Splunk Search

Transactions Per Second

kbecker
Communicator

What is the best way to determine how many transactions per second are occurring in our application logs? I attempted using " ... | bucket _time span=1s | stats count by _time" but received a bucket span error because the search would result in more than 50,000 bins. I also tried the timechart per_second function, but it did not provide the data I am looking for. Would stats be the best command to use? There is also the localize command, but I am not sure what its count and density fields actually represent.

1 Solution

sideview
SplunkTrust

Here are some options:

1) <your transaction search> | timechart count span=1s

However, if this returns more than 50,000 results it won't work and you'll get that same bucketing error. (At span=1s, 50,000 bins is only about 13.9 hours, so any longer time range will hit the limit.)

2) Another idea is to use per_second. Confusingly, per_second needs a numeric field. The good news is that you can just make one with eval. =) Try this:

<your transaction search> | eval count=1 | timechart per_second(count) as transactions_per_second
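For example, assuming a hypothetical sourcetype of app_logs where each completed transaction logs one event, that would look something like:

sourcetype=app_logs "transaction complete" | eval count=1 | timechart per_second(count) as transactions_per_second

The per_second function normalizes each bucket's sum to a per-second rate, so it should still give transactions per second even when timechart picks a span wider than one second.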


twinspop
Influencer

I use the timechart command, but in the Summary Index context. Run this search once per hour (or whatever timeframe reduces the results enough to make it work).

<your transaction search> | sitimechart span=1s count

Access the results with:

index=summary search_name="Summary Logins Per Second" | timechart span=1s count

Unfortunately, that means 86,400 results per 24-hour period, so reporting over longer ranges will still require some tinkering.
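If it helps, here is a minimal sketch of the savedsearches.conf stanza behind that kind of scheduled search, assuming it is named "Summary Logins Per Second", runs at the top of every hour over the previous hour, and writes to the default summary index (adjust the base search and names for your environment):

[Summary Logins Per Second]
# run over the previous whole hour, at the top of every hour
search = <your transaction search> | sitimechart span=1s count
dispatch.earliest_time = -1h@h
dispatch.latest_time = @h
cron_schedule = 0 * * * *
enableSched = 1
# write the results into the summary index
action.summary_index = 1
action.summary_index._name = summary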

