What is the best way to determine how many transactions per second are occurring in our application logs? I attempted using " ... | bucket _time span=1s | stats count by _time" but received a bucket span error because the search would result in more than 50,000 bins. I also tried the timechart per_second function, but it does not provide the data I am looking for. Would stats be the best command to use? There is also the localize command, but I am not sure what its count and density fields actually represent.
Here are some options:
1) <your transaction search> | timechart count span=1s
However, if this returns more than 50,000 results it won't work and you'll get that bucketing error again.
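One way to stay under that limit (just a sketch of the idea, not something you have to do exactly this way) is to restrict the time range of the search; a one-hour window at one-second resolution is only 3,600 buckets. For example:
<your transaction search> earliest=-1h@h latest=@h | timechart count span=1s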
2) Another idea is to use per_second. Confusingly, per_second needs a numeric quantity, but the good news is that you can just make one with eval. =) Try this:
<your transaction search> | eval count=1 | timechart per_second(count) as transactions_per_second
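If you also want the peak rate over the search window, a hedged follow-on (my own addition, not part of the original answer) is to pipe that result into stats:
<your transaction search> | eval count=1 | timechart per_second(count) as transactions_per_second | stats max(transactions_per_second) as peak_tps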
3) I use the timechart command as well, but in the Summary Index context. Run this search once per hour (or whatever timeframe reduces the results enough to make it work):
<your transaction search> | sitimechart span=1s count
Access the results with:
index=summary search_name="Summary Logins Per Second" | timechart span=1s count
Unfortunately, that means 86,400 results per 24-hour period, so reporting over longer ranges will still require some tinkering.
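One hedged way to do that tinkering (this is my own extension of the approach above, and the search name is just the one used in the examples): schedule a second rollup search once per hour that collapses the previous hour's per-second counts into a single peak/average figure, and save it with summary indexing enabled (or append | collect index=summary). Something like:
index=summary search_name="Summary Logins Per Second" earliest=-1h@h latest=@h | timechart span=1s count | stats max(count) as peak_tps avg(count) as avg_tps
Longer-range reports can then read the hourly peak/average events instead of the raw per-second counts, which keeps the result volume manageable.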