Splunk Search

Transaction by time range and calculate

lain179
Communicator

I have a log that tracks the number of documents. I would like to know how to find and table/graph the number of NEW documents for each 15-minute interval.

The log looks like this (in this example, I need to find 1667 - 53 = 1614 new docs for the first 15 minutes):

2013-02-27 09:43:01 Found: 68 Sent: 53 ( New: 53 )
2013-02-27 09:45:23 Found: 307 Sent: 290 ( New: 290 )
2013-02-27 09:48:50 Found: 493 Sent: 476 ( New: 476 )
2013-02-27 09:50:37 Found: 820 Sent: 803 ( New: 803 )
2013-02-27 09:52:29 Found: 1025 Sent: 1008 ( New: 1008 )
2013-02-27 09:55:01 Found: 1294 Sent: 1277 ( New: 1277 )
2013-02-27 09:57:03 Found: 1445 Sent: 1428 ( New: 1428 )
2013-02-27 09:58:45 Found: 1682 Sent: 1667 ( New: 1667 )
2013-02-27 09:59:07 Found: 1847 Sent: 1830 ( New: 1830 )

1 Solution

cramasta
Builder

You could do this:

First, extract the document count as a field called doccount (this is the value after "New:" in each event).

The search would be:

... | bucket _time span=15min | stats min(doccount) as mindoc max(doccount) as maxdoc by _time | eval newdocuments=maxdoc-mindoc | table _time newdocuments
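
If doccount isn't already an extracted field, you could pull it out inline with rex. A minimal sketch, assuming the raw events look exactly like the sample above (the field name doccount and the regex are just illustrations):

... | rex "New:\s+(?<doccount>\d+)" | bucket _time span=15min | stats min(doccount) as mindoc max(doccount) as maxdoc by _time | eval newdocuments=maxdoc-mindoc | table _time newdocuments

Since "New:" is a running total within the window, the max minus min inside each 15-minute bucket gives the number of documents added during that bucket.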

lain179
Communicator

Exactly what I needed. Thank you so much!
