Splunk Search

How to sum the values in a field over a specific time span?

chatham
Engager

New to Splunk!

I'm currently having trouble trying to sum values in a field over a specific time span...

My search:

*HttpRequestProcessor | rex field=LogLine "(?<Time>\s\d+\s)" | rex field=TimeStamp_Thread "(?<dt2>[\d]{4}-[\d]{2}-[\d]{2} [\d]{1,2}:[\d]{1,2}[\d]{1,2}:[\d]{2}.[\d]{3})"  | convert num(Time)    | eval time5=strptime(dt2,"%Y-%m-%d %H")  | eval _time=time5  | bucket _time span=1h | stats count(Time) by _time

Instead of count(Time), what I really want is sum(Time). However, when I use that syntax, no stats are returned... what am I missing here?

Thanks!

1 Solution

sk314
Builder

As extracted by your regex, Time has spaces in it. Maybe that's why sum(Time) fails.

Try this instead:

 *HttpRequestProcessor | rex field=LogLine "\s(?<Time>\d+)\s" | rex field=TimeStamp_Thread "(?<dt2>[\d]{4}-[\d]{2}-[\d]{2} [\d]{1,2}:[\d]{1,2}[\d]{1,2}:[\d]{2}.[\d]{3})"  | convert num(Time)    | eval time5=strptime(dt2,"%Y-%m-%d %H")  | eval _time=time5  | bucket _time span=1h | stats sum(Time) by _time
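To see the difference outside Splunk, here is a quick Python sketch. Note that Python's re module spells named groups as (?P&lt;name&gt;...) whereas Splunk's rex accepts (?&lt;name&gt;...); the sample log line below is made up for illustration.

```python
import re

line = "HttpRequestProcessor took 1234 ms"

# Original pattern: the \s on each side sits INSIDE the capture group,
# so Time is captured with leading/trailing spaces -> " 1234 "
m1 = re.search(r"(?P<Time>\s\d+\s)", line)
print(repr(m1.group("Time")))   # ' 1234 '  (not a clean number)

# Fixed pattern: \s moved OUTSIDE the group, so only the digits are
# captured -> "1234", which aggregates cleanly as a number
m2 = re.search(r"\s(?P<Time>\d+)\s", line)
print(repr(m2.group("Time")))   # '1234'
```

Splunk's convert num(Time) quietly drops values it cannot coerce to a number, which is why stats sum(Time) returned nothing with the original capture.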

chatham
Engager

Genius! That worked! Thanks a lot, sk314!
