Splunk Search

How to write a search to calculate how much data was indexed every hour for each host?

New Member

My index name is index=windows, and I want to calculate that index's size for every hour, for each host. Please provide the search ASAP.

I need all the hosts in one column and the timestamps in a row, in the format below:

        01:12:00   02:12:00   03:12:00   ...
host1   3.657      3.677      3.689584
host2   ...        ...        ...
host3   ...        ...        ...
...
...

Re: How to write a search to calculate how much data was indexed every hour for each host?

Legend

Here is one approach. It is not a true measurement of size, but it will give you the relative size:

index=windows earliest=-1h | eval size=len(_raw) | stats sum(size) as size by host
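
To get the hourly breakdown the question asks for with this same approach, the stats could be swapped for a timechart — a sketch, still measuring raw event length rather than true indexed size:

index=windows earliest=-24h | eval size=len(_raw) | timechart span=1h sum(size) as bytes by host

Each column is then one host, each row one hour, matching the requested layout (apart from units).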

Re: How to write a search to calculate how much data was indexed every hour for each host?

Champion

Here is a search that reports license usage by host. Note that your format, with time as columns, is unusual for Splunk; the inverse is the norm. The following search reports hosts as columns, with values in megabytes:

index=_internal source=*license_usage.log* type="Usage" idx=windows
| fields _time h b
| eval h=if(len(h)=0 OR isnull(h),"(SQUASHED)",h)
| timechart span=1h eval(round(sum(b)/1024/1024,3)) as MB by h

Re: How to write a search to calculate how much data was indexed every hour for each host?

Champion

If you really need it in that format and you only care about hourly buckets, then this could work:

index=_internal source=*license_usage.log* type="Usage" idx=windows
| fields h b date_hour
| eval h=if(len(h)=0 OR isnull(h),"(SQUASHED)",h)
| eval date_hour = if(date_hour < 10, "0".date_hour.":00:00",date_hour.":00:00")
| chart eval(round(sum(b)/1024/1024,3)) limit=24 over h by date_hour
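If you'd rather skip the zero-padding eval, the hour label can also be derived directly from _time with strftime — an equivalent sketch of the same search:

index=_internal source=*license_usage.log* type="Usage" idx=windows
| eval h=if(len(h)=0 OR isnull(h),"(SQUASHED)",h)
| eval hour=strftime(_time,"%H:00:00")
| chart eval(round(sum(b)/1024/1024,3)) limit=24 over h by hour

Note that date_hour (like strftime "%H" here) buckets by hour of day, so events from different days that share an hour land in the same column.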