Throughput calculation over the last "n" days

snabi
Explorer

I have the following query, which calculates and charts hourly file-conversion throughput over the last 24 hours. However, I am not able to extend the range to "n" days. When I try (e.g. -7d to now), it still charts only 24 spikes covering all 7 days; it seems the query adds up the same hour from all seven days.

What am I doing wrong?

sourcetype="sourceA" "Conversion completed" AND source="/home/services.log" | rex field=_raw "Original file size: (?<Original_size>.\d+)" | rex field=_raw "Time spent: (?<time_spent>.\d+)" | stats sum(Original_size) as size1 sum(time_spent) as time1 by date_hour | eval time1=(time1)/1000 | eval conversion_rate=size1/(time1*1024*1024) | chart sum(conversion_rate) as "(MBytes_per_sec)" by date_hour
1 Solution

snabi
Explorer

sourcetype="X" "Conversion completed" AND source="X.log" | rex field=_raw "Original file size: (?.\d+)" | rex field=_raw "Time spent: (?.\d+)" | stats sum(Original_size) as size1 sum(time_spent) as time1 by date_hour | eval time1=(time1)/1000 | eval conversion_rate=size1/(time1*1024*1024) | chart sum(conversion_rate) as "(MBytes_per_sec)" by date_hour

This one worked for me.
Thanks for all the support!

grijhwani
Motivator

Here is a search I constructed to perform something similar (specifically, comparing licence usage per hour over the last N days):

index="_internal" source="*license_usage.log" | eval ISODate=strftime(strptime(date_year."-".date_month."-".date_mday, "%Y-%b-%d"), "%Y-%m-%d (%a)") | eval MB=b/1024/1024 | chart eval(round(sum(MB),0)) over date_hour by ISODate limit=0  | addcoltotals labelfield=date_hour

The reconstruction of the date is a little kludgy and could probably be improved dramatically using convert, but I threw this together in my early Splunking days.
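Rather than convert, one option is to format _time directly with strftime, since _time already carries the event timestamp; that drops the string reconstruction entirely (an untested sketch of the idea):

index="_internal" source="*license_usage.log"
| eval ISODate=strftime(_time, "%Y-%m-%d (%a)")
| eval MB=b/1024/1024
| chart eval(round(sum(MB),0)) over date_hour by ISODate limit=0
| addcoltotals labelfield=date_hour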

kristian_kolb
Ultra Champion

There are only 24 possible values for date_hour, so a search spanning more than 24 hours will automatically group statistics from different days into the same bin, so to speak. Perhaps you could replace your stats command with timechart span=1h sum(Original_size) as size1 sum(time_spent) as time1?

That will get you n x 24 slots for your statistics.
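
For example, your original search could be reworked along these lines (an untested sketch; the rex extractions are unchanged, and the final eval writes the rate straight into the output column):

sourcetype="sourceA" "Conversion completed" AND source="/home/services.log"
| rex field=_raw "Original file size: (?<Original_size>.\d+)"
| rex field=_raw "Time spent: (?<time_spent>.\d+)"
| timechart span=1h sum(Original_size) as size1 sum(time_spent) as time1
| eval time1=time1/1000
| eval "(MBytes_per_sec)"=size1/(time1*1024*1024)
| fields _time "(MBytes_per_sec)"

Because timechart bins by _time rather than date_hour, each row is one hour of one specific day instead of the same hour summed across days.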
