I have the following query, which calculates and charts hourly file-conversion throughput over the last 24 hours. However, I am not able to range it over "n" days. When I attempt to run it over -7d to now, it still shows only 24 spikes covering all 7 days; it seems the query adds up the target hour from all seven days.
What am I doing wrong?
sourcetype="sourceA" "Conversion completed" AND source="/home/services.log"
| rex field=_raw "Original file size: (?<Original_size>.\d+)"
| rex field=_raw "Time spent: (?<time_spent>.\d+)"
| stats sum(Original_size) as size1 sum(time_spent) as time1 by date_hour
| eval time1=(time1)/1000
| eval conversion_rate=size1/(time1*1024*1024)
| chart sum(conversion_rate) as "(MBytes_per_sec)" by date_hour
sourcetype="X" "Conversion completed" AND source="X.log" | rex field=_raw "Original file size: (?
This one worked for me.
Thanks for all the support!
Here is a search I constructed to perform something similar (specifically, comparing licence usage per hour over the last N days):
index="_internal" source="*license_usage.log" | eval ISODate=strftime(strptime(date_year."-".date_month."-".date_mday, "%Y-%b-%d"), "%Y-%m-%d (%a)") | eval MB=b/1024/1024 | chart eval(round(sum(MB),0)) over date_hour by ISODate limit=0 | addcoltotals labelfield=date_hour
The reconstruction of the date is a little kludgy and could probably be improved dramatically using convert, but I threw this together in my early Splunking days.
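One way that reconstruction could be simplified (an assumption about intent, not part of the original answer) is to format the event's epoch timestamp directly with strftime, since every event already carries `_time`:

```
index="_internal" source="*license_usage.log"
| eval ISODate=strftime(_time, "%Y-%m-%d (%a)")
| eval MB=b/1024/1024
| chart eval(round(sum(MB),0)) over date_hour by ISODate limit=0
| addcoltotals labelfield=date_hour
```

This avoids stitching date_year, date_month, and date_mday back together, and sidesteps any strptime format mismatches.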
There are only 24 possible values for date_hour, so spanning more than 24 hours will automatically group statistics from different days into the same bin, so to speak. Perhaps you could replace your stats command with timechart span=1h sum(Original_size) as size1 sum(time_spent) as time1 instead?
That will get you n x 24 slots for your statistics.
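Putting that suggestion together with the search from the question, a sketch of the full query might look like this (assuming the same sourcetype, source, and rex extractions as in the original; untested):

```
sourcetype="sourceA" "Conversion completed" AND source="/home/services.log"
| rex field=_raw "Original file size: (?<Original_size>.\d+)"
| rex field=_raw "Time spent: (?<time_spent>.\d+)"
| timechart span=1h sum(Original_size) as size1 sum(time_spent) as time1
| eval time1=time1/1000
| eval conversion_rate=size1/(time1*1024*1024)
| fields _time conversion_rate
```

Because timechart buckets by `_time` rather than by date_hour, running this over -7d to now should produce one data point per hour per day instead of collapsing all seven days into the same 24 slots.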