Dashboards & Visualizations

Splunk Average Daily Ingest by Month Excluding Weekends

scout29
Path Finder

I am looking to create a chart showing the average daily total ingest by month in Terabytes excluding weekends over the past year. For some reason I am struggling with this. Any help getting me started would be appreciated.


scout29
Path Finder

So I am using the search below, run over the past 365 days, but it is only returning the last month. How do I get the average monthly ingest (excluding weekends) for each month over the past year?

index=_internal source=*license_usage.log* type="RolloverSummary" splunk_server=* NOT date_wday IN ("saturday", "sunday")
| stats avg(b) AS bytes BY date_month
| eval TB=(bytes/1024/1024/1024/1024)

gcusello
SplunkTrust

Hi @scout29 ,

just to complete the solution from @PickleRick:

index=_internal [`set_local_host`] source=*license_usage.log* type="RolloverSummary" NOT date_wday IN ("saturday", "sunday")
| stats avg(b) AS bytes BY date_month
| eval TB= bytes/1024/1024/1024/1024

If you also want to exclude holidays from your average, you have to create a lookup containing the holiday dates to exclude.
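A minimal sketch of that lookup approach, assuming a hypothetical holidays.csv lookup file with a single date field in YYYY-MM-DD format:

index=_internal source=*license_usage.log* type="RolloverSummary" NOT date_wday IN ("saturday", "sunday")
| eval date=strftime(_time-43200, "%Y-%m-%d")
| lookup holidays.csv date OUTPUT date AS holiday
| where isnull(holiday)
| stats avg(b) AS bytes BY date_month
| eval TB=bytes/1024/1024/1024/1024

Events whose date matches a row in the lookup get the holiday field populated and are dropped by the where clause.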

Ciao.

Giuseppe



PickleRick
SplunkTrust

Check your _internal index retention policy. If your index rolls to frozen in 30 days you simply don't have the events for a longer timespan.
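One quick way to check how far back _internal actually goes is a tstats sketch like this (run over All Time):

| tstats min(_time) AS earliest WHERE index=_internal
| eval earliest=strftime(earliest, "%Y-%m-%d")

If the earliest event is only about a month old, the retention policy is the problem, not the search.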


PickleRick
SplunkTrust

I don't like to use the default date_* time-related fields.

1. They are not guaranteed to be present.

2. Quoting the docs:

Note: Only events that have timestamp information in them as generated by their respective systems will have date_* fields. If an event has a date_* field, it represents the value of time/date directly from the event itself. If you have specified any timezone conversions or changed the value of the time/date at indexing or input time (for example, by setting the timestamp to be the time at index or input time), these fields will not represent that.
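For that reason, a sketch of deriving the weekday from _time itself (which does reflect the indexed timestamp and your timezone settings) rather than relying on date_wday:

... | eval wday=strftime(_time, "%A")
| where NOT wday IN ("Saturday", "Sunday")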


PickleRick
SplunkTrust

Please define what you mean by "average daily ingest excluding weekends". Do you mean to sum only the values from Monday to Friday and divide by 5 days per week, or to sum the values from all 7 days and divide by 5, or maybe to sum the values from 5 days and divide by 7? (Extrapolated to your whole search timerange, of course, but I mean: how do you want to treat those weekends?)


scout29
Path Finder

I'm looking for the daily average for each month, excluding weekends altogether. So for example, for September: what was the average daily ingest across all days Monday through Friday?


PickleRick
SplunkTrust

OK. So if your June had 23 working days, you want only the sum of the license usage during those 23 days, divided by 23, right? You simply count as if the week were 5 days long and completely ignore the existence of Saturdays and Sundays?

When you open the licensing report in Search, you see something like this:

index=_internal [`set_local_host`] source=*license_usage.log* type="RolloverSummary"

I suppose if you have a distributed environment you might not have the localhost part but some other form of choosing indexers.

Anyway, since the rollover is done right after midnight to calculate the summarized amount of license used for the previous day, the search behind the report subtracts half a day (43200 seconds) from the _time field and then bins over _time.

And that's pretty much it - the b field contains the sum of bytes indexed.

Now you only have to filter out the Saturdays/Sundays (possibly with strftime), do a stats avg, and Robert is your father's brother.
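Putting those steps together, a sketch of the full search (the earliest modifier and the field names after the stats commands are my own additions, not from the built-in report):

index=_internal [`set_local_host`] source=*license_usage.log* type="RolloverSummary" earliest=-12mon@mon
| eval _time=_time-43200
| bin _time span=1d
| eval wday=strftime(_time, "%a"), month=strftime(_time, "%Y-%m")
| where NOT wday IN ("Sat", "Sun")
| stats sum(b) AS daily_bytes BY _time month
| stats avg(daily_bytes) AS avg_daily_bytes BY month
| eval TB=round(avg_daily_bytes/1024/1024/1024/1024, 3)

Deriving the month key from _time with strftime (rather than grouping by date_month) keeps months sorted chronologically and distinguishes the same month across different years.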
