Splunk Search

How can I find which monitored log files caused me to exceed my Splunk Light 500 MB license limit?

actanzhang
Explorer

I am using Splunk Light, which has a 500 MB/day indexing license limit. I have 5 universal forwarders, all on Windows, plus 2 local directories on the Linux machine that runs the Splunk Light server itself.
I have triple-checked that my daily log files are no bigger than 250 MB. In fact, over the last month the logs from all 6 machines total less than 2 GB, yet last week I exceeded the 500 MB daily quota 5 times and now I can't search. It looks like I have to wait more than 23 days for the restriction to be lifted.

I can only search the _internal index now, which contains Splunk's own logs. Is there a search I can run against it to find out when, and from which files, the license violations occurred?

I am very puzzled as to how I could exceed the limit so many times last week.

Thanks

1 Solution

martin_mueller
SplunkTrust

In Splunk Enterprise you can run this search to get the usage per host-source combination:

index=_internal source=*license_usage* type=Usage | stats sum(b) as bytes by h s | sort - bytes

I guess this log should exist in Splunk Light as well.
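For readability, the same search can convert the byte counts to megabytes before sorting. This is a sketch assuming the same license_usage.log fields (b, h, s); eval and round are standard SPL commands:

index=_internal source=*license_usage* type=Usage | stats sum(b) as bytes by h s | eval MB=round(bytes/1024/1024,2) | sort - bytes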

martin_mueller
SplunkTrust

This sums up the bytes per source path over the specified time range. As a result, rotated instances of the same source path are added together, and days are added together as well. You can either restrict the time range to single days or change the search like this:

index=_internal source=*license_usage* type=Usage | bin span=1d _time | stats sum(b) as bytes by _time h s
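To check each day directly against the 500 MB quota, the per-day totals can also be rolled up across all hosts and sources. Again a sketch assuming the same license_usage.log fields; bin, eval, and where are standard SPL commands:

index=_internal source=*license_usage* type=Usage | bin span=1d _time | stats sum(b) as bytes by _time | eval MB=round(bytes/1024/1024,2) | where MB > 500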

actanzhang
Explorer

Thanks Martin, you really helped me a lot!!


actanzhang
Explorer

Hi Martin,

Thanks! I got the result now. But can you explain a bit more what this query reports?
It seems to list the indexed log files ordered by size, which is indeed what I am looking for, but the reported sizes confuse me:
3027102191 bytes = 2.8 GB, 2636882398 bytes = 2.45 GB, 616384496 bytes = 587.8 MB.
I really don't have files that big; none of them is more than 15 MB. Why does Splunk report them as so large?

Also, from the license usage history, my overages were all around 1 GB of indexing; I have never seen a day where 2 GB was indexed.
Thanks
