On this particular installation I don't care about historical data, so I set maxTotalDataSizeMB to 500MB.
However, I still get hit by the "Daily indexing volume limit exceeded" error when a lot of junk log data comes into the indexer.
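For reference, the relevant indexes.conf stanza looks roughly like this (a sketch assuming the default main index; adjust the stanza name to whichever index you actually use):

```
# indexes.conf -- cap the total retained size of the index at 500 MB
[main]
maxTotalDataSizeMB = 500
```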
Because Splunk will drop old events from the bucket and fill it with new ones, hundreds of gigabytes of data can pass through a 500MB index. maxTotalDataSizeMB caps how much data is retained, not how much is indexed, so it does nothing for the daily volume limit. You need to restrict what you are sending to the indexer, or use the nullQueue to drop events before they hit the index.
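Here is a minimal nullQueue sketch for the indexer (or a heavy forwarder); the source path, stanza name, and regex are hypothetical, so match them to your own noisy inputs:

```
# props.conf -- route events from the noisy source through the filter
[source::/var/log/noisy_app.log]
TRANSFORMS-null = setnull

# transforms.conf -- send matching events to the nullQueue (discarded,
# so they never count against the daily indexing volume)
[setnull]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

To drop everything from that source rather than just DEBUG lines, set REGEX = . so it matches every event.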
Oh, and just to be clear, I assume you are using a free/trial license? If so, that is limited to indexing only 500MB a day, so as explained above you are exceeding this, which is causing the warnings. If you exceed it three or more times within a rolling 30-day period on a free or trial license, you will lose the ability to search for 30 days.
It's quite easy to do, and if you use the Universal Forwarder on your remote systems you can do some more filtering before sending data on (see the sketch below). 🙂 Also, if this has helped, then consider clicking the tick below the up and down arrows to the left of my answer; it will help others with similar problems in the future.
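On the Universal Forwarder side, one common approach is to narrow the monitored inputs with a blacklist in inputs.conf so noisy files never leave the host at all. A sketch, with hypothetical paths and pattern:

```
# inputs.conf on the Universal Forwarder -- only monitor what you need
[monitor:///var/log/myapp]
# blacklist is a regex matched against file paths, not event content
blacklist = (debug|trace)\.log$
index = main
```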
Thanks for the reply. Yup, this is under the free license. I was trying to avoid going over the indexing limit by limiting the index size itself, but from your response it looks like I'll have to find ways to limit the data input instead.