Getting Data In

Setting a maximum limit on a monitored file

genemats
Engager

We just had an application bug that spewed millions of duplicate messages into a Splunk-monitored logfile. This triggered a license violation before anyone could do anything about it.

To prevent this type of violation from happening again, we are considering putting a hard limit on the maximum logfile size Splunk will index. Most of our logfiles are normally under about 300 MB daily. So if a logfile exceeds 500 MB, we want to ignore anything logged to that file beyond the 500 MB mark.

Is it possible to put such a hard byte limit on a monitored logfile or input source?
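As far as I know there is no per-file byte cap on a monitor input, but one documented alternative that addresses the duplicate-spew scenario is routing the offending events to nullQueue at parse time, so they are dropped before they count against the license. A sketch, assuming a hypothetical file path and duplicate-message pattern (adjust both to your environment):

```ini
# props.conf (on the indexer or heavy forwarder that parses the data)
[source::/var/log/myapp/app.log]
TRANSFORMS-dropdupes = drop_duplicate_msgs

# transforms.conf
[drop_duplicate_msgs]
# REGEX is a placeholder -- match whatever the runaway message looks like
REGEX = duplicate widget error
DEST_KEY = queue
FORMAT = nullQueue
```

This does not cap file size, but events sent to nullQueue are discarded before indexing, so a runaway application bug like the one described would not consume license volume.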

I've seen some answers suggesting a maxKBps data-transfer-per-second limit (in limits.conf). That is one option, but perhaps some other Splunk setting could be used as well?
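For reference, the maxKBps throttle mentioned above lives in the [thruput] stanza of limits.conf on the forwarder. Note that it caps the forwarder's overall output rate, not the size of any individual file, so it slows a runaway file down rather than cutting it off:

```ini
# limits.conf on the universal/heavy forwarder
[thruput]
# Cap forwarder output at ~256 KB per second (0 = unlimited).
# This is a global throttle, not a per-file or per-source limit.
maxKBps = 256
```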


tony_luu
Path Finder

Same question here.


krish3
Contributor

Hi,

I have the same situation. I need to know: is there an option for setting a limit on an individual source or sourcetype?

thanks.


svarun
New Member

Any updates here? I am facing the same issue.
