Getting Data In

Setting a maximum limit on a monitored file

genemats
Engager

We just had an application bug that spewed millions of duplicate messages into a Splunk-monitored logfile. This triggered a license violation before anyone could do anything about it.

To prevent this type of violation from happening again, we are considering putting a hard limit on the maximum logfile size Splunk can index. Most of our logfiles are normally under approximately 300 MB daily, so if a logfile grows beyond 500 MB, we want Splunk to ignore anything else logged to the file after that point.

Is it possible to put such a hard byte limit on a monitored logfile or input source?

I've seen some answers suggesting a maxKBps data-transfer-per-second limit in limits.conf (see the example below). That's one option, but is there perhaps some other option in Splunk that could be used as well?
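For reference, the suggestion I've seen looks roughly like this in limits.conf (the value here is just an illustration):

    [thruput]
    # throttle indexing throughput to ~256 KB per second (0 = unlimited)
    maxKBps = 256

As far as I can tell, though, that only throttles overall indexing throughput for the instance; it doesn't put a hard cap on the total bytes read from any single monitored file, which is what we're really after.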


tony_luu
Path Finder

Same question here.


krish3
Contributor

Hi,

I have the same situation. I need to know: is there an option for setting a limit on an individual source or sourcetype?

Thanks.


svarun
New Member

Any updates here? I am facing the same issue.
