Getting Data In

Setting a maximum limit on a monitored file

genemats
Engager

We just had an application bug that spewed millions of duplicate messages into a Splunk-monitored logfile. This triggered a license violation before anyone could do anything about it.

To prevent this type of violation from happening, we are considering putting a hard limit on the maximum logfile size Splunk will index. Most of our logfiles are normally under about 300MB daily, so if a logfile grows beyond 500MB, we want to ignore anything that is logged to the file after that point.

Is it possible to put such a hard byte limit on a monitored logfile or input source?

I've seen some answers that suggested setting a maxKBps data-transfer-per-second limit (in limits.conf). That is one option, but is there perhaps some other option in Splunk that could be used as well?
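For reference, the maxKBps throttle lives in limits.conf on the forwarder. A minimal sketch of what I had in mind (the 256 KB/s value here is just an illustration, not a recommendation):

[thruput]
# Cap the forwarder's total indexing throughput at 256 KB per second
maxKBps = 256

As far as I understand, though, this only slows down ingestion across all inputs on that forwarder; it doesn't put a byte cap on an individual monitored file, which is why I'm asking whether another option exists.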


tony_luu
Path Finder

Same question here.


krish3
Contributor

Hi,

I have the same situation. I need to know: is there an option for setting a limit on an individual source or sourcetype?

Thanks.


svarun
New Member

Any updates here? I am facing the same issue.
