Getting Data In

Setting a maximum limit on a monitored file

genemats
Engager

We just had an application bug that spewed millions of duplicate messages into a Splunk-monitored logfile. This triggered a license violation before anyone could do anything about it.

To prevent this type of violation from recurring, we are considering putting a hard limit on the maximum logfile size Splunk will index. Most of our logfiles are normally under roughly 300 MB per day, so if a logfile grows beyond 500 MB, we want to ignore anything else that is logged to the file after that point.

Is it possible to put such a hard byte limit on a monitored logfile or input source?

I've seen some answers that suggested putting a maxKBps data-transfer-per-second limit in limits.conf, which is one option, but perhaps some other option in Splunk could be used as well?
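For reference, the maxKBps throttle mentioned above is set in limits.conf on the forwarder. A minimal sketch (the value shown is illustrative, not a recommendation; note this caps throughput, not total file size):

```ini
# limits.conf on the forwarder
[thruput]
# Cap forwarding at 256 KB per second; the default of 0 means unlimited.
maxKBps = 256
```

This slows down how fast a runaway file is indexed, which buys reaction time, but the data still counts against the license once it is eventually indexed.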

Tags (4)

tony_luu
Path Finder

Same question here.

0 Karma

krish3
Contributor

Hi,

I have the same situation. Is there an option for setting a limit on an individual source or sourcetype?

Thanks.

0 Karma

svarun
New Member

Any updates here? I am facing the same issue.

0 Karma