S3 Bucket Logging

I have a bunch of web servers that are currently streaming their logs in real time to an S3 bucket.

I have the Splunk Add-on for AWS installed to collect those logs, but I have found that it duplicates the data, and given the volume coming in, this would exhaust our license very quickly. The owners of these servers will not allow us to install a universal forwarder (UF) on them, and setting up a heavy forwarder (HF) is also a no-go.
Is there another way to ingest those logs without duplication?
I have seen there might be a way with a custom props.conf file, but I am not sure that is the right approach.
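For context, one alternative I came across is switching from the add-on's generic S3 input (which can re-read objects that are still being appended to) to its SQS-based S3 input, which ingests each object once when an S3 event notification arrives. A rough sketch of what the inputs.conf stanza might look like is below; the stanza and parameter names are taken from my reading of the add-on's conventions and may not match your version exactly, so please verify against the docs:

```ini
# inputs.conf sketch (untested) -- SQS-based S3 input from the Splunk Add-on for AWS.
# Assumes an SQS queue ("my-web-logs-queue" here is a placeholder) subscribed to
# S3 event notifications for the bucket, so each log object is picked up exactly once.
[aws_sqs_based_s3://web_server_logs]
aws_account = my_aws_account        # placeholder: account configured in the add-on
sqs_queue_region = us-east-1        # placeholder region
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/my-web-logs-queue
sourcetype = aws:s3:accesslogs      # placeholder sourcetype
index = web_logs                    # placeholder index
```

If the duplication is instead coming from the generic S3 input re-reading growing files, this design sidesteps it because objects are only fetched after the notification fires, i.e. once they are complete.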

Any ideas would be welcome.


