Getting Data In

How do I forward the entire contents of a CSV file even if it's unchanged daily?

avansteen
New Member

Hello,
I'm attempting to forward a set of .csv files for administrator group auditing. However it only forwards, or at least the search only returns changes to the .csv file. For audit reasons, I need the entire contents of the .csv to ingest and not just the changes.

Is there a way to force the forwarder to ignore the fact it already gathered the data?

thanks,

1 Solution

marycordova
SplunkTrust

@avansteen

Maybe you could use [batch://<path>] to import and delete the .csv every time it is read, so that the file is indexed every time a new one is created, regardless of the filename. You would just set up a job that writes out the complete file however often you need it: daily, hourly, etc.

I'm not 100% positive, but I think it would work and is worth testing. The one issue might be that something in the fishbucket remembers the file and doesn't read the new one.

https://docs.splunk.com/Documentation/Splunk/7.1.2/Admin/Inputsconf#BATCH_.28.22Upload_a_file.22_in_...
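Not from the thread itself, but a minimal sketch of what such a batch stanza could look like in inputs.conf, per the docs linked above. The monitored path, index, and sourcetype here are placeholders you'd replace with your own:

```
# Hedged sketch of a batch input in inputs.conf.
# /opt/audit_drop is an assumed path where the scheduled job
# writes the full CSV export each run.
[batch:///opt/audit_drop/*.csv]
# Required for batch inputs -- Splunk deletes the file after indexing it.
move_policy = sinkhole
index = main
sourcetype = csv
disabled = 0
```

Because move_policy = sinkhole deletes the file after ingestion, the next scheduled export is always seen as a new file and indexed in full.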

Another option: if this is a universal forwarder dedicated to ingesting only this .csv file, you could brute-force it with a script and a cron job that blast out the fishbucket... but this would be dangerous if you are collecting any other data with that universal forwarder.
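For reference, that brute-force approach might look roughly like the sketch below. SPLUNK_HOME, the script path, and the schedule are assumptions; note that wiping the fishbucket forces the forwarder to re-read everything it monitors, not just this one file:

```
#!/bin/sh
# Sketch only: stop the forwarder, wipe the fishbucket, restart.
# SPLUNK_HOME is an assumed install path -- adjust for your host.
SPLUNK_HOME=/opt/splunkforwarder
"$SPLUNK_HOME/bin/splunk" stop
rm -rf "$SPLUNK_HOME/var/lib/splunk/fishbucket"
"$SPLUNK_HOME/bin/splunk" start
```

Scheduled via a crontab entry such as `0 0 * * * /usr/local/bin/reset_fishbucket.sh` (a hypothetical name) to run nightly before the CSV export lands.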

@marycordova



avansteen
New Member

Worked great once I read the instructions!

move_policy = sinkhole
* IMPORTANT: This setting is required. You must include "move_policy = sinkhole" when you define batch inputs.

Thank you,


marycordova
SplunkTrust

awesome 😄

@marycordova