Getting Data In

How to forward the entire contents of a CSV file daily even if it's unchanged?

avansteen
New Member

Hello,
I'm attempting to forward a set of .csv files for administrator group auditing. However it only forwards, or at least the search only returns changes to the .csv file. For audit reasons, I need the entire contents of the .csv to ingest and not just the changes.

Is there a way to force the forwarder to ignore the fact it already gathered the data?

thanks,

0 Karma
1 Solution

marycordova
SplunkTrust
SplunkTrust

@avansteen

Maybe you could use [batch://<path>] to import and delete the .csv every time it is read, so that the file is indexed every time a new one is created, regardless of the filename. You would just want to set up a job that writes out the complete file however often you need it: daily, hourly, etc.
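The "job that writes out the complete file" could be sketched like this. This is a minimal Python sketch, not a tested solution: get_admin_groups(), BATCH_DIR, and the column layout are all placeholders for your real environment.

```python
#!/usr/bin/env python3
"""Sketch of a scheduled job that dumps the complete admin-group list to a
fresh, uniquely named .csv in the batch input directory on every run."""
import csv
import datetime
import pathlib

# Directory watched by the [batch://...] input (placeholder path).
BATCH_DIR = pathlib.Path("/opt/audit/batch")


def get_admin_groups():
    # Stand-in for however you actually enumerate admin group membership.
    return [("Administrators", "alice"), ("Administrators", "bob")]


def write_snapshot(rows, out_dir=BATCH_DIR):
    """Write the full membership list to a timestamped .csv and return its path."""
    out_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    path = out_dir / f"admin_groups_{stamp}.csv"
    with path.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["group", "member"])  # header row
        writer.writerows(rows)
    return path
```

Run it from cron (or Task Scheduler) at whatever interval your audit needs; each run drops a new file that the batch input ingests in full and then deletes.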

I'm not 100% positive, but I think it would work and is worth testing. The one issue might be if something in the fishbucket remembers the file and doesn't read the new one.

https://docs.splunk.com/Documentation/Splunk/7.1.2/Admin/Inputsconf#BATCH_.28.22Upload_a_file.22_in_...
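Per that doc, the stanza might look roughly like this. The path, sourcetype, and index here are placeholders; move_policy = sinkhole is the setting the spec calls out as required for batch inputs.

```ini
# inputs.conf on the universal forwarder (placeholder values)
[batch:///opt/audit/batch]
move_policy = sinkhole
disabled = false
sourcetype = admin_group_audit
index = audit
```

With sinkhole, Splunk indexes each file it finds in that directory and then deletes it, so every freshly written .csv is read in full regardless of whether its contents changed.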

Another option... if this is a universal forwarder dedicated to ingesting only this .csv file, you could brute-force it by creating a script and a cron job to blow away the fishbucket... but this would be dangerous if you are collecting any other data with that universal forwarder.
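If you did go the brute-force route, the cron job could look something like this sketch. The fishbucket path assumes a default universal forwarder install; as noted above, this wipes the forwarder's memory of everything it has read, so test it on a non-production forwarder first.

```shell
#!/bin/sh
# Sketch: clear the UF fishbucket so all monitored files are re-read in full.
# Pass the forwarder's SPLUNK_HOME, e.g. /opt/splunkforwarder.
clear_fishbucket() {
    splunk_home="$1"
    "$splunk_home/bin/splunk" stop
    # The fishbucket records which files (and how much of them) were read.
    rm -rf "$splunk_home/var/lib/splunk/fishbucket"
    "$splunk_home/bin/splunk" start
}
```

A crontab entry would then call it on your audit schedule, e.g. `0 1 * * * /opt/scripts/clear_fishbucket.sh` (hypothetical path).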

@marycordova

View solution in original post

0 Karma


avansteen
New Member

Worked great once I read the instructions!

move_policy = sinkhole
* IMPORTANT: This setting is required. You must include "move_policy = sinkhole" when you define batch inputs.

Thank you,

0 Karma

marycordova
SplunkTrust
SplunkTrust

awesome 😄

@marycordova
0 Karma