I have multiple CSV files on the local machine under the same directory. I would like to add these files to Splunk and index them.
Can someone guide me on the best way to do this?
Thanks in advance.
Create the index ahead of time (Settings -> Indexes -> New Index).
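If you'd rather define the index in configuration files on the indexer instead of through the UI, a minimal indexes.conf stanza could look like the sketch below. This is an illustration only; the index name betats comes from the inputs.conf example later in this answer, and the paths shown are the defaults under $SPLUNK_DB, so adjust them to your environment:

```ini
# indexes.conf (on the indexer) -- minimal example stanza
[betats]
homePath   = $SPLUNK_DB/betats/db
coldPath   = $SPLUNK_DB/betats/colddb
thawedPath = $SPLUNK_DB/betats/thaweddb
```

After adding the stanza, restart Splunk on the indexer so the new index is created.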
Then, in your inputs.conf file on the universal forwarder, put in something like this:
[batch:///var/nfs/SAT_SplunkLogs/ts/beta/*.csv]
move_policy = sinkhole
host_segment = 5
sourcetype = csv
index = betats
Then restart Splunk on the universal forwarder.
FYI: you can substitute monitor for batch. Be aware that move_policy = sinkhole will delete the CSV file after it is indexed.
For a dynamic file (meaning the file gets appended to over time), I'd use something like this:
[monitor:///var/nfs/SAT_SplunkLogs/version/*.csv]
crcSalt = defprof
sourcetype = csv
index = allmsos
If the CSV file gets created, then indexed, and then a new file gets created in its place, I'd use the batch method. The batch method with the move_policy = sinkhole parameter will index the CSV file and then delete it, so a new CSV file can be written.
@felipesewaybricker: Just for clarification, are you saying we need to use the "Monitor" option instead of the "Upload" option when adding files? If yes, then how will I create the index for that?