Hi All,
I have multiple CSV files on the local machine, all under the same directory, and I would like to ingest and index them.
Each file contains a set of fields (column headers).
Can someone guide me on the best way to do this?
Thanks in advance.
Create the index ahead of time (Settings -> Indexes -> New Index).
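If you prefer to define the index in configuration rather than through Splunk Web, a minimal indexes.conf sketch on the indexer would look something like the following. The index name betats matches the inputs.conf stanza below; the paths are the usual defaults and may need adapting:
[betats]
homePath = $SPLUNK_DB/betats/db
coldPath = $SPLUNK_DB/betats/colddb
thawedPath = $SPLUNK_DB/betats/thaweddb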
Then, in your inputs.conf file on the universal forwarder, put in something like this:
[batch:///var/nfs/SAT_SplunkLogs/ts/beta/*.csv]
move_policy = sinkhole
host_segment = 5
sourcetype = csv
index = betats
Then restart Splunk on the universal forwarder.
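From the command line, assuming a default install location, the restart would typically be:
$SPLUNK_HOME/bin/splunk restart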
FYI: you can substitute monitor for batch. Be aware that move_policy = sinkhole will delete each CSV file after it is indexed.
@dbcase: Is this for dynamic CSV files?
For dynamic files (meaning the file gets appended to), I'd use something like this:
[monitor:///var/nfs/SAT_SplunkLogs/version/*.csv]
crcSalt = defprof
sourcetype = csv
index = allmsos
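A side note on crcSalt: Splunk normally identifies a file by a CRC of its first bytes, so CSV files that share the same header line can be mistaken for already-indexed files and skipped. Adding a crcSalt string (here defprof, an arbitrary value) changes that CRC. A common alternative is the special value <SOURCE>, which salts the CRC with each file's full path:
crcSalt = <SOURCE>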
If the CSV file gets created, then indexed, and then a new file gets created, I'd use the batch method.
The batch method with the move_policy = sinkhole parameter will index the CSV file and then delete it, so a new CSV file can be written in its place.
If an answer to your question has solved your issue, please accept the answer.
Maybe monitoring the folder? Are the files dynamic or static?
@felipesewaybricker: Thanks for responding to my query! Please cover both cases, i.e., when the CSV files are 1. static and 2. dynamic.
Nice, you can monitor the folder, send everything to the same index, and then narrow your searches as follows: index=nnn source=file.csv
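For example, a search restricted to one of the files might look like this (the index name nnn and the file path are placeholders to adapt):
index=nnn source="/var/nfs/SAT_SplunkLogs/ts/beta/myfile.csv" | stats count by source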
@felipesewaybricker: Just for clarification: are you saying we need to use the "Monitor" option instead of the "Upload" option when adding the files? If yes, how will I create the index for that?