Getting Data In

How to index old logs in Splunk

Communicator

Hi everyone,
I set up a forwarder that started indexing log data this morning, but it didn't send any of the prior logs. I would like to send ALL the logs to Splunk, from when syslog started collecting them until now, without duplicating any data.
I think this happened because I was already indexing these logs before, but then I deleted the old index and created a new one.
How can I accomplish this?

Thank you

0 Karma

Splunk remembers which files it has already read in order to avoid duplicate data. If you want to reread log files, you basically have two options:

1) Clear the fishbucket of the forwarder. This makes Splunk forget about all the log files and reread them. If you choose this, you need to delete the corresponding indexes on your indexer to avoid duplicates. Be aware that this does not apply to just one input; it applies to all inputs of the forwarder.
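A minimal sketch of option 1, assuming a default `$SPLUNK_HOME` installation on the forwarder (the index name `my_syslog_index` is a placeholder for your own index):

```shell
# On the forwarder: stop Splunk, then clear the fishbucket so all
# monitored files are treated as new and reread from the beginning.
$SPLUNK_HOME/bin/splunk stop
$SPLUNK_HOME/bin/splunk clean eventdata -index _thefishbucket
$SPLUNK_HOME/bin/splunk start

# On the indexer: remove the already-indexed events so the reread
# data does not create duplicates. This deletes ALL events in the
# index, so be sure you really want a full reindex.
$SPLUNK_HOME/bin/splunk stop
$SPLUNK_HOME/bin/splunk clean eventdata -index my_syslog_index
$SPLUNK_HOME/bin/splunk start
```

On older forwarder versions you may instead need to delete the fishbucket directory (`$SPLUNK_HOME/var/lib/splunk/fishbucket`) while Splunk is stopped; the effect is the same.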

2) Add a crcSalt to your input. This changes the hash that Splunk uses to remember a log file, because the salt is appended before hashing, so the file no longer matches its previous record. Search here for the term "crcSalt" for further information. Of course, this also leads to duplicates if you do not clear the corresponding indexes on your indexer.
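For option 2, a minimal `inputs.conf` sketch on the forwarder (the monitor path is an example; `crcSalt = <SOURCE>` is the documented setting that mixes the full source path into the CRC, so previously seen files hash differently and are reread):

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf (or your app's local dir)
[monitor:///var/log/syslog]
index = my_syslog_index
sourcetype = syslog
# Append the source path to the data used for the file's CRC,
# forcing Splunk to treat the file as unseen and reindex it.
crcSalt = <SOURCE>
```

Restart the forwarder after the change. Remember to clean the target index on the indexer first, or the reread events will be duplicated.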

Greetings

Tom

Influencer

You can selectively tell the forwarder to forget individual files (CRCs) as well using btprobe as mentioned in this other answer: http://answers.splunk.com/answers/54147/how-can-i-trigger-the-re-indexing-of-a-single-file.html
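A hedged sketch of that per-file approach with btprobe, run on the forwarder while Splunk is stopped (the file path is an example; `splunk_private_db` is the default fishbucket database location):

```shell
# Reset the fishbucket record for a single file so only that file
# is reindexed, leaving all other inputs' progress intact.
$SPLUNK_HOME/bin/splunk stop
$SPLUNK_HOME/bin/splunk cmd btprobe \
    -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db \
    --file /var/log/syslog --reset
$SPLUNK_HOME/bin/splunk start
```

You still need to delete that file's already-indexed events on the indexer if you want to avoid duplicates.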

0 Karma

Communicator

I tried removing the whole fishbucket and cleaning four indexes on the indexer.
After restarting the forwarder, two of the indexes were correctly reindexed, but the other two were not.
I completely cleaned everything from the fishbucket folder on my forwarder, so why is it not indexing the old data?

0 Karma