Getting Data In

How to reindex data from a forwarder

mataharry
Communicator

I have a Storm project and I want to clean everything and reindex only the last few days, plus some specific files.
I have Splunk universal forwarders monitoring my files for now.

I suppose the process is similar for Splunk Enterprise, where we clear an index, and for Storm, where we manually empty a project.

1 Solution

yannK
Splunk Employee

First of all, even before reindexing, you can configure Splunk to index only recent data using these two techniques:

  • for file monitoring, add the parameter ignoreOlderThan in inputs.conf.
    It looks at the modtime of the files; for example, ignoreOlderThan=7d indexes only files modified during the last 7 days. On Linux you can combine this with the touch command to change a file's modtime and trigger indexing (see the example stanza after this list).
    see http://docs.splunk.com/Documentation/Splunk/latest/admin/Inputsconf

  • for WinEventLog inputs, set current_only=1 in inputs.conf to skip the historical logs and start collecting only from now on.
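
For illustration, here is a minimal inputs.conf sketch combining both settings; the monitored path, sourcetype, and channel are hypothetical placeholders to adapt to your environment:

    # file monitoring: skip files whose modtime is older than 7 days
    [monitor:///var/log/myapp]
    sourcetype = mysourcetype
    ignoreOlderThan = 7d

    # Windows event log input: collect only new events, not the historical backlog
    [WinEventLog://Application]
    current_only = 1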


Now that you have set up your inputs to avoid flooding your instance, you can focus on how to force a Splunk instance to reindex a file that has already been indexed.

  • the radical method is to clean the fishbucket index. That removes the memory of every file, but it will reindex everything:
  • on an indexer: splunk clean eventdata -index _fishbucket
  • on a forwarder: remove the folder $SPLUNK_HOME/var/lib/splunk/fishbucket

  • or selectively forget a single file in the fishbucket (see the sketch after this list):

    splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db --file $FILE --reset

  • manually reindex each file with the oneshot option;
    you can also edit the log file and add a comment on the first line, which will force the file to be detected as a new file.

    ./splunk add oneshot "/path/to/my/file.log" -sourcetype mysourcetype

  • modify the first line of the files to reindex; by default Splunk checks the first 256 characters of a file to differentiate files. If you add a simple comment on the first line, it will reindex the file.

  • change the crcSalt: create a new input for a new folder, add all the correct sourcetypes, etc.,
    using a static string that will force a one-time reindexing.

    crcSalt = REINDEXMEPLEASE

or add the option

crcSalt = <SOURCE>

then move or copy the files to be reindexed into that folder; they will be detected as new (because the path is included in the CRC calculation). (Note that the source field will be different, of course.)

see http://docs.splunk.com/Documentation/Splunk/latest/admin/Inputsconf
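
As a concrete illustration of the selective fishbucket reset above, here is a minimal sketch for a *nix forwarder; the monitored file path is a hypothetical placeholder, and Splunk should be stopped before the fishbucket is modified:

    # stop the forwarder before touching the fishbucket
    $SPLUNK_HOME/bin/splunk stop

    # make Splunk forget one monitored file so it gets picked up again (hypothetical path)
    $SPLUNK_HOME/bin/splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db \
        --file /var/log/myapp/app.log --reset

    # restart; the file is treated as new and reindexed
    $SPLUNK_HOME/bin/splunk start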


Remark: before reindexing, you may want to remove the existing data in Splunk to avoid duplicates.

Remark: if you are monitoring Windows logs (WinEventLog) or using modular inputs, the checkpoints are not in the fishbucket;
you need to clear the checkpoint files in $SPLUNK_HOME/var/lib/splunk/modinputs/
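
A minimal sketch of that checkpoint cleanup on a *nix forwarder; <input_name> is a placeholder for whichever modular input you need to reset (back the directory up first if unsure):

    # stop Splunk before removing checkpoint files
    $SPLUNK_HOME/bin/splunk stop

    # remove the checkpoint directory of the modular input to re-collect
    rm -r $SPLUNK_HOME/var/lib/splunk/modinputs/<input_name>

    # restart; the input starts over from the beginning
    $SPLUNK_HOME/bin/splunk start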


gabmsmith
Explorer

Would it work to replace the Universal Forwarder with a new one?


gabmsmith
Explorer

This didn't work...


codebuilder
Influencer

You can also trigger the forwarder to re-ingest files by adding or modifying the crcSalt parameter within inputs.conf.

crcSalt = <string>
* Use this setting to force the input to consume files that have matching CRCs
  (cyclic redundancy checks).
    * By default, the input only performs CRC checks against the first 256
      bytes of a file. This behavior prevents the input from indexing the same
      file twice, even though you might have renamed it, as with rolling log
      files, for example. Because the CRC is based on only the first
      few lines of the file, it is possible for legitimately different files
      to have matching CRCs, particularly if they have identical headers.
* If set, <string> is added to the CRC.
* If set to the literal string "<SOURCE>" (including the angle brackets), the
  full directory path to the source file is added to the CRC. This ensures
  that each file being monitored has a unique CRC. When 'crcSalt' is invoked,
  it is usually set to <SOURCE>.
* Be cautious about using this setting with rolling log files; it could lead
  to the log file being re-indexed after it has rolled.
* In many situations, 'initCrcLength' can be used to achieve the same goals.
* Default: empty string.
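
For illustration, a minimal monitor stanza using this setting; the path and sourcetype are hypothetical placeholders:

    [monitor:///var/log/myapp]
    sourcetype = mysourcetype
    # include the full source path in the CRC so every file is treated as unique
    crcSalt = <SOURCE>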
----
An upvote would be appreciated and Accept Solution if it helps!

slierninja
Communicator

To re-index forwarded events for Windows event logs, I had to remove the modinputs file that bookmarks the last RecordId sent to the Splunk indexer. I also deleted the fishbucket, as suggested by @yannK.

Source Path: SplunkUniversalForwarder\var\lib\splunk\modinputs\WinEventLog

<BookmarkList>
  <Bookmark Channel='application' RecordId='48426' IsCurrent='true'/>
</BookmarkList>

Removing the application file and restarting the forwarder will force a reindex of the Windows event logs.
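
A rough sketch of those steps from an elevated command prompt, assuming a default install path, the default SplunkForwarder service name, and the 'application' channel bookmark shown above:

    rem stop the universal forwarder before deleting the bookmark
    net stop SplunkForwarder

    rem remove the WinEventLog checkpoint for the application channel
    del "C:\Program Files\SplunkUniversalForwarder\var\lib\splunk\modinputs\WinEventLog\application"

    rem restart; the channel is re-read from the first available record
    net start SplunkForwarder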


R15
Path Finder

This list has aged quite a bit; is it still accurate? @yannK


yannK
Splunk Employee

@R15 For monitor stanzas, it's still pretty much the same.
However, many new types of inputs exist now (modular, scripted, HEC, etc.) that do not rely on the fishbucket.


lyeager
Engager

In 8.2.6, the btprobe command is missing:

$ ./bin/splunk help commands | grep cmd
    cmd [btool|exporttool|importtool|locktest|locktool|parsetest|pcregextest|signtool|walklex]

EDIT: never mind, it does exist. It's just not listed in the help output.


jgreen12
New Member

Will any of these methods work for re-indexing the data from an API? Many of the resources I've found only mention log files when speaking of re-indexing. My data input is an API. I am able to clean the index for this API, but want to ensure I can re-index all the data.


gjanders
SplunkTrust

jgreen12, please open a new question, as this one is already answered (and very old).

It may or may not relate to modular inputs, and there may be a checkpoint file keeping track of the data it has obtained, but a new question would make more sense here rather than guessing...


sloshburch
Splunk Employee

Agreed. Also, you may need to check with the creator of that particular add-on. Once you create the new question thread, link to it here and we can jump over there.


sloshburch
Splunk Employee

FYI. Now it's _thefishbucket, not _fishbucket. At least in 6.6.2...

season88481
Contributor

Hi, I am using Splunk 6.4; just use:
splunk clean eventdata -index _fishbucket

'-index' is no longer required.


MuS
Legend

$SPLUNK_HOME/bin/splunk clean all still works on 6.4.2 😉


bnorthway
Path Finder

FYI, the Splunk service must be stopped before removing files from the fishbucket.

coleman07
Path Finder

This method doesn't work with the Splunk 6 forwarder, but I found that if you remove all directories in C:\Program Files\SplunkUniversalForwarder\var\lib\splunk, this forces Splunk to reindex all the Windows logs. You have to remove all of them.
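
A rough sketch of that approach from an elevated command prompt, assuming a default install path and the default SplunkForwarder service name; note that this wipes the fishbucket and every input checkpoint on the forwarder:

    rem stop the universal forwarder first
    net stop SplunkForwarder

    rem remove every directory under var\lib\splunk (fishbucket, modinputs checkpoints, etc.)
    for /d %d in ("C:\Program Files\SplunkUniversalForwarder\var\lib\splunk\*") do rd /s /q "%d"

    rem restart; all monitored inputs are read from scratch
    net start SplunkForwarder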
