I am always getting the message below on my search head, even though I wrote:
staylocal = *.csv
maxmemtablebytes = 1000000000
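For reference, this is roughly where I put them (I am assuming these are the correct stanzas):

# distsearch.conf on the search head
[replicationBlacklist]
staylocal = *.csv

# limits.conf on the search head
[lookup]
maxmemtablebytes = 1000000000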
The current bundle directory contains a large lookup file that might cause bundle replication fail.
The path to the directory is
Please help me fix this issue and explain how to troubleshoot it.
Thanks in advance!
Remove the lookup file from the current directory.
Restart the Splunk service to recreate the bundle without the lookup file.
Locate where the lookup file lives (for example, in a particular app) and create a new distsearch.conf there, or specify the full path of the file in the blacklist.
You are blacklisting all the CSV files, so my suggestion is to specify the name of the file and blacklist only the huge one:
staylocal = ...\apps\appsname\filename.csv
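A minimal sketch of how that could look, assuming the blacklist lives in distsearch.conf on the search head (appsname and filename.csv are placeholders for your app and lookup file; the entry name staylocal is arbitrary):

# e.g. $SPLUNK_HOME/etc/system/local/distsearch.conf
[replicationBlacklist]
staylocal = ...\apps\appsname\filename.csv

Restart Splunk after editing so the new blacklist applies to the next bundle.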
I would also change that parameter, because the value you set is too high. From the limits.conf spec:
* Maximum size, in bytes, of static lookup file to use an in-memory index for.
* Lookup files with size above maxmemtablebytes will be indexed on disk
* CAUTION: Setting this to a large value results in loading large lookup
files in memory. This leads to a bigger process memory footprint.
* Default: 10 000 000 (10MB)
I would set it to 50MB, which is a reasonable size:
maxmemtablebytes = 50000000
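A sketch of where that setting would go, assuming the [lookup] stanza in limits.conf on the search head; note that the limits.conf spec spells the setting max_memtable_bytes, so check the exact name for your version:

# limits.conf on the search head
[lookup]
max_memtable_bytes = 50000000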
After you adjust those parameters, restart the Splunk service. I believe the file will no longer be added to the bundle, so copy the file back and check for the error message again.
In order to have access to this file once it is blacklisted, you have to use the lookup command with local=true.
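For example, assuming a lookup definition named my_lookup with fields host and description (placeholder names, not from this thread), the search would look roughly like this:

... | lookup local=true my_lookup host OUTPUT description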
I deleted the file causing the problem, but I am getting the problem again with a new .delta file.
Should I write the blacklist entry with $SPLUNK_HOME, like one of these:
staylocal = $SPLUNK_HOME\apps\appsname\filename.csv
staylocal = apps\appsname\filename.csv
You can do:
staylocal = ...apps\appsname\filename.csv
because Splunk will match against the entire path when you use the ... wildcard.
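To illustrate (this is my reading of how the replication blacklist patterns behave; appsname and filename.csv are placeholders):

[replicationBlacklist]
# '...' matches anything, including path separators, so the file is caught
# no matter how deep it sits under the bundle directory.
# '*' would only match within a single path segment.
staylocal = ...apps\appsname\filename.csv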