Deployment Architecture

Need help with bundle replication failure

satyaallaparthi
Communicator

Hello,

I keep getting the message below on my search head, even though I have configured the following:

distsearch.conf:

[replicationBlacklist]
staylocal = *.csv

limits.conf:
[lookup]
max_memtable_bytes = 1000000000

The current bundle directory contains a large lookup file that might cause bundle replication fail.
The path to the directory is
C:\Program Files\Splunk\var\run\61CFC563-07F6-44A0-9DFF-31D6A01BA6D9-1571219534-1571220067.delta.

Please help me fix this issue and explain how to troubleshoot it.

Thanks in Advance!
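As a starting point for troubleshooting, the bundle replication warnings also land in splunkd.log on the search head, so they can be reviewed from the _internal index. A minimal SPL sketch, assuming the relevant messages contain the word "bundle" (adjust the keyword to match the exact warning text you see):

index=_internal sourcetype=splunkd (log_level=WARN OR log_level=ERROR) bundle
| sort - _time
| table _time log_level component _raw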

1 Solution

ivanreis
Builder

Remove the lookup file from the current bundle directory, then restart the Splunk service to recreate the bundle without the lookup file.

Locate where the lookup file lives (for example, inside a particular app) and either create a new distsearch.conf there or specify the full path of the file. Right now you are blacklisting all .csv files, so my suggestion is to name the file explicitly and blacklist only this huge one:

[replicationBlacklist]
staylocal = ...\apps\appsname\filename.csv

I would also change max_memtable_bytes, because the value you set (1000000000, roughly 1GB) is too high. From the limits.conf spec:

max_memtable_bytes =
* Maximum size, in bytes, of static lookup file to use an in-memory index for.
* Lookup files with size above max_memtable_bytes will be indexed on disk.
* CAUTION: Setting this to a large value results in loading large lookup
  files in memory. This leads to a bigger process memory footprint.
* Default: 10000000 (10MB)

I would set it to 50MB, which is a reasonable size:

limits.conf:
[lookup]
max_memtable_bytes = 50000000

After you adjust those parameters, restart the Splunk service. I believe the file will no longer be added to the bundle, so copy the file back and check for the error messages.

In order to keep using this lookup once it is blacklisted from replication, you have to run the lookup command with local=true so that the lookup executes on the search head.
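For example, a minimal sketch of that usage; the lookup name (large_lookup) and fields (user, department) are placeholders for whatever your lookup definition actually uses:

... | lookup local=true large_lookup user OUTPUT department

local=true forces the lookup to run on the search head instead of on the search peers, which is why the file no longer needs to travel in the replication bundle.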


ivanreis
Builder

Both configurations will work:

staylocal = $SPLUNK_HOME\apps\appsname\filename.csv

staylocal = apps\appsname\filename.csv


ivanreis
Builder

You can do:
staylocal = ...apps\appsname\filename.csv

because Splunk matches against the entire path when you use the ... wildcard.



satyaallaparthi
Communicator

I deleted the file that was causing the problem, but I am getting the problem again with a new .delta file.
Do I need to write the blacklist entry with $SPLUNK_HOME:
staylocal = $SPLUNK_HOME\apps\appsname\filename.csv

or just

staylocal = apps\appsname\filename.csv
