All Apps and Add-ons

Why are our searches failing with error: Problem replicating bundle to search peer: http_status=400 "Failed to untar the bundle"?

eblackburn
Path Finder

We upgraded some apps and add-ons the other day, and ever since then, searches have been breaking with error messages stating: "Problem replicating config (bundle) to search peer '<X.X.X.X:8089>', Upload bundle="E:\Splunk\var\run\<bundle_id>.bundle" to peer name=<indexer> uri=https://X.X.X.X:8089 failed; http_status=400 http_description="Failed to untar the bundle="E:\Splunk\var\run\searchpeers\<bundle_id>.bundle". This could be due to the Search Head attempting to upload the same bundle again after a timeout. Check for sendRcvTimeout message in splunkd.log, consider increasing it.".

We increased sendRcvTimeout in distsearch.conf on our search heads from the default of 60 seconds to 300, but the error persists, and most of our searches come back with 0 results.
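For reference, the change looks roughly like this (assuming the setting lives in the [replicationSettings] stanza, per the distsearch.conf spec; placed in a local distsearch.conf on each search head):

```
[replicationSettings]
# Timeout, in seconds, for sending the knowledge bundle to a search peer.
# Default is 60; we raised it to 300.
sendRcvTimeout = 300
```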

Has anyone come across this before? I haven't really seen other posts with that "failed to untar" error message.

I looked through some of the add-ons and apps included in the bundles, but I didn't see any especially large lookup .csv files or large files in the /bin directories. (I did notice one add-on mentioned in splunkd.log with the warning "File length is greater than 260, File creation may fail", immediately followed by the untar failure - I am investigating this further.)

Our architecture includes a search head deployer with 4 search heads, an indexer cluster master with 4 indexers, and a deployment server.

Thank you for any advice!


DaClyde
Contributor

I don't have a solution, but we started seeing this on our system this week, though I don't think it followed any updates. Looking for the sendRcvTimeout reference, I noticed it began showing up in our Splunk logs just in the past month.

The only new thing we're doing is experimenting with the loadjob command to reference some fairly large jobs.

We only have a single indexer and a single search head.  We are running 8.2.1.


eblackburn
Path Finder

Thank you for the response.

Just an update on my end: we discovered that this log line was the indicator of the problem: "File length is greater than 260, File creation may fail". A couple of our add-ons had Python files nested in folder paths so long that, once combined with the search bundle path on the peers, they exceeded the 260-character limit. It seems to be purely a Windows problem (the MAX_PATH limit). We renamed some files to shorten the paths and everything started working again.
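For anyone hitting the same thing, here is a rough sketch of how you could hunt for offending files before they break bundle replication. This is a hypothetical helper, not a Splunk utility; the example bundle-prefix below is made up and should be adjusted to match your peers' SPLUNK_HOME and actual bundle names:

```python
import os

# Windows MAX_PATH limit. On the search peers, the bundle is extracted under
# a path like E:\Splunk\var\run\searchpeers\<bundle_id>.bundle\, so that prefix
# eats into the 260-character budget before the app's own paths are counted.
MAX_PATH = 260

# Example prefix -- a guess for illustration; substitute your own peer path.
EXAMPLE_PREFIX = r"E:\Splunk\var\run\searchpeers\SearchHead-1234567890123.bundle"

def find_long_paths(app_dir, budget=MAX_PATH - len(EXAMPLE_PREFIX) - 1):
    """Yield (length, relative_path) for files whose path inside the
    extracted bundle would exceed the remaining character budget."""
    parent = os.path.dirname(os.path.abspath(app_dir))
    for root, _dirs, files in os.walk(app_dir):
        for name in files:
            rel = os.path.relpath(os.path.join(root, name), parent)
            if len(rel) > budget:
                yield len(rel), rel
```

Running `find_long_paths()` over each app directory on the search head (e.g. under etc/apps) and printing any hits sorted by length pointed us straight at the files worth renaming.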

smurf
Communicator

I just had the same problem and this solved it.
