
Why are our searches failing with the error: Problem replicating bundle to search peer: http_status=400 "Failed to untar the bundle"?

eblackburn
Path Finder

We upgraded some apps and add-ons the other day, and ever since then we've been running into searches breaking with error messages like: "Problem replicating config (bundle) to search peer '<X.X.X.X:8089>', Upload bundle="E:\Splunk\var\run\<bundle_id>.bundle" to peer name=<indexer> uri=https://X.X.X.X:8089 failed; http_status=400 http_description="Failed to untar the bundle="E:\Splunk\var\run\searchpeers\<bundle_id>.bundle". This could be due to the Search Head attempting to upload the same bundle again after a timeout. Check for sendRcvTimeout message in splunkd.log, consider increasing it."

We increased sendRcvTimeout in distsearch.conf on our search heads from the default of 60 to 300, but we're still getting the error. Most of our searches come back with 0 results.
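For anyone comparing notes, the change we made looks roughly like this; I'm assuming [replicationSettings] is the right stanza, which is where we put it in $SPLUNK_HOME/etc/system/local/distsearch.conf on each search head:

# distsearch.conf on each search head (system/local, or an app pushed from the deployer)
[replicationSettings]
# timeout in seconds for sending/receiving the knowledge bundle; default is 60
sendRcvTimeout = 300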

Has anyone come across this before? I haven't really seen other posts with that "failed to untar" error message.

I looked through some of the add-ons and apps included in the bundle, but I didn't see any really large lookup .csv files or anything large in the /bin directories. (I did see one add-on mentioned in splunkd.log with the warning "File length is greater than 260, File creation may fail", immediately followed by the untar failing - I'm investigating this further.)
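If anyone wants to check their own environment for the same warning, a search along these lines against the internal index should find it (just a sketch; the exact fields available may differ in your version):

index=_internal sourcetype=splunkd source=*splunkd.log* "File length is greater than 260"
| stats count by host component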

Our architecture is a search head deployer with 4 search heads, an indexer cluster master with 4 indexers, and a deployment server.

Thank you for any advice!


DaClyde
Contributor

I don't have any solutions, but we started seeing this on our system this week, though I don't think it followed any updates. Looking for the sendRcvTimeout reference, I notice it has only begun showing up in our Splunk logs within the past month.

The only new thing we're doing is starting to experiment with the loadjob command to reference some fairly large jobs.
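For context, our usage is along these lines (the saved search name here is just a placeholder):

| loadjob savedsearch="admin:search:my_large_report"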

We only have a single indexer and a single search head.  We are running 8.2.1.


eblackburn
Path Finder

Thank you for the response.

Just an update on my end: we found that this warning was the real indicator of the problem: "File length is greater than 260, File creation may fail". A couple of our add-ons had Python files nested in folder paths so long that, once the search bundle path was prepended, they exceeded the 260-character Windows path limit. It seems to be purely a Windows problem. We ended up renaming some files to shorten the paths, and everything started working again.
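In case it helps anyone else, this is roughly the kind of check we ran against the apps directory to spot the offending files. The prefix length and apps path below are illustrative assumptions; adjust them for the length of your own searchpeers\<bundle_id> path:

import os

# Windows MAX_PATH is 260 characters. The bundle gets extracted under
# something like E:\Splunk\var\run\searchpeers\<bundle_id>\, so every file
# path inside an app only has whatever budget is left after that prefix.
MAX_PATH = 260
BUNDLE_PREFIX_LEN = 60            # assumed length of the searchpeers\<bundle_id> prefix
APPS_DIR = r"E:\Splunk\etc\apps"  # adjust to your installation

for root, _dirs, files in os.walk(APPS_DIR):
    for name in files:
        # path relative to etc\apps, which is roughly what ends up inside the bundle
        rel = os.path.relpath(os.path.join(root, name), APPS_DIR)
        if BUNDLE_PREFIX_LEN + len(rel) > MAX_PATH:
            print(len(rel), rel)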

smurf
Communicator

I just had the same problem and this solved it.
