All Apps and Add-ons

Why are our searches failing with error: Problem replicating bundle to search peer: http_status=400 "Failed to untar the bundle"?

Path Finder

We upgraded some apps and add-ons the other day, and ever since then searches have been breaking with error messages like: "Problem replicating config (bundle) to search peer '<X.X.X.X:8089>', Upload bundle="E:\Splunk\var\run\<bundle_id>.bundle" to peer name=<indexer> uri=https://X.X.X.X:8089 failed; http_status=400 http_description="Failed to untar the bundle="E:\Splunk\var\run\searchpeers\<bundle_id>.bundle". This could be due Search Head attempting to upload the same bundle again after a timeout. Check for sendRcvTimeout message in splunkd.log, consider increasing it."

We increased sendRcvTimeout in distsearch.conf on our search heads from the default of 60 seconds to 300, but the errors persist. Most of our searches return 0 results.
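For anyone following along, this is the change we made. The setting lives in the [replicationSettings] stanza of distsearch.conf on each search head (a sketch; your local file may have other settings in that stanza):

```
# $SPLUNK_HOME/etc/system/local/distsearch.conf (on the search heads)
[replicationSettings]
# Default is 60 seconds; raised to give large bundle uploads more time
sendRcvTimeout = 300
```

A rolling restart of the search heads is needed for the change to take effect.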

Has anyone come across this before? I haven't really seen other posts with that "failed to untar" error message.

I looked through the add-ons and apps included in the bundles, but I didn't see any particularly large lookup .csv files or files in the /bin directories. (I did see one add-on mentioned in splunkd.log with the warning "File length is greater than 260, File creation may fail", immediately followed by the untar failure - I am investigating this further.)

Our architecture includes a search head deployer with 4 SHs, an indexer cluster master with 4 indexers, and a deployment server.

Thank you for any advice!



I don't have any solutions, but we started seeing this on our system this week, though I don't think it followed any updates. Looking for the sendRcvTimeout reference, I notice it has only begun showing up in our Splunk logs within the past month.

The only new thing we're doing is experimenting with the loadjob command to reference some fairly large jobs.

We only have a single indexer and a single search head, running 8.2.1.


Path Finder

Thank you for the response.

Just an update on my end: we discovered that this log line was the indicator of the problem: "File length is greater than 260, File creation may fail". A couple of our add-ons had Python files nested in folder paths so deep that, once prefixed with the search bundle extraction path, the full path exceeded Windows' 260-character limit. It appears to be a Windows-only problem. We renamed some files to shorten the paths and everything started working again.
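In case it helps anyone else, here's a small sketch of how you could scan an app directory before deploying it, flagging files whose path would blow past the 260-character limit once the peer's bundle extraction prefix is prepended. The prefix below is a made-up example; substitute the actual searchpeers path and bundle name from your own environment:

```python
import os

# Hypothetical extraction prefix on a Windows indexer; peers unpack bundles
# under <SPLUNK_HOME>\var\run\searchpeers\<bundle_id>.bundle\...
PEER_PREFIX = r"E:\Splunk\var\run\searchpeers\sh1-1700000000.bundle"
MAX_PATH = 260  # classic Windows path-length limit

def long_paths(app_root, prefix=PEER_PREFIX, limit=MAX_PATH):
    """Yield (path, length) for files that would exceed `limit` on the peer."""
    for dirpath, _dirs, files in os.walk(app_root):
        for name in files:
            rel = os.path.relpath(os.path.join(dirpath, name), app_root)
            full = os.path.join(prefix, rel)
            if len(full) > limit:
                yield full, len(full)

if __name__ == "__main__":
    for path, length in long_paths(r"E:\Splunk\etc\apps"):
        print(f"{length:4d}  {path}")
```

Running it against your apps directory before pushing a bundle should surface the same files that trip the "File length is greater than 260" warning.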


I just had the same problem and this solved it.
