
[SmartStore] 502 Bad Gateway when attempting to connect

emallinger
Communicator

Hello,

Does anyone have a suggestion as to why I can't seem to connect to my S3 storage?

Here's the log:

S3Client - command=put transactionId=0x7f689102dc00 rTxnId=0x7f6891034100 status=completed success=N uri=https://--------.fr/bucket_name/all_indexes/_metrics/db/73/ae/19~BD80BC05-71FE-4234-AE23-58B4ED6CA82D/guidSplunk-BD80BC05-71FE-4234-AE23-58B4ED6CA82D/Sources.data statusCode=502 statusDescription="Bad Gateway"

I've successfully used a similar configuration in Splunk on another machine with a Zenko container.

Here I'm trying to connect to a "real" S3 Scality RING via a VS entry that acts as a load balancer.

I've got an access/secret key pair with write permissions.

And I can access the S3 storage from the same machine via the CLI with the "aws s3" command.
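
For example, roughly this kind of test works fine from that machine (the bucket and endpoint names below are just the same placeholders as in my conf):

aws s3 ls s3://bucket_name/all_indexes/ --endpoint-url https://myendpoint.fr
aws s3 cp ./testfile s3://bucket_name/all_indexes/testfile --endpoint-url https://myendpoint.fr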

 

Here's my indexes.conf:

[volume:remote_store]
storageType = remote
path = s3://bucket_name/all_indexes/
remote.s3.endpoint = https://myendpoint.fr
remote.s3.access_key = access_key (replaced)
remote.s3.secret_key = secret_key
maxVolumeDataSizeMB = 10000000

[default]
remotePath = volume:remote_store/$_index_name
repFactor = auto
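
In case it matters, I haven't set any other remote.s3.* options. From the indexes.conf spec these look like the ones that can be relevant for a non-AWS endpoint, but I'm not sure which (if any) I actually need here (values below are only examples):

[volume:remote_store]
remote.s3.signature_version = v4
remote.s3.auth_region = us-east-1
remote.s3.supports_versioning = false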

 

Do you have any suggestions?

Is my conf faulty somewhere?

Could the endpoint VS be the cause? Does it need to have its "HTTP profile" set to none?

 

Thank you in advance for your ideas and pointers.

Regards,

Ema
