Hello,
Is it possible to set up an SQS consumer on Splunk Cloud?
I have a vendor that drops logs into an S3 bucket that is assigned to me but remains under their control.
They have also set up an SQS queue and shared the credentials with me.
How would you suggest I pull this into Splunk Cloud?
Hi @iherb_0718,
You can use the Splunk Add-on for Amazon Web Services app to ingest logs from an SQS queue.
https://splunkbase.splunk.com/app/1876/
If this reply helps you, an upvote is appreciated.
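For cross-account setups, the add-on's SQS-based S3 input is the usual approach. A rough inputs.conf sketch is below; the stanza and key names may vary by add-on version, the account name, queue URL, and sourcetype are placeholders, and on Splunk Cloud this is normally configured through the add-on's UI rather than by editing files directly:

```ini
# Hypothetical SQS-based S3 input for the Splunk Add-on for AWS.
# "vendor_account" must match an account already configured in the
# add-on; queue URL and sourcetype below are placeholders.
[aws_sqs_based_s3://vendor_logs]
aws_account = vendor_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/vendor-logs
sqs_queue_region = us-east-1
sourcetype = aws:s3
interval = 300
```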
Scelikok, would that still work if the S3 bucket is in the vendor's AWS tenant (account)?
I was under the impression that the app needs to be configured for a particular account, and then you can set up inputs for SQS queues from that account. I believe my company already has that app configured for our own account.
Since the add-on supports multiple accounts and inputs, I don't think it will be a problem.
I've been having some problems getting this to work through the AWS app.
However, I believe I now have a viable solution. The vendor provides a Python script that sets up the SQS consumer. I'll edit this .py script and fill in the AWS credentials and S3 information.
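For reference, the consumer I have in mind looks roughly like the sketch below (using boto3; the queue URL, region, and download directory are placeholders, not the vendor's actual values, and their script will differ in the details):

```python
# Hypothetical sketch of an SQS consumer that downloads new S3 objects
# to a local directory for the universal forwarder to pick up.
import json
import os


def extract_s3_objects(message_body):
    """Parse an S3 event notification from an SQS message body and
    return a list of (bucket, key) pairs."""
    event = json.loads(message_body)
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append((bucket, key))
    return objects


def poll_queue(queue_url, region, download_dir):
    """Long-poll the queue, download each referenced object, then
    delete the message so it is not redelivered."""
    import boto3  # imported here so the parser above works without boto3

    sqs = boto3.client("sqs", region_name=region)
    s3 = boto3.client("s3", region_name=region)
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
        )
        for msg in resp.get("Messages", []):
            for bucket, key in extract_s3_objects(msg["Body"]):
                local_path = os.path.join(download_dir, os.path.basename(key))
                s3.download_file(bucket, key, local_path)
            sqs.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )


if __name__ == "__main__":
    # Placeholder values -- replace with the vendor-supplied details.
    poll_queue(
        "https://sqs.us-east-1.amazonaws.com/123456789012/vendor-logs",
        "us-east-1",
        "/opt/vendor-logs",
    )
```

One caveat: S3 event notifications URL-encode object keys, so a production script may also need to unquote the key before downloading.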
I intend to run this Python script on an Ubuntu 18.04 host that is already running the universal forwarder, so the logs will land on that host.
Will I be able to direct the universal forwarder to monitor a particular directory to collect the logs? Is inputs.conf the file to change?
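From what I've read, a monitor stanza in inputs.conf on the forwarder should do this. A minimal sketch, assuming the script writes to /opt/vendor-logs and the data should go to an index named vendor (path, index, and sourcetype are all placeholders):

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf on the universal forwarder.
# Path, index, and sourcetype below are placeholders.
[monitor:///opt/vendor-logs]
index = vendor
sourcetype = vendor:logs
disabled = false
```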