All Apps and Add-ons

Imperva logs are not contiguous with the Splunk Add-on for AWS

ankit86
Loves-to-Learn

Hello,

I need some help with an Imperva to Splunk Cloud integration. I am using the Splunk Add-on for AWS on my cloud search head, and from there I configured the Imperva account using the Key ID and Secret ID for the Imperva S3 bucket.

For the input I am using Incremental S3. Logs are coming into Splunk Cloud, but some are missing: I can see logs in the AWS S3 bucket that are somehow not being ingested into Splunk Cloud. I could not find anything helpful online, which is why I am posting the question here.

Please advise, somebody.

 

Thank you.


Meett
Splunk Employee

Hello @ankit86 Do you have versioning enabled on the S3 side? Are you sure you selected the correct bucket name while creating the input? This article may help: https://splunk.my.site.com/customer/s/article/Gneric-S3-input-which-configured-in-Splunk-Add-on-for-... 

=====

Karma and a marked solution are appreciated if this helps you.

 


Meett
Splunk Employee

Hello @ankit86 , it's hard to answer without looking at internal logs. We should check the internal logs of the inputs you have configured and identify possible ERRORs.


ankit86
Loves-to-Learn

Hello @Meett ,

Thank you for replying. Here is the error I noticed yesterday, though I am not sure whether it is relevant:

 

14/11/2024 18:43:31.066

2024-11-14 13:13:31,066 level=ERROR pid=3929073 tid=Thread-4 logger=splunk_ta_aws.modinputs.generic_s3.aws_s3_data_loader pos=aws_s3_data_loader.py:index_data:114 | datainput="Imperva" bucket_name="imperva-XXXX-XXXXX" | message="Failed to collect data through generic S3." start_time=1731590010 job_uid="8ecfb3a2-5c70-4b1a-b7d7-f0b0fb3dfb94"
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/generic_s3/aws_s3_data_loader.py", line 108, in index_data
    self._do_index_data()
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/generic_s3/aws_s3_data_loader.py", line 131, in _do_index_data
    self.collect_data()
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/generic_s3/aws_s3_data_loader.py", line 181, in collect_data
    self._discover_keys(index_store)
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/generic_s3/aws_s3_data_loader.py", line 304, in _discover_keys
    for key in keys:
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/generic_s3/aws_s3_common.py", line 98, in get_keys
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
  File "/opt/splunk/etc/apps/Splunk_TA_aws/lib/botocore/paginate.py", line 269, in __iter__
    response = self._make_request(current_kwargs)
  File "/opt/splunk/etc/apps/Splunk_TA_aws/lib/botocore/paginate.py", line 357, in _make_request
    return self._method(**current_kwargs)
  File "/opt/splunk/etc/apps/Splunk_TA_aws/lib/botocore/client.py", line 535, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/opt/splunk/etc/apps/Splunk_TA_aws/lib/botocore/client.py", line 983, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (PermanentRedirect) when calling the ListObjectsV2 operation: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
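The PermanentRedirect at the bottom of that traceback is what S3 returns when a bucket is addressed through an endpoint for the wrong region; the redirect names the endpoint the bucket must actually be reached through, and that endpoint should match the AWS region selected in the add-on's input configuration. As a minimal sketch (the helper name is mine, not part of the add-on or botocore), the bucket's real region can be read back out of such an endpoint string:

```python
import re


def region_from_redirect_endpoint(endpoint):
    """Derive a bucket's region from the endpoint named in an S3
    PermanentRedirect, e.g. 'bucket.s3.eu-west-1.amazonaws.com'.
    Returns None for the legacy global endpoint or unrecognized hosts."""
    # Regional endpoints look like "<bucket>.s3.<region>.amazonaws.com"
    # or, in the older dashed style, "<bucket>.s3-<region>.amazonaws.com".
    m = re.search(r"\.s3[.-]([a-z0-9-]+)\.amazonaws\.com$", endpoint)
    return m.group(1) if m else None


# Example (hypothetical endpoint): compare this region against the one
# configured on the Incremental S3 input.
print(region_from_redirect_endpoint("imperva-xxxx.s3.eu-west-1.amazonaws.com"))
```

If the region this yields differs from the region on the input, correcting the input's region (or recreating the account/input with the right one) is the usual fix for this class of error.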
