All Apps and Add-ons

Unable to change default sourcetype for Imperva logs in the Splunk Add-on for AWS

vgugale
Engager

We are ingesting Imperva logs from an SQS queue in our AWS environment via the Splunk Add-on for AWS. We want to use a custom sourcetype for these logs, i.e. "imperva:incapsula", instead of the default sourcetype "aws:s3:accesslogs".
We made the change to inputs.conf on the backend and restarted the service. The change is reflected in the UI, and the internal log event below shows the input tagged with the new sourcetype, but the events are still being indexed under the old sourcetype, aws:s3:accesslogs.
We have tried several things, such as creating a new custom input with the new sourcetype and creating a props.conf for the new sourcetype under the system/local directory, but it didn't help; the logs are still indexed under the default sourcetype "aws:s3:accesslogs".

Internal logs after making the changes

"2022-01-28 09:33:58,959 level=INFO pid=10133 tid=MainThread logger=splunk_ta_aws.modinputs.sqs_based_s3.handler pos=handler.py:run:635 | datainput="imperva-waf-log" start_time=1643362438 | message="Data input started." aws_account="SplunkProdCrossAccountUser" aws_iam_role="aee_splunk_prd" disabled="0" host="ip-172-27-201-15.ec2.internal" index="corp_imperva" interval="300" python.version="python3" s3_file_decoder="S3AccessLogs" sourcetype="imperva:incapsula" sqs_batch_size="10" sqs_queue_region="**-1" sqs_queue_url="https://***/aee-splunk-prd-imperva-waf" using_dlq="1""

props.conf

[imperva:incapsula]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)CEF:\d\|
NO_BINARY_CHECK=true
TIME_FORMAT=%s%3N
TIME_PREFIX=start=
MAX_TIMESTAMP_LOOKAHEAD=128

inputs.conf

[aws_sqs_based_s3://imperva-waf-log]
aws_account = SplunkProdCrossAccountUser
aws_iam_role = aee_splunk_prd
index = corp_imperva
interval = 300
s3_file_decoder = S3AccessLogs
#sourcetype = aws:s3:accesslogs
sourcetype = imperva:incapsula
sqs_batch_size = 10
sqs_queue_region = ***-1
sqs_queue_url = https://**/aee-splunk-prd-imperva-waf
using_dlq = 1
disabled = 0

Has anyone faced a similar issue?

1 Solution

vgugale
Engager

This has been resolved. We had to select s3_file_decoder = CustomLogs and then set the custom sourcetype.
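For reference, the working stanza would look roughly like this (a sketch based on the fix above; "CustomLogs" is the conf-file value corresponding to the "Custom Logs" decoder option in the add-on's UI, so verify the exact spelling against your add-on version):

[aws_sqs_based_s3://imperva-waf-log]
aws_account = SplunkProdCrossAccountUser
aws_iam_role = aee_splunk_prd
index = corp_imperva
interval = 300
# With the S3AccessLogs decoder the add-on forces the aws:s3:accesslogs
# sourcetype; the custom-logs decoder honors the sourcetype set here.
s3_file_decoder = CustomLogs
sourcetype = imperva:incapsula
sqs_batch_size = 10
disabled = 0

The sqs_queue_region and sqs_queue_url settings stay as in the original input.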
