Getting Data In

SQS Based Input from S3: Assign different sourcetype based on filename

LegalPrime
Path Finder

Hello,

I have a Heavy Forwarder on which I receive logs via the Splunk Add-on for AWS as they appear in my S3 bucket.

I know I will be receiving log files ending with `*_connectionlog_*.gz`, `*_userlog_*.gz` and `*_useractivitylog_*.gz`.

My current input definition looks like this:

```
[aws_sqs_based_s3://MyApp_RedshiftAuditLogs]
aws_account = redacted
index = myapp_redshiftindex
interval = 300
s3_file_decoder = CustomLogs
sourcetype = aws:cloudtrail
sqs_batch_size = 10
sqs_queue_region = redacted
sqs_queue_url = https://redacted.url
disabled = 0
```


The problem is that the `sourcetype` field only takes a single value here. I would like to assign that value based on whether the filename that just arrived on the Heavy Forwarder contains `useractivitylog`, `connectionlog`, or `userlog`.

The vision is that I will later research how to properly parse and extract each of these log types, so that when I eventually search the index, some (if not all) of the fields will already be extracted - but I am not at that stage yet.

Questions:

1. Am I approaching this correctly by wanting to assign different sourcetypes to files that are structured differently?

2. How do I do this assigning thing?

3. Will the path that you propose enable me to write some parsing/extraction logic later down the road?

Thank you for your time!


venkatasri
SplunkTrust

Hi @LegalPrime 

If you want to ingest custom logs other than the natively supported AWS log types, you must set `s3_file_decoder = CustomLogs`. This setting lets you ingest custom logs into the Splunk platform instance, but it does not parse the data. To process custom logs into meaningful events, you need to perform additional configuration in `props.conf` and `transforms.conf` to parse the collected data to meet your specific requirements.
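As a minimal sketch, a sourcetype override keyed on the incoming filename (the `source` metadata field) could look roughly like this. The three `aws:redshift:*` sourcetype names and the stanza names are made up for illustration; the transforms attach to `aws:cloudtrail` because that is the sourcetype currently set in your input stanza:

```
# props.conf (on the Heavy Forwarder)
[aws:cloudtrail]
TRANSFORMS-redshift_sourcetype = set_st_connectionlog, set_st_userlog, set_st_useractivitylog

# transforms.conf -- each stanza matches on the source (the S3 key/filename)
# and rewrites the sourcetype accordingly
[set_st_connectionlog]
SOURCE_KEY = MetaData:Source
REGEX = _connectionlog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::aws:redshift:connectionlog

[set_st_userlog]
SOURCE_KEY = MetaData:Source
REGEX = _userlog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::aws:redshift:userlog

[set_st_useractivitylog]
SOURCE_KEY = MetaData:Source
REGEX = _useractivitylog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::aws:redshift:useractivitylog
```

The three filename patterns are mutually exclusive (`_userlog_` is not a substring of `_useractivitylog_`), so the order of the transforms does not matter. Once each log type has its own sourcetype, you can hang field extractions off those sourcetype stanzas in `props.conf` later.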

For more information on these settings, see `/README/inputs.conf.spec` under your add-on directory.

https://docs.splunk.com/Documentation/AddOns/released/AWS/SQS-basedS3

Hope this helps!


anwarmmian
Explorer

Try:

```
s3_file_decoder = CloudTrail
```
