Getting Data In

SQS-Based Input from S3: Assign different sourcetypes based on filename

LegalPrime
Path Finder

Hello,

I have a Heavy Forwarder on which I receive logs via the Splunk Add-on for AWS as they appear in my S3 bucket.

I know I will be receiving log files ending with `*_connectionlog_*.gz`, `*_userlog_*.gz` and `*_useractivitylog_*.gz`.

My current input definition looks like this:

[aws_sqs_based_s3://MyApp_RedshiftAuditLogs]
aws_account = redacted
index = myapp_redshiftindex
interval = 300
s3_file_decoder = CustomLogs
sourcetype = aws:cloudtrail
sqs_batch_size = 10
sqs_queue_region = redacted
sqs_queue_url = https://redacted.url
disabled = 0


The problem is that the `sourcetype` setting only takes a single value here. I would like to assign this value based on whether the filename that just came in on the heavy forwarder contains `useractivitylog`, `connectionlog`, or `userlog`.

The plan is that I will later research how to properly parse and extract each of these log types, so that when I eventually search the index I will have some (if not all) of the fields extracted - but I am not at that stage yet.

Questions:

1. Am I approaching this correctly by wanting to assign different sourcetypes to files that are structured differently?

2. How do I assign the sourcetype this way?

3. Will the approach you propose let me write parsing/extraction logic later down the road?

Thank you for your time!

venkatasri
SplunkTrust

Hi @LegalPrime 

If you want to ingest custom logs other than the natively supported AWS log types, you must set `s3_file_decoder = CustomLogs`. This setting lets you ingest custom logs into the Splunk platform instance, but it does not parse the data. To process custom logs into meaningful events, you need additional configuration in `props.conf` and `transforms.conf` to parse the collected data to meet your specific requirements.

For more information on these settings, see /README/inputs.conf.spec under your add-on directory.

https://docs.splunk.com/Documentation/AddOns/released/AWS/SQS-basedS3
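
For example, to assign the sourcetype based on the file name, you can rewrite the sourcetype metadata at index time on the heavy forwarder by matching on the event's source (for SQS-based S3 inputs this is normally the S3 object key). A minimal sketch - the myapp:redshift:* sourcetype names below are placeholders, so adjust them to your own naming convention:

props.conf:

# Applies to events arriving with the sourcetype set in your input stanza
[aws:cloudtrail]
TRANSFORMS-set_redshift_sourcetype = rs_set_connectionlog, rs_set_userlog, rs_set_useractivitylog

transforms.conf:

# Each transform matches a pattern in the source (the S3 key)
# and rewrites the sourcetype before the event is indexed.
[rs_set_connectionlog]
SOURCE_KEY = MetaData:Source
REGEX = _connectionlog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::myapp:redshift:connectionlog

[rs_set_userlog]
SOURCE_KEY = MetaData:Source
REGEX = _userlog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::myapp:redshift:userlog

[rs_set_useractivitylog]
SOURCE_KEY = MetaData:Source
REGEX = _useractivitylog_
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::myapp:redshift:useractivitylog

Deploy both files in an app on the heavy forwarder and restart splunkd. Once each log type carries its own sourcetype, you can attach your future field extractions to those sourcetypes in props.conf.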

Hope this helps!

anwarmmian
Explorer

Try:
`s3_file_decoder = CloudTrail`
