All Apps and Add-ons

Splunk Add-on for AWS: How to migrate logs from aws:s3 to another sourcetype

saveriobocca
Loves-to-Learn Lots

Hi everyone,

I am currently receiving data/logs via my S3 buckets.

These logs are currently categorized under the sourcetype aws:s3.

I would like to create a condition so that certain files in my S3 bucket are assigned a different sourcetype, with line-by-line parsing applied.

Example from:

index=main
sourcetype=aws:s3

To:

index=main
sourcetype=s3_logs_customer

I wrote this in inputs.conf, but it does not work:

[source::s3://mypath/*_Report_ProdValid_*.csv]
REPORT-s3-logs-customer = s3-logs-customer

[ s3-logs-customer ]
DATETIME_CONFIG=CURRENT
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
KV_MODE=none
category=Structured
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled=false
pulldown_type=true
FIELD_NAMES=id1,id2,id3,id4
FIELD_QUOTE='
FIELD_DELIMITER=,

 

Could you give me an example of the settings to insert in the inputs.conf and transforms.conf files to achieve my goal?

Thanks a lot


scelikok
SplunkTrust

Hi @saveriobocca,

My sample sets the sourcetype to s3_logs_customer. The [s3_logs_customer_override] stanza belongs in transforms.conf. I will put the settings separately below.

Did you put the correct settings in props.conf and transforms.conf?

props.conf

[source::s3://mypath/*_Report_ProdValid_*.csv]
TRANSFORMS-customer_logs = s3_logs_customer_override

[s3_logs_customer]
DATETIME_CONFIG=CURRENT
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
KV_MODE=none
category=Structured
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled=false
pulldown_type=true
FIELD_NAMES=id1,id2,id3,id4
FIELD_QUOTE='
FIELD_DELIMITER=,

transforms.conf

[s3_logs_customer_override]
REGEX = .
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::s3_logs_customer

If this reply helps you, an upvote and "Accept as Solution" are appreciated.

scelikok
SplunkTrust

Hi @saveriobocca,

You can put the props.conf and transforms.conf settings below on the same instance as the AWS input.

props.conf
[source::s3://mypath/*_Report_ProdValid_*.csv]
TRANSFORMS-customer_logs = s3_logs_customer_override

[s3_logs_customer]
DATETIME_CONFIG=CURRENT
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
KV_MODE=none
category=Structured
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled=false
pulldown_type=true
FIELD_NAMES=id1,id2,id3,id4
FIELD_QUOTE='
FIELD_DELIMITER=,

transforms.conf
[s3_logs_customer_override]
REGEX = .
FORMAT = sourcetype::s3_logs_customer
DEST_KEY = MetaData:Sourcetype
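
One point worth noting as a hedged addition: a transforms.conf sourcetype override applies at index time only, so events already indexed keep aws:s3, and the new sourcetype will only appear once matching files are ingested after the configuration is reloaded. New data can then be checked with a search such as (index name taken from the question):

```
index=main sourcetype=s3_logs_customer
| stats count by source
```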

 

If this reply helps you, an upvote and "Accept as Solution" are appreciated.

saveriobocca
Loves-to-Learn Lots

Hi @scelikok, thanks for responding.

I copied your settings to my forwarder, but when I check the sourcetypes on the Search Head I don't see the new sourcetype defined: s3_logs_customer

Why don't I see the sourcetype? Do I need to create the new sourcetype manually through the web interface?

-----

Additionally, how can I specify a different index for storing these events? For example, the original sourcetype aws:s3 belongs to an app that has its own index:

index = myindex
sourcetype = aws:s3

I want:

index = newindex
sourcetype = s3_logs_customer

Is the configuration similar, or do I need to specify the index in another way?
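
For reference, index routing uses the same transform mechanism as the sourcetype override, but with the destination key _MetaData:Index (note the leading underscore). A sketch under the assumptions that newindex already exists on the indexers and that the stanza name s3_logs_customer_index_override is hypothetical:

```ini
# props.conf -- chain both transforms on the same source pattern
[source::s3://mypath/*_Report_ProdValid_*.csv]
TRANSFORMS-customer_logs = s3_logs_customer_override, s3_logs_customer_index_override

# transforms.conf -- route matching events to newindex
# (the index must be created beforehand; otherwise events are dropped
#  or sent to the lastChanceIndex, depending on indexer settings)
[s3_logs_customer_index_override]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = newindex
```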

 

Thanks,

Saverio
