Getting Data In

Not able to extract _raw data using props.conf and transforms.conf

kulsplunk
Explorer

Hello Splunk Gurus,

I'm extracting data from a database input (using Splunk DBX 3.1.0) and sending it to the index "my_index". When I search in Splunk I see the following output:

Splunk search: index=my_index sourcetype=my_dbx_st source=test_tbl_dbx31_input | table _raw

Output:
2017-08-01 11:01:01.509, access_time="2017-03-30 6:44:16.0", process_id="PROC7678", internal_id="2436", internal_name="Test_Reports", user_id="487657"

The fields are extracted, but each value keeps its key name:

access_time = access_time="2017-03-30 6:44:16.0"
process_id = process_id="PROC7678"
internal_id = internal_id="2436"
internal_name = internal_name="Test_Reports"
user_id = user_id="487657"

props.conf
[my_audit]
SHOULD_LINE_MERGE=false
KV_MODE=auto
REPORT-my_audit_extract=my_audit_extractions

transforms.conf
[my_audit_extractions]
DELIMS = ","
FIELDS = default_time, access_time, process_id, internal_id, internal_name, user_id
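
A quick way to see why this configuration produces those values: DELIMS = "," splits the raw event on commas only, so each token still contains the key="..." text, and that whole token becomes the field value. A minimal Python sketch of that behavior, using the sample event from the search output above:

```python
# Sketch of what DELIMS="," / FIELDS=... does to this event:
# a plain comma split, with each token mapped to a field name.
raw = ('2017-08-01 11:01:01.509, access_time="2017-03-30 6:44:16.0", '
       'process_id="PROC7678", internal_id="2436", '
       'internal_name="Test_Reports", user_id="487657"')

fields = ["default_time", "access_time", "process_id",
          "internal_id", "internal_name", "user_id"]
tokens = [t.strip() for t in raw.split(",")]
extracted = dict(zip(fields, tokens))

# The key= prefix survives the split, which is exactly the
# symptom shown in the question:
print(extracted["access_time"])  # access_time="2017-03-30 6:44:16.0"
```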

Problem
I'm not able to extract the fields from _raw using props.conf and transforms.conf here. Also, notice that I had to add an extra field "default_time" to the FIELDS list, because that value is automatically populated as the first field of the event.

Am I missing any key properties in the props.conf or transforms.conf to get my field extracted properly as following?

access_time="2017-03-30 6:44:16.0"
process_id="PROC7678"
internal_id="2436"
internal_name="Test_Reports"
user_id="487657"

Thanks for your help!

1 Solution

sbbadri
Motivator

@kulsplunk
transforms.conf
[my_audit_extractions]
REGEX = \d+-\d+-\d+\s\d+:\d+:\d+\.\d+,\saccess_time="(\d+-\d+-\d+\s\d+:\d+:\d+\.\d+)",\sprocess_id="(\S+)",\sinternal_id="(\d+)",\sinternal_name="(\S+)",\suser_id="(\d+)"
FORMAT = access_time::$1 process_id::$2 internal_id::$3 internal_name::$4 user_id::$5
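
The REGEX as originally posted contained typos (":+d" instead of ":\d+", an unescaped dot, and inconsistent quote escaping), so it would not match. A corrected pattern can be sanity-checked outside Splunk with Python's re module, which handles these constructs the same way (a sketch, using the sample event from the question):

```python
import re

raw = ('2017-08-01 11:01:01.509, access_time="2017-03-30 6:44:16.0", '
       'process_id="PROC7678", internal_id="2436", '
       'internal_name="Test_Reports", user_id="487657"')

# Corrected version of the REGEX from the answer.
pattern = re.compile(
    r'\d+-\d+-\d+\s\d+:\d+:\d+\.\d+,\s'
    r'access_time="(\d+-\d+-\d+\s\d+:\d+:\d+\.\d+)",\s'
    r'process_id="(\S+)",\sinternal_id="(\d+)",\s'
    r'internal_name="(\S+)",\suser_id="(\d+)"')

m = pattern.match(raw)
# Mirror of FORMAT = access_time::$1 process_id::$2 ...
extracted = dict(zip(
    ["access_time", "process_id", "internal_id",
     "internal_name", "user_id"],
    m.groups()))
print(extracted["process_id"])  # PROC7678
```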

or

index=my_index sourcetype=my_dbx_st source=test_tbl_dbx31_input | table access_time process_id internal_id internal_name user_id

Because you are using KV_MODE=auto, all of the fields will already be extracted, so you don't need the transform at all.

somesoni2
Revered Legend

Your raw data is in classic key=value format, so the fields should already be extracted. Are you setting this up just to get the default_time field? Also, your current transforms.conf entry treats the raw data as CSV, so even the field name appears as part of the value. My suggestion would be to just set up EXTRACT-defaulttime = ^(?<default_time>[^,]+) in props.conf for default_time and get rid of transforms.conf.
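
One syntax detail: Splunk EXTRACT patterns use a named capture group written (?<default_time>...), while Python's re module spells the same thing (?P<name>...). A quick check of the default_time pattern against the sample event:

```python
import re

raw = ('2017-08-01 11:01:01.509, access_time="2017-03-30 6:44:16.0", '
       'process_id="PROC7678", internal_id="2436", '
       'internal_name="Test_Reports", user_id="487657"')

# Splunk form:  ^(?<default_time>[^,]+)
# Python form uses ?P for named groups:
m = re.match(r'^(?P<default_time>[^,]+)', raw)
print(m.group("default_time"))  # 2017-08-01 11:01:01.509
```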

kulsplunk
Explorer

Thanks a lot for your answer! I got rid of transforms.conf and added the EXTRACT-defaulttime to extract the default time.

kulsplunk
Explorer

Thanks much! I just got rid of transforms.conf and it worked fine.
