Getting Data In

How do I configure props.conf for a file with a header in it?

Dingu
Explorer

Hello,

I'm trying to configure props.conf for a file that contains a header. I don't have a props.conf configured yet and would appreciate help setting one up. Thanks in advance.

example logfile:

Field1 Field2 Field3 Field4 Field5 Field6 Field7
------+------+---------------------------+---------------------------+---------------------------
0 1 6/16/20 18:35:23:193 EDT 6/16/20 18:35:23:193 EDT xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
1 1 6/16/20 18:35:23:216 EDT 6/16/20 18:35:23:216 EDT yyyyyyyyyyyyyyyyyyyyyy
2 1 6/16/20 18:35:23:285 EDT 6/16/20 18:35:23:285 EDT zzzzzzzzzzzzzzzzz


richgalloway
SplunkTrust
Will you be ingesting this file in batch mode or monitoring it?
Do you have any control over how it is created? There are 7 fields in the header, but 9 or more in the data. Changing it to a comma- or pipe-delimited format would make onboarding easier.
Also, see if you can get rid of the second line. If you can't, we can use SEDCMD to avoid indexing it.
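To illustrate why: a pipe-delimited file with a single header row could be onboarded with built-in structured-data parsing alone, something like this (the sourcetype name is a placeholder, and this assumes the file could be rewritten as pipe-delimited):

[mysourcetype]
# Built-in pipe-separated-value parsing; field names come from the header row
INDEXED_EXTRACTIONS = psv
HEADER_FIELD_LINE_NUMBER = 1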
---
If this reply helps you, Karma would be appreciated.

Dingu
Explorer

Thank you for your response @richgalloway 

The file will be monitored. Sadly, I don't have any control over the way it is created, and the number of header fields will match the number of field values in the data.
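For reference, I plan to monitor it with something like this in inputs.conf (the path here is a placeholder):

[monitor:///var/log/myapp/app.log]
sourcetype = mysourcetype
disabled = false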


richgalloway
SplunkTrust

It's unfortunate you can't change how the log is written, but we can work with it.  You'll need to write a transform to parse the file and use props to ignore the header lines.

props.conf:

[mysourcetype]
# Null out the header line and the dashed separator line at index time
SEDCMD-header1 = s/^Field1.*//
SEDCMD-header2 = s/^[-+]+$//
# Index-time field extraction, defined in transforms.conf
TRANSFORMS-fields = mytransform

transforms.conf:

[mytransform]
# One capture group per header column; the two timestamp columns each span date, time, and timezone
REGEX = (\d+)\s+(\d+)\s+(\d+\/\d+\/\d+)\s+(\d\d:\d\d:\d\d:\d\d\d \w+)\s+(\d+\/\d+\/\d+)\s+(\d\d:\d\d:\d\d:\d\d\d \w+)\s+(.*)
FORMAT = Field1::$1 Field2::$2 Field3::$3 Field4::$4 Field5::$5 Field6::$6 Field7::$7
# Required so the extracted fields are written to the index
WRITE_META = true
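Note these are index-time settings, so they require a restart on the indexing tier and apply only to data indexed afterward. Once events are coming in, a search along these lines should show the extracted fields (the index name is just an assumption):

index=main sourcetype=mysourcetype | table Field1 Field2 Field3 Field4 Field5 Field6 Field7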

 

---
If this reply helps you, Karma would be appreciated.