
Given a Multiline Event, Can I Create KV Fields from Each Row?

mzorzi
Splunk Employee

I have some logs like the following:

13:47:04 -2 receive request [type=0|desc=TimeStamp] <---event one
| [8 ] [BCA3.5]
| [9 ] [56]
| [35 ] [0]
| [49 ] [UBDHKH]
| [56 ] [LEHUBAUS]
| [34 ] [1709]
| [52 ] [20100512-05:47:05]
| [10 ] [182]
13:47:03 -2 sending request [type=0|desc=Request Change] <---event two
| [35 ] [0]
| [57 ] [INADM]
| [56 ] [UBDHKH]
| [49 ] [LEHUBAUS]
| [34 ] [2179]
| [52 ] [20100512-05:47:03]
| [8 ] [BCA3.5]
| [9 ] [0065]
| [10 ] [041]
........

The number on the left-hand side is a code that is meaningful internally. I want to use the left-hand side as the field name and the right-hand side as its value.

I've tried to extract the fields using REPORT- in props.conf, but REPEAT_MATCH=true does not seem to work for search-time extracted fields, and a FORMAT value like FORMAT=fields_$1::$2 is not recognized correctly.
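To make the attempt concrete, here is a sketch of the kind of transforms.conf stanza being described; the stanza name and regex are assumptions based on the sample rows above, not taken from the post:

# transforms.conf -- the attempt that fails: search-time extractions
# do not accept literal text mixed into the key side of FORMAT (fields_$1),
# and REPEAT_MATCH only applies to index-time transforms
[numeric_rows]
REGEX = \|\s+\[(\d+)\s*\]\s+\[([^\]]*)\]
FORMAT = fields_$1::$2
REPEAT_MATCH = true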

1 Solution

gkanapathy
Splunk Employee

Correct; unfortunately, search-time extractions won't let you do anything more exotic than $1::$2 to map keys to values. However, you can actually use numbers as field names (it can get syntactically ambiguous for some commands, but you can deal with those ad hoc). Just add CLEAN_KEYS = false to your transforms.conf stanza and use FORMAT = $1::$2 to create the KV pairs, and you'll wind up with fields named 35, 10, etc.
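A minimal sketch of the working configuration, per the answer above; the stanza name, regex, and sourcetype are assumptions and should be adapted to the actual data:

# transforms.conf
[numeric_rows]
REGEX = \|\s+\[(\d+)\s*\]\s+\[([^\]]*)\]
# plain $1::$2 maps each numeric code to the value that follows it
FORMAT = $1::$2
# keep the purely numeric field names instead of letting Splunk rewrite them
CLEAN_KEYS = false

# props.conf (sourcetype name is hypothetical)
[my_fix_log]
REPORT-numeric_rows = numeric_rows

Since the resulting field names are all digits, they usually need quoting in eval-style expressions, e.g. ... | where '35'="0".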

