Hi all,
I am wondering if anyone has implemented multi-value event splitting at index time, similar to the following.
The event below is JSON with a common field, "field1", and an "event" field that contains multiple values. Is it possible, at index time, to ingest each of these values as a single line, with the value of the common field "field1" prepended to each one? For example:
{
"fields": {
"field1": "value0"
},
"event": "2 123456789123 value1 value2 value3 value4\n2 123456789123 value5 value6 value7 value8\n2 123456789123 value9 value10 value11 value12\n2 123456789123 value13 value14 value15 value16\n2"
}
Expected Output:
value0 2 123456789123 value1 value2 value3 value4
value0 2 123456789123 value5 value6 value7 value8
value0 2 123456789123 value9 value10 value11 value12
value0 2 123456789123 value13 value14 value15 value16
Note: I understand this can be done at search time, and I have done that. I am trying to find a way to achieve it at index time using props.conf and transforms.conf.
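For reference, here is a minimal Python sketch of the reshaping I'm after (field names taken from the example above; the filter on incomplete lines is my assumption, since the sample payload ends with a truncated "2"):

```python
import json

def flatten_event(record):
    """Prefix each line of the multi-line "event" with the common
    "fields" value, producing one complete event per line."""
    prefix = record["fields"]["field1"]
    lines = record["event"].split("\n")
    # Drop trailing/incomplete fragments such as the lone "2" in the sample
    return [f"{prefix} {line}" for line in lines if line.count(" ") >= 2]

sample = {
    "fields": {"field1": "value0"},
    "event": ("2 123456789123 value1 value2 value3 value4\n"
              "2 123456789123 value5 value6 value7 value8\n2"),
}
for out in flatten_event(sample):
    print(out)
```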
Splunk does not provide a way to rewrite events at index time (other than regex-based replacement, e.g. SEDCMD or transforms).
Depending on your coding skills, the best way is to write your own modular input and use it to read the logs and reformat them before ingesting them into Splunk.
See http://docs.splunk.com/Documentation/Splunk/latest/AdvancedDev/ModInputsIntro
I understand, Yann, but I am using AWS Kinesis Firehose and a Lambda function to ingest these logs. The metadata is the same for all the events, so there is no point repeating it with each event. I am not sure how a modular input fits this model when the log ingestion path is "CloudWatch Log Group -> Firehose delivery stream -> Lambda -> back to Firehose delivery stream -> Splunk HEC -> Splunk indexers". I might be misunderstanding how and where modular inputs can be used.
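Since a Lambda function is already in the path, one option might be to do the reshaping inside the Firehose transformation Lambda itself. A sketch, assuming the standard Firehose data-transformation record contract (base64-encoded "data", "recordId", "result") and the field names from my example:

```python
import base64
import json

def handler(event, context):
    """Firehose data-transformation Lambda: split each multi-line
    payload into one event per line, prefixed with the common field."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        prefix = payload["fields"]["field1"]
        # Drop incomplete trailing fragments such as the lone "2"
        lines = [l for l in payload["event"].split("\n") if l.count(" ") >= 2]
        flattened = "\n".join(f"{prefix} {l}" for l in lines) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(flattened.encode()).decode(),
        })
    return {"records": output}
```

The records returned this way would reach HEC as plain newline-delimited events, so Splunk would line-break them normally without any index-time transforms.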
Any thoughts?