Hi @rahulkumar, as I said, you have to extract the metadata from the JSON using INGEST_EVAL and then write the original log field back into _raw. First analyze your JSON logstash log and identify the metadata to use, then create INGEST_EVAL transformations that assign the original metadata to your Splunk metadata, e.g. something like this (do adapt it to your log format).

In props.conf:

[source::http:logstash]
TRANSFORMS-00 = securelog_set_default_metadata
TRANSFORMS-01 = securelog_set_sourcetype_by_regex
TRANSFORMS-02 = securelog_override_raw

The first transform assigns the default metadata, the second one sets the correct sourcetype, and the third one overrides _raw.

In transforms.conf:

[securelog_set_default_metadata]
INGEST_EVAL = host := coalesce( json_extract(_raw, "hostname"), json_extract(_raw, "host.name"), json_extract(_raw, "host.hostname"))
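If you want to check the coalesce logic before deploying, you can run the same eval functions at search time (a quick sketch; both sample events below are invented, and json_extract is also available as a search-time eval function on recent Splunk versions):

| makeresults count=2
| streamstats count AS n
| eval _raw=if(n=1, "{\"hostname\":\"web01\"}", "{\"host\":{\"name\":\"web02\"}}")
| eval host=coalesce(json_extract(_raw, "hostname"), json_extract(_raw, "host.name"), json_extract(_raw, "host.hostname"))
| table _raw host

The first event resolves host from the flat hostname key, the second from the nested host.name path, which is exactly why the three-way coalesce is there.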
[securelog_set_sourcetype_by_regex]
INGEST_EVAL = sourcetype := case( match(_raw, "\"path\":\"/var/log/audit/audit.log\""), "linux_audit", match(_raw, "\"path\":\"/var/log/secure\""), "linux_secure")
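One caveat: case() returns null when no condition matches, so events from other paths should keep whatever sourcetype they already had. If you instead want a catch-all sourcetype for them, you can append a true() default, something like this (the logstash_other name is just a placeholder, pick your own):

[securelog_set_sourcetype_by_regex]
INGEST_EVAL = sourcetype := case( match(_raw, "\"path\":\"/var/log/audit/audit.log\""), "linux_audit", match(_raw, "\"path\":\"/var/log/secure\""), "linux_secure", true(), "logstash_other")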
[securelog_override_raw]
INGEST_EVAL = _raw := if( sourcetype LIKE "linux%", json_extract(_raw, "application_log"), _raw )

The first transform extracts the host from the JSON, the second one assigns the sourcetype based on information in the metadata (Linux sourcetypes in this example), and the third one takes one field of the JSON (here application_log) as the new _raw. It wasn't an easy job and it took a very long time, so I suggest engaging Splunk PS or a Core Consultant who has already done this.

Ciao.
Giuseppe
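P.S.: if it helps, here is a quick search-time simulation of the whole chain, which should behave like the three INGEST_EVAL transforms above (the sample event and all its values are invented; at index time you would of course use the transforms, not a search):

| makeresults
| eval _raw="{\"hostname\":\"web01\",\"path\":\"/var/log/audit/audit.log\",\"application_log\":\"type=USER_LOGIN msg=audit(1650000000.000:42): pid=1234 uid=0 res=success\"}"
| eval host=coalesce(json_extract(_raw, "hostname"), json_extract(_raw, "host.name"), json_extract(_raw, "host.hostname"))
| eval sourcetype=case(match(_raw, "\"path\":\"/var/log/audit/audit.log\""), "linux_audit", match(_raw, "\"path\":\"/var/log/secure\""), "linux_secure")
| eval _raw=if(sourcetype LIKE "linux%", json_extract(_raw, "application_log"), _raw)
| table host sourcetype _raw

You should see host=web01, sourcetype=linux_audit, and _raw reduced to the application_log content.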