Hello,
We have some JSON being logged via log4j, so part of each event is JSON and part is not. The log4j portion has the timestamp. I can use field extractions to get just the JSON by itself. The users could then use xmlkv to parse the JSON, but I'm looking for this to be done at index time so the users don't need to do this - any suggestions?
Example logs (every line is log4j logging JSON):
2017-01-04 00:00:00.981 [log_level] methodName- {"key1":"value1","key2":"value2","key3":"value3"}
2017-01-04 00:00:00.984 [log_level] methodName- {"key1":"value1"}
2017-01-04 00:00:00.984 [log_level] methodName - {"key1":"value1","key2":"value2"}
Thanks
@rkeenan - Did one of the answers below help provide a solution to your question? If yes, please click "Accept" below the best answer to resolve this post, and upvote anything that was helpful. If no, please leave a comment with more feedback. Thanks.
You can use spath to extract subfields from JSON. Try something like:
<your search>
| rex "^\d{4}-\d\d-\d\d \d\d:\d\d:\d\d\.\d{3} \[(?P<log_level>\w+)\] (?P<method>\w+)\s*-\s*(?P<my_json>\{.*\})$"
| spath input=my_json
This will create log_level, method and my_json, plus the fields extracted from the JSON (in your case you'll get key1, key2 and key3).
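Outside Splunk, the same extraction can be sanity-checked against the sample lines. This is a minimal Python sketch using an equivalent regex (Python's named-group syntax matches the rex syntax); the function and variable names are illustrative, not part of any Splunk API:

```python
import json
import re

# Equivalent of the rex: timestamp, [log_level], methodName, then the JSON tail.
LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d\d-\d\d \d\d:\d\d:\d\d\.\d{3}) "
    r"\[(?P<log_level>\w+)\] (?P<method>\w+)\s*-\s*(?P<my_json>\{.*\})$"
)

def parse_line(line):
    """Split a log4j line into its fields and decode the JSON payload."""
    m = LINE_RE.match(line)
    if m is None:
        return None
    fields = m.groupdict()
    fields["payload"] = json.loads(fields.pop("my_json"))
    return fields

sample = '2017-01-04 00:00:00.981 [log_level] methodName- {"key1":"value1","key2":"value2"}'
print(parse_line(sample))
```

Note that `\s*-\s*` absorbs both the `methodName-` and `methodName -` variants seen in the sample logs.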
<Your Base Search>
| rex field=_raw "\[log_level\] methodName[-\s]+(?<jsonData>.*)"
| table _time jsonData _raw
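As a quick check, here is the same pattern in Python (a sketch; the separator class is written `[-\s]+`, since a `|` inside a character class matches a literal pipe rather than acting as alternation):

```python
import re

# Isolate the JSON tail after the literal "[log_level] methodName" prefix.
JSON_TAIL_RE = re.compile(r"\[log_level\] methodName[-\s]+(?P<jsonData>.*)")

raw = '2017-01-04 00:00:00.984 [log_level] methodName - {"key1":"value1"}'
m = JSON_TAIL_RE.search(raw)
print(m.group("jsonData"))
```

Bear in mind this only works while `log_level` and `methodName` are literal placeholders; real level and method values would need capture groups as in the first answer.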
You can set up automatic key-value pair extraction at search time (index-time extraction is costlier: it slows the indexing process and requires additional disk space) so that users have the fields available to them without any inline extractions. Add this to props.conf/transforms.conf on your search heads.
props.conf
[YourSourceType]
REPORT-kvextract = jsonextract
transforms.conf
[jsonextract]
REGEX = \"(?<_KEY_1>[A-Za-z0-9]+)\":\"(?<_VAL_1>[^\"]+)\"
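To see what pairs that transform would pull out of a raw event, here is a hedged Python sketch of the same pattern (Splunk's `_KEY_1`/`_VAL_1` named groups turn each match into a field; the dict comprehension below plays that role):

```python
import re

# Same pattern as the transforms.conf REGEX: each match is one key/value pair.
KV_RE = re.compile(r'"(?P<key>[A-Za-z0-9]+)":"(?P<val>[^"]+)"')

def extract_pairs(raw):
    """Return every quoted "key":"value" pair found in the raw event."""
    return {m.group("key"): m.group("val") for m in KV_RE.finditer(raw)}

event = '2017-01-04 00:00:00.984 [log_level] methodName - {"key1":"value1","key2":"value2"}'
print(extract_pairs(event))
```

Because the pattern only fires on quoted `"key":"value"` pairs, the log4j prefix is ignored automatically; note it would also miss unquoted numeric or boolean JSON values.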
Dynamic key names will slow things down even more. Why not INDEXED_EXTRACTIONS = json instead?
It's not pure JSON; I'm pretty sure INDEXED_EXTRACTIONS = json would not work.