Hey everyone.
Need some help breaking up a JSON event that is ingested in the following nested format:
[
  {
    "title": "Bad Stuff",
    "count": 2,
    "matches": [
      {
        "EventID": 13,
        "EventRecordID": 19700,
        "User": "NT AUTHORITY\\SYSTEM"
      },
      {
        "EventID": 16,
        "EventRecordID": 21700,
        "User": "NT AUTHORITY\\ADMIN"
      }
    ]
  },
  {
    "title": "Next Bad Stuff",
    "count": 2,
    "matches": [
      {
        "EventID": 14,
        "EventRecordID": 19700,
        "User": "NT AUTHORITY\\SYSTEM"
      },
      {
        "EventID": 17,
        "EventRecordID": 21700,
        "User": "NT AUTHORITY\\ADMIN"
      }
    ]
  }
]
I would like to break it into separate events like this:
{
  "title": "Bad Stuff",
  "count": 2,
  "EventID": 13,
  "EventRecordID": 19700,
  "User": "NT AUTHORITY\\SYSTEM"
}
{
  "title": "Bad Stuff",
  "count": 2,
  "EventID": 16,
  "EventRecordID": 21700,
  "User": "NT AUTHORITY\\ADMIN"
}
{
  "title": "Next Bad Stuff",
  "count": 2,
  "EventID": 14,
  "EventRecordID": 19700,
  "User": "NT AUTHORITY\\SYSTEM"
}
{
  "title": "Next Bad Stuff",
  "count": 2,
  "EventID": 17,
  "EventRecordID": 21700,
  "User": "NT AUTHORITY\\ADMIN"
}
What would I need in my props.conf and transforms.conf to achieve this?
Thanks in advance, Splunk community!
Cheers.
Quick answer is you can't.
The long answer is that Splunk can do some JSON parsing and manipulation at ingest time, and you might be able to use some fancy ingest-time evals to pull the field values out of the event, but there isn't really enough "structural" processing available for Splunk to rebuild the events in a completely different shape.
If you really need that much JSON manipulation, do it before the data is ingested into Splunk, by means of an external script (or a scripted/modular input).
That goes especially for breaking a single event into multiple ones: Splunk processes one event at a time, so there is no reasonable way to split an event in the middle of the ingestion pipeline (after the data has already been broken into events).
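To illustrate the external-script route, here is a minimal sketch of a pre-ingest flattening step, assuming the alerts arrive as a file in exactly the nested layout shown above; the file names are hypothetical and you would adapt the I/O to however the data actually reaches you:

#!/usr/bin/env python3
# Minimal sketch of a pre-ingest flattening script (hypothetical file names).
# Reads the nested JSON array shown above and writes one flat JSON object per
# line, which Splunk can then break into events with an ordinary JSON sourcetype.
import json

INPUT_FILE = "alerts_nested.json"    # hypothetical input path
OUTPUT_FILE = "alerts_flat.json"     # hypothetical output path, monitored by Splunk

with open(INPUT_FILE) as f:
    alerts = json.load(f)

with open(OUTPUT_FILE, "w") as out:
    for alert in alerts:
        # Keep the top-level fields (title, count, ...) except the matches array
        base = {k: v for k, v in alert.items() if k != "matches"}
        # Emit one event per entry in matches, merged with the top-level fields
        for match in alert.get("matches", []):
            out.write(json.dumps({**base, **match}) + "\n")

The same logic would fit inside a scripted or modular input if you would rather have Splunk drive the collection.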
Thanks heaps! I thought as much, but better to ask just in case there was some crafty way to do it.
Appreciate your help 🙂
Response to myself - I can think of one way to split an event after it has already passed the event-breaking stage.
You would have to craft a new event and reroute it to a syslog output pointing back at an input on the same indexer/forwarder, where it would effectively get broken into events again.
But it's a very, very ugly solution and no one in their sane mind should ever try to do so!
I'm only writing about it as a proof of concept; it's definitely not usable in production.
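For completeness, the routing plumbing for that proof of concept would look roughly like the sketch below. All stanza, group, port and sourcetype names are made up, and the genuinely ugly part - the ingest-time rewriting of _raw into multiple newline-separated records before it leaves via syslog - is deliberately omitted:

# props.conf (hypothetical sourcetype of the original nested event)
[my:nested:alerts]
TRANSFORMS-reroute = send_back_via_syslog

# transforms.conf - route every event of that sourcetype to a syslog output group
[send_back_via_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = loopback_syslog

# outputs.conf - syslog output pointing back at the same host
[syslog:loopback_syslog]
server = 127.0.0.1:10514

# inputs.conf - the data comes back in and goes through line breaking again
[udp://10514]
sourcetype = my:flattened:alerts

Again, this is only to show that the loop is technically possible - don't do this.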