Right now I have a JSON file that's formatted like this:
{
"Log Files":[
{"Date":"2014-07-18 21:22:51", "Available Bytes(kb)":3960078, ...},
{"Date":"2014-07-18 21:24:01", "Available Bytes(kb)":4001231, ...},
{"Date":"2014-07-18 21:25:14", "Available Bytes(kb)":3872959, ...}]}
Right now it's showing up in Splunk as:
timestamp Date Available Bytes(kb)
2014-07-18 21:22:51:000 2014-07-18 21:22:51 3960078
2014-07-18 21:24:01 4001231
2014-07-18 21:25:14 3872959
How can I split these up into individual events when I load the data? I can get the timestamp to correctly match the Date field, but Splunk still assigns a single timestamp to the whole file, even though the file contains several entries that should each be individual events.
Try something like this,
<your base search...> | table timestamp, Date, Available | eval temp=mvzip(timestamp, mvzip(Date, Available,"###"), "###") | mvexpand temp | rex field=temp "(?<timestamp>.*)###(?<Date>.*)###(?<Available>.*)" | fields - temp
I have used sample field names here; substitute your actual fields. The concept is: zip the multivalue fields together with a delimiter (### here), expand the result into separate events with mvexpand, then extract the fields back out with rex. That does the magic.
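The zip → expand → extract flow may be easier to see outside SPL. Here is a rough Python analogue of what the search above does (the field names and values are the sample ones from the question; this is only an illustration of the concept, not how Splunk executes it internally):

```python
# Parallel multivalue fields, as they would appear in a single Splunk event
timestamp = ["2014-07-18 21:22:51", "2014-07-18 21:24:01", "2014-07-18 21:25:14"]
date      = ["2014-07-18 21:22:51", "2014-07-18 21:24:01", "2014-07-18 21:25:14"]
available = ["3960078", "4001231", "3872959"]

# mvzip: join the parallel values with a delimiter into one multivalue field
temp = ["###".join(vals) for vals in zip(timestamp, date, available)]

# mvexpand + rex: one event per zipped value, split back into named fields
events = [dict(zip(("timestamp", "Date", "Available"), t.split("###")))
          for t in temp]

for e in events:
    print(e)
```

Each element of `events` now corresponds to one expanded Splunk event with its own timestamp.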
Is there anything in the envelope of this array that you want to keep, or are you only interested in keeping the events inside the "Log Files" array?
If the latter, set up props/transforms for your sourcetype to break the data into a new event before each occurrence of:
\{\"Date\":\"
That should result in individual, valid JSON events that should render fine in the UI.
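As a starting point, a minimal props.conf sketch of that idea might look like the following. The sourcetype name is a placeholder, and the exact separator regex and envelope-stripping SEDCMDs are assumptions you should verify against your sample data before relying on them:

```ini
# props.conf -- sourcetype name is a placeholder
[my_json_logs]
SHOULD_LINEMERGE = false
# Break before each {"Date":" -- the separator captured in group 1 is discarded
LINE_BREAKER = ([,\s]+)\{\"Date\":\"
# Strip the surrounding {"Log Files":[ ... ]} envelope so each event parses
# as standalone JSON (assumption: the envelope is not needed)
SEDCMD-strip_head = s/^\{\"Log Files\":\[//
SEDCMD-strip_tail = s/\]\}$//
```

With `SHOULD_LINEMERGE = false`, Splunk breaks solely on LINE_BREAKER, so each `{"Date": ...}` object becomes its own event and the Date field can drive the timestamp.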
When you say "... when I load the data" do you mean at search time or index time?