I'm trying to parse the following JSON data into a timechart "by label". The "data" section is a timestamp and a value. I've managed to get each series into its own event, but I can't seem to parse anything below the series level:
{
  "9": {
    "series": [
      {
        "label": "content",
        "data": [
          [ 1493673985000, 10 ],
          [ 1493673990000, 10 ],
          [ 1493673995000, 10 ]
        ]
      },
      {
        "label": "filters",
        "data": [
          [ 1493673985000, 3 ],
          [ 1493673990000, 3 ],
          [ 1493673995000, 3 ]
        ]
      },
      {
        "label": "total",
        "data": [
          [ 1493673985000, 14 ],
          [ 1493673990000, 14 ],
          [ 1493673995000, 14 ]
        ]
      }
    ]
  }
}
By using | spath output=series path="9.series{}" I'm able to get each series split into its own event. I think I just need to extract label, mvzip it with data{0} and data{1}, and then mvexpand that, but I can't seem to find the syntax that works for that. Then I'd like something that allows me to do something like | eval _time=timestamp | timechart max(value) by label.

You're right, it was remarkably ugly to extract that nested array. IMHO, path=....data should have copied the whole array of arrays to the output field, and it didn't; path=....data{}{} just dumped them all into a single-dimensional array and lost which was which. So, here's your code.
| makeresults |eval _raw="{ \"9\": {\"series\": [ {\"label\": \"content\" , \"data\": [ [ 1493673985000, 10 ] , [ 1493673990000, 11 ] , [ 1493673995000, 12 ] ] }, {\"label\": \"filters\" , \"data\": [ [ 1493673985000, 3 ] , [ 1493673990000, 1 ] , [ 1493673995000, 3 ] ] }, {\"label\": \"total\" , \"data\": [ [ 1493673985000, 14 ] , [ 1493673990000, 14 ] , [ 1493673995000, 17 ] ] } ] } }" | eval junk="{\"data\": [ [ 1493673985000, 10 ] , [ 1493673990000, 10 ] , [ 1493673995000, 10 ] ] }"
| rename COMMENT as "The above just enters your sample data"
| spath input=_raw output=series path=9.series{}
| table series
| mvexpand series
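| rename COMMENT as "Each row now holds one series object - pull out its label and the parallel time and value arrays"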
| spath input=series output=label path=label
| spath input=series output=datatime path=data{}{0}
| spath input=series output=datavalue path=data{}{1}
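| rename COMMENT as "mvzip pairs each timestamp with its value and mvexpand then splits the pairs into separate rows"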
| eval values=mvzip(datatime,datavalue,"=")
| mvexpand values
| rex field=values "(?<time>[^=]+)=(?<value>\d+)"
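| rename COMMENT as "The timestamps are epoch milliseconds so divide by 1000 to get _time in seconds"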
| eval _time = time/1000
| table _time label value
| timechart span=5s avg(value) by label
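A follow-up note for running this against the real API responses rather than the makeresults sample: the same pipeline should apply once the sample-data lines are dropped. Here's a minimal sketch, assuming the JSON payloads are indexed and reachable as index=my_api_data sourcetype=my_api_json (both names are placeholders for wherever your data actually lands):

index=my_api_data sourcetype=my_api_json
| rename COMMENT as "index and sourcetype above are placeholders - point the search at wherever the API responses are indexed"
| spath input=_raw output=series path=9.series{}
| mvexpand series
| spath input=series output=label path=label
| spath input=series output=datatime path=data{}{0}
| spath input=series output=datavalue path=data{}{1}
| eval values=mvzip(datatime,datavalue,"=")
| mvexpand values
| rex field=values "(?<time>[^=]+)=(?<value>\d+)"
| eval _time=time/1000
| timechart span=5s max(value) by label

The "=" delimiter passed to mvzip just needs to be a character that never appears in the timestamps or values, so the rex can split each pair back apart. max(value) matches the aggregation asked for in the question; avg works the same way.
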
I was so close and yet so far. This absolutely solved my issue, and since this is the format I get from multiple API calls, it solves a bunch of future issues as well. Thank you!!
