I have an index populated with JSON data. One of the elements in the JSON is an array of key/value pairs. I want to create a dashboard that pulls all of the data in one search, populating one table with the non-array values. When a row in that table is selected, I want to populate a second table with the array of key/value pairs parsed out into individual rows. I have this sort of working by searching the original index again, but I would like to avoid that overhead since the key/value pairs were already included in the initial results.
Any ideas on how to pull this off?
Example of one record:
{
    "Header": {
        "EntryTimestamp": "2023-06-13T14:13:07.9316608Z",
        "EntryType": "Exception",
        "SubmittingUser": "TEST",
        "Message": "Call to Server end point failed. Unable to connect to the remote server.",
        "ApplicationName": "TestService",
        "Environment": "DEVELOPMENT"
    },
    "Items": [
        { "Key": "ExceptionTimeStamp", "Value": "2023-06-13T14:13:07.9266" },
        { "Key": "ExceptionSource\\MachineName", "Value": "TESTSERVER" },
        { "Key": "ExceptionSource\\ClassName", "Value": "Socket" },
        { "Key": "ExceptionSource\\NameSpace", "Value": "System.Net.Sockets" },
        { "Key": "ExceptionSource\\Method", "Value": "DoConnect" },
        { "Key": "RootExceptionType", "Value": "System.Net.Sockets.SocketException" },
        { "Key": "ExceptionID", "Value": "72c52379-433b-41a7-90aa-1900cfcc4c29" },
        { "Key": "Timeout", "Value": "115000" },
        { "Key": "Message", "Value": "Unable to connect to the remote server" }
    ],
    "Data": ""
}
The elements in the "Header" section would appear in one table. When a row is selected, the "Items" array would populate a second two-column table, with the first column being the keys and the second column the corresponding values.
The approach I started with was putting the Items array in a hidden column in the first table, but I am stuck on how to get that hidden value back into an array and use it to populate the second table.
You could save the results from the base search (job sid) and then load the results with loadjob in the panel searches.
I had thought of that as well, but the docs on base searches indicate that the base search should be a transforming search, and mine is not (yet?). I may need to inject a unique value into the base search to make each row unique (I thought the rows already were unique, but it turns out they are not) so I can use something like stats; a sketch of that idea is below. Let me play with that and see what I can work out. I'll let you know how it goes.
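For instance, one way to make the base search transforming without losing rows is to tag each event with a row number and then group by it. A sketch (the row_id field name is arbitrary):

<base search>
| streamstats count AS row_id
| stats list(*) AS * BY row_id

Using list() rather than values() matters here: list() keeps the multivalue Items fields in their original order with duplicates intact, so the Key/Value pairing survives the stats.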
You could use a base search, which both tables use, with the second table extending the search to filter based on a token set in the drilldown of the first table.
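Since the drilldown tokens later in this thread come from row.[Column Name].value, this appears to be a Dashboard Studio dashboard; there, the base/chain wiring looks roughly like this (a sketch with illustrative data source ids, a single token, and the queries elided):

"dataSources": {
    "ds_base": {
        "type": "ds.search",
        "options": {
            "query": "index=index_name ... | fields ..."
        }
    },
    "ds_items": {
        "type": "ds.chain",
        "options": {
            "extend": "ds_base",
            "query": "| where 'Header.EntryTimestamp'=$selectedTimestamp|s$ | table \"Items{}.Key\", \"Items{}.Value\""
        }
    }
}

and, on the first table's visualization, a drilldown handler that sets the token:

"eventHandlers": [
    {
        "type": "drilldown.setToken",
        "options": {
            "tokens": [
                { "token": "selectedTimestamp", "key": "row.Header.EntryTimestamp.value" }
            ]
        }
    }
]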
That sort of worked, though the behavior seems odd. It still takes several seconds to populate the second table; I would have thought it would be almost instant, since it is only filtering an already-returned data set of just 10 records. Here is the base search and the chain search if it helps:
Base search (there are several input fields on the dashboard that are used to populate the search parameters):
index=index_name "Header.EntryType"=$entryType|s$ sourcetype=$environment|s$ "Header.ApplicationName"=$appName|s$ "Header.SubmittingUser"=$userId|s$
| fields "Header.ApplicationName", "Header.EntryType", "Header.SubmittingUser", "Header.LoggingProcess.MachineName", "Header.LoggingProcess.AppDomain", "Header.LoggingProcess.Assembly", "Header.LoggingProcess.ClassName", "Header.LoggingProcess.Method", "Header.Message", "Header.EntryTimestamp", "Items", "Items{}.Key", "Items{}.Value", "Data"
Chain search (the selected* tokens are set from row.[Column Name].value when a row is clicked in the first table):
| where 'Header.EntryType'=$selectedEntryType|s$ AND 'Header.EntryTimestamp'=$selectedTimestamp|s$ AND 'Header.SubmittingUser'=$selectedSubmittingUser|s$ AND 'Header.ApplicationName'=$selectedApplicationName|s$ AND 'Header.LoggingProcess.MachineName'=$selectedMachineName|s$
| rename "Items{}.Key" AS "Keys", "Items{}.Value" AS "Values"
| eval Values=mvmap(Values, if(Values=="", "NULL", Values))
| table "Keys", "Values"
The other odd behavior, besides the few seconds it takes to load, is that if the job that populated the base search has expired, selecting a different row in the first table causes the chained search to re-run the base search. Maybe that's by design, but with two separate searches that are not chained, this doesn't happen: the second search runs solely from the tokens populated in the drilldown, so the data in the first table doesn't refresh just because the second search runs again.
Ideally, I could simply pull the Items data out of a hidden column in the first table, but that doesn't seem to be possible (yet?).
You could save the results from the base search (job sid) and then load the results with loadjob in the panel searches.
That's the ticket. I have a little more tweaking to do, but this appears to be the hint I needed. Thanks @ITWhisperer .
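A sketch of the final pattern, for completeness ($baseSid$ is an illustrative token holding the base search's job sid, captured however the dashboard framework exposes it): each dependent panel loads the stored results of the finished job and filters them, so nothing re-runs against the index:

| loadjob $baseSid$
| where 'Header.EntryType'=$selectedEntryType|s$ AND 'Header.EntryTimestamp'=$selectedTimestamp|s$
| rename "Items{}.Key" AS "Keys", "Items{}.Value" AS "Values"
| table "Keys", "Values"

Because loadjob reads a job's saved results rather than searching the index, selecting a new row only re-filters those results, and the first table is unaffected.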