Splunk Enterprise

How do I read the values from a dashboard table into a JavaScript object using JavaScript?

andrewtrobec
Motivator

Hello!

My objective is to read the values of a Splunk table visualization from a dashboard into a JavaScript object for further processing.  I'm not sure what kind of object yet, but my main issue lies with iterating through the table and extracting the cell values.

Can anybody provide some sample JS code for identifying the table object and iterating through its values?

Thanks!

Andrew

1 Solution

jcraumer
Explorer

I would probably have done the search parsing in Python, but it can be done with JavaScript.

Create a blocking search, which waits until the search is finished, then reference the job object's results.  The results can then be mapped to a list or dictionary for use.  Once you have completed the tasks associated with the search values, the output can be rendered back to the dashboard.

 

require([
    "splunkjs/splunk"
], function(splunkjs) {
    // Service that proxies REST calls through Splunk Web
    var splunkWebHttp = new splunkjs.SplunkWebHttp();
    var service = new splunkjs.Service(splunkWebHttp);

    var searchQuery = "search index=_internal | head 5";
    var searchParams = { exec_mode: "blocking" };

    service.search(searchQuery, searchParams, function(err, job) {
        // exec_mode=blocking: the callback fires once the job is done
        job.fetch(function(err) {
            console.log("Job ID: " + job.sid);
            console.log("Result count: " + job.properties().resultCount);

            // iterate over the results row by row
            job.results({}, function(err, results) {
                var fields = results.fields;
                var rows = results.rows;
                for (var i = 0; i < rows.length; i++) {
                    var values = rows[i];
                    console.log("Row " + i + ": ");
                    for (var j = 0; j < values.length; j++) {
                        console.log("  " + fields[j] + ": " + values[j]);
                    }
                }
            });
        });
    });
});

 
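To turn the parallel fields/rows arrays into something easier to work with, each row can be mapped to a plain object keyed by field name.  A minimal sketch (rowsToObjects is a hypothetical helper; results is the object passed to the job.results callback above):

// Hypothetical helper: convert {fields, rows} into an array of objects
function rowsToObjects(results) {
    var fields = results.fields;
    return results.rows.map(function(row) {
        var record = {};
        for (var j = 0; j < fields.length; j++) {
            record[fields[j]] = row[j];
        }
        return record;
    });
}

// e.g. inside the job.results callback:
// var records = rowsToObjects(results);   // [{field1: value1, field2: value2, ...}, ...]

If the table is driven by a search that already exists on the dashboard, the same fields/rows structure can also be read from that search's manager through the Splunk Web Framework instead of running a second search.  A sketch, assuming the dashboard search has id "search1":

require([
    "splunkjs/mvc",
    "splunkjs/mvc/simplexml/ready!"
], function(mvc) {
    // "search1" is an assumed id: use the id of the search that drives the table
    var manager = mvc.Components.get("search1");
    var resultsModel = manager.data("results", { output_mode: "json_rows", count: 0 });

    resultsModel.on("data", function() {
        var data = resultsModel.data();   // { fields: [...], rows: [...] }
        console.log(data.fields);
        console.log(data.rows);
    });
});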


andrewtrobec
Motivator

@jcraumer Thanks for this example, exactly what I need!
