Splunk Search

Combine inputcsv and search to create a gauge

ccsfdave
Builder

I am trying to create a gauge where the green, yellow, and red ranges are dynamically adjusted using averages and percentages for similar traffic over the past 30 days. The 30-day search takes a while, so what I would like to do is run it overnight and dump the results to a CSV. I would then run today's numbers on the fly and apply the result (the needle in the gauge) against the CSV results.
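For the overnight job, one option (a sketch only, assuming a scheduled search and hypothetical field names chosen to match the CSV columns used below) is to compute the daily counts over 30 days, summarize them into a single row, and write that row out with outputcsv:

```
index=msad "EventCode=4624" earliest=-30d@d latest=@d
| bin _time span=1d
| stats count as daily by _time
| stats avg(daily) as average perc80(daily) as Eighty perc95(daily) as "Ninety-Five"
| outputcsv YsForGauge.csv
```

Since this variant writes only one summary row, a plain `|inputcsv YsForGauge.csv` would read it back without needing `start`/`max` options; if the CSV instead keeps one row per day (as the `start=29 max=1` in the search below suggests), the summary step would move into the daytime search.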

This is where I am with it:

index=msad "EventCode=4624"  earliest=-1d@min |stats count as today| append[|inputcsv start=29 max=1 YsForGauge.csv] | table today average Eighty Ninety-Five

This results in a table:

today     average       Eighty    Ninety-Five
892683
          1255.633333   2016      2257

So the results are in two rows of the resulting table. What I would like to do is replace the table with

gauge today 0 average Eighty Ninety-Five

But it is failing. Help?

1 Solution

ccsfdave
Builder

Figured it out: not append but appendcols puts the results in one row, and then gauge works!

index=msad "EventCode=4624" earliest=-1d@min |stats count as today| appendcols [|inputcsv start=29 max=1 YsForGauge.csv] | gauge today 0 average Eighty Ninety-Five
