Hello,
I am attempting to use a CSV file, via inputlookup, as the base search in Splunk IT Service Intelligence (ITSI). The search runs fine in the Base Search Editor:
| inputlookup lookup_assets.csv | fields public_table
and I get around 100 returns such as:
public_table
Asset1
Asset2
Asset3
...
Asset93
For the next step, I add public_table as a metric for a distinct count, but I get no results when I attempt to set the thresholds.
Question: is using an inputlookup table in this manner valid? If so, what am I doing incorrectly?
Many thanks.
I will answer my own question...
The following will actually work as a search for a KPI...
| inputlookup lookup_assets.csv | stats dc(public_table) AS CriticalApps | eval _time = now()
But after getting some more information from the client, this is not an efficient method for a KPI that will be executed every 5 minutes. The lookup table is used in further calculations for a KPI that gathers more information, so the best way to display this information is as an ad hoc widget in a glass table.
So yes, it can be done; no, it is not the best way of doing things if the data is only going to be used for visual information via a Glass Table in ITSI.
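For anyone going the Glass Table route instead: an ad hoc search widget can run the same lookup directly, without the KPI machinery. A minimal sketch (assuming the same lookup_assets.csv and public_table field as above; the "status" thresholds here are illustrative, not from the original post):

```
| inputlookup lookup_assets.csv
| stats dc(public_table) AS CriticalApps
| eval status = case(CriticalApps >= 90, "normal", CriticalApps >= 50, "medium", true(), "critical")
```

Because an ad hoc widget search is only evaluated when the glass table is viewed, this avoids rerunning the lookup on the KPI's 5-minute schedule. Note that the `eval _time = now()` step from the KPI version is only needed there because ITSI expects a timestamp on KPI results; an ad hoc single-value widget does not require it.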