I'm working on building a dashboard that takes a base report and parses it into different items that can be flagged for review. I've gotten this to work in a roundabout way, but one component seems to require that the base search be run again for each of the 10 panels (meaning 10 searches). I have tried using the weekly-run report as the primary data source and chaining the further refinement from there - by using |search in the chained searches - but it still runs the entire search again. The biggest problem is that this specific search can take upwards of 20 minutes to run successfully, meaning that I have 10 cores locked up for 20 minutes... Not ideal.
A way around this would be to run scheduled reports of this refined data, which is where I went next and would like to go - EXCEPT there is some dynamic data that I'm incorporating into the search. I have a dynamic CSV file that contains usernames of users that should be included in the top-level search query (index=production user IN (user-from-csv, user2-from-csv, etc.)). I can get this to work in the dashboard by storing the search results as a token (after having used inputlookup and format). I can't get this to work in the report, though. Does anybody know how to take a CSV file's contents and store them in a variable, OR run a sub-search and pass those results as a string later in the main search?
non-working view of what I would like to see (understanding that this isn't how Splunk works):
|eval included-users=inputlookup included-users.csv
index=production user IN (included-users) action=success
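For reference, the closest working SPL equivalent of the above uses a subsearch: inputlookup reads the CSV and the subsearch's results are expanded into the outer search as a filter. This is a sketch that assumes the lookup file has a column named user (the filename is taken from the question):

```
index=production action=success
    [| inputlookup included-users.csv
     | fields user
     | format ]
```

Because a subsearch's field=value results are automatically turned into an OR expression, the explicit | format is often optional, but it makes the expansion predictable. Since this is plain SPL with no dashboard tokens, it should also work inside a scheduled report.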
You can set a token in the done handler of the base search to save the job sid, and then use loadjob to retrieve the results (multiple times):
<search id="base">
<query>
``` your base search ```
</query>
<done>
<set token="basesid">$job.sid$</set>
</done>
</search>
<panel>
<table>
<search>
<query>
| loadjob $basesid$
``` further processing of results ```
</query>
</search>
</table>
</panel>
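As a side note, loadjob can also pull the most recent results of a scheduled report directly, which would fit the weekly-run report mentioned in the question. The owner, app, and report names below are placeholders:

```
| loadjob savedsearch="admin:search:My Weekly Report"
| where action="success"
```

This avoids re-running the expensive search at dashboard load time entirely, at the cost of the results only being as fresh as the last scheduled run.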
This is much easier to solve in Dashboard Studio. You can define a base search for the entire dashboard and add presentation and filtering as chain searches. See the documentation topic Chain searches together with a base search. The base search is performed only once. (You may be able to have it auto-refresh as in Simple XML, although I haven't checked.)
The Data use section in the intro to Dashboard Studio indicates that Simple XML can also use base and chain searches, but I cannot find documentation for it.
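For illustration, a Dashboard Studio source definition declares the base search once as a ds.search data source, and each panel's refinement as a ds.chain data source that extends it. This is a minimal sketch; the IDs and queries are placeholders:

```json
"dataSources": {
    "baseSearch": {
        "type": "ds.search",
        "options": {
            "query": "index=production action=success | stats count by user"
        }
    },
    "highVolumeUsers": {
        "type": "ds.chain",
        "options": {
            "extend": "baseSearch",
            "query": "| where count > 100"
        }
    }
}
```

Each panel then points its visualization at one of the chain data sources, so only the base search actually hits the indexers.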
You can use the <search> element to define the base search, then use a post-process search within a panel.
https://docs.splunk.com/Documentation/Splunk/9.1.0/Viz/PanelreferenceforSimplifiedXML#search
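A minimal Simple XML sketch of that pattern (the queries are placeholders; note that post-process searches work best when the base search is a transforming search such as stats):

```xml
<search id="base">
  <query>index=production action=success | stats count by user, action</query>
</search>
<panel>
  <table>
    <search base="base">
      <query>| where count > 10</query>
    </search>
  </table>
</panel>
```

The base search runs once, and each panel's search base="base" only post-processes its cached results.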
Hi @dwelbba00,
if the results to display in each panel are always the same, you could use a Post Process Search; in a few words: run the search once and display the results in multiple panels, possibly with additional filters or displaying only a subset of fields in each panel.
You can find more info on this at https://docs.splunk.com/Documentation/Splunk/9.1.0/Viz/Savedsearches#Post-process_searches_2 or by installing and using the Splunk Dashboard Examples App (https://splunkbase.splunk.com/app/1603).
Ciao.
Giuseppe
They aren't, unfortunately. I could schedule these searches to run during off-hours if I could find a way to incorporate the inputlookup as part of the query. That's the direction I'm going now. Is it possible to incorporate a CSV file into a query as parameters?
example:
index=production user IN (user1-from-csv,user2-from-csv,...)
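Another option that works inside a scheduled report is to filter after retrieval with the lookup command instead of expanding the CSV into the initial query. A sketch, assuming included-users.csv has a user column (is_included is a placeholder field name for the matched output):

```
index=production action=success
| lookup included-users.csv user OUTPUT user AS is_included
| where isnotnull(is_included)
```

This keeps the query text static, though it is less efficient than a subsearch-based filter because all events are retrieved before the lookup discards the unwanted users.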