I'm working on building a dashboard that takes a base report and parses it into different items that can be flagged for review. I've been able to get this working in a roundabout way, but one component seems to require that the base search be run again for each of the 10 panels (meaning 10 searches). I have tried using the weekly scheduled report as the primary data source and chaining further refinement from there - using |search in the chained searches - but it still runs the entire search again. The biggest problem is that this particular search can take upwards of 20 minutes to complete, meaning I have 10 cores locked up for 20 minutes... Not ideal.
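For context, the chained setup I'm describing is a Simple XML base search with post-process searches hanging off it; a rough sketch (the index, field, and panel contents here are placeholders, not my actual report):

```xml
<dashboard>
  <!-- base search: runs once, ideally from the weekly scheduled report -->
  <search id="base">
    <query>index=production action=success</query>
  </search>
  <row>
    <panel>
      <table>
        <!-- post-process search: intended to reuse the base results -->
        <search base="base">
          <query>| search category="placeholder_value" | stats count by user</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```

My understanding is that each panel should post-process the base results rather than re-dispatch the 20-minute search, but in practice I'm seeing the full search run per panel.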
A way around this would be to run scheduled reports of the refined data, which is where I went next and would like to stay - EXCEPT there is some dynamic data that I'm incorporating into the search. I have a dynamic CSV file containing usernames that should be included in the top-level search query, e.g. (index=production user IN (user-from-csv,user2-from-csv,etc)). I can get this to work in the dashboard by storing the search results in a token (after using inputlookup and format), but I can't get it to work in the report. Does anybody know how to take a CSV file's contents and store them in a variable, OR run a sub-search and pass its results as a string later in the main search?
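For reference, the dashboard-token approach that does work for me looks roughly like this (a sketch; it assumes the lookup file is included-users.csv with a user column):

```
| inputlookup included-users.csv
| fields user
| format
```

The resulting search field is a string like ( ( user="a" ) OR ( user="b" ) ), which I can store in a token and splice into each panel's search - but a scheduled report has no tokens to splice in.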
Non-working view of what I would like to see (understanding that this isn't how Splunk works):
|eval included-users=inputlookup included-users.csv index=production user IN (included-users) action=success
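The closest real-SPL equivalent I'm aware of is an inline subsearch, which Splunk evaluates first and expands into an OR list before the main search runs (again assuming a user column in the lookup):

```
index=production action=success [| inputlookup included-users.csv | fields user]
```

As I understand it, the bracketed subsearch is rewritten to something like (user="a" OR user="b") at dispatch time, so it shouldn't depend on dashboard tokens - but I'd appreciate confirmation that this behaves correctly inside a scheduled report.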