Splunk Search

Run an SPL search once and store the result for faster access


How can I run an SPL search once and store the result so it can be accessed faster next time?
For example, I need to analyze large logs every night and then, the next day, open my saved searches and dashboards quickly, without waiting for the queries to run over the data again.
In other words, every night, after analyzing the logs, Splunk should run the queries behind the saved searches and dashboards and store the output, so that the next day the results load and display quickly.

Any recommendation?


  1. Run the search once: ....your spl ... | outputlookup nnn.csv
  2. In the dashboard: | inputlookup nnn.csv
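To have step 1 run automatically every night, the search can be saved as a scheduled report. A minimal savedsearches.conf sketch (the stanza name, base search, and cron schedule below are placeholders, not from the thread):

```ini
[nightly_error_summary]
# Runs at 02:00 every night and refreshes the lookup file
search = index=your_index error* OR fail* | stats count by host | outputlookup nnn.csv
cron_schedule = 0 2 * * *
enableSched = 1
```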


1. Does the first command create a file? Where will it be stored?
2. How do I load the file in a dashboard?



1. Does the first command create a file? Where will it be stored?
| makeresults
| eval aa="1", bb="2", cc="3"
| outputlookup nnn.csv

The file is stored under: /opt/splunk/etc/apps/your_app/lookups

2. How do I load the file in a dashboard?


<table>
  <search>
    <query>| inputlookup nnn.csv</query>
  </search>
  <option name="count">30</option>
  <option name="drilldown">none</option>
  <option name="refresh.display">progressbar</option>
</table>

Path Finder

Save your search as a report, run it on your schedule, and use the loadjob command.

| loadjob savedsearch="admin:search:Savedsearch"

When running loadjob against a scheduled report, only the latest result set is returned.

You can additionally use this as a base search and run other queries on top of it if you are applying multiple transforming commands to the same dataset.
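As a sketch, that base-search pattern in Simple XML might look like this (the panel layout and saved search name are placeholders; post-process queries start without a leading pipe):

```xml
<dashboard>
  <!-- base search: loads the cached results of the scheduled report once -->
  <search id="base">
    <query>| loadjob savedsearch="admin:search:Savedsearch"</query>
  </search>
  <row>
    <panel>
      <table>
        <!-- post-process search: runs on the base results, not the raw data -->
        <search base="base">
          <query>search error* OR fail* | stats count by host</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```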


It still doesn't work:
|search error* OR fail* | loadjob savedsearch="admin:search:Savedsearch"

What is the last part of the command, "admin:search:Savedsearch"?


Path Finder
| loadjob savedsearch="YourUserID:App:SavedSearchname"

loadjob must be the first command in the search.

Why are you using:
|search error* OR fail* ?
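If the intent of that filter was to narrow the cached results, it can go after loadjob instead (the saved search name is the same placeholder as above):

```
| loadjob savedsearch="YourUserID:App:SavedSearchname"
| search error* OR fail*
```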

You can also try

| rex "(?<errorOrFail>error|fail)"
| eval errorOrFail=if(isnull(errorOrFail), "False", "True")
| search errorOrFail="True"

Or try with

index=yourindex error* OR fail*  

See if you can use a sourcetype or index filter, or extract some statistical data. I think you need to optimize your query: filter by index and zero in on the data you actually need.



You can use summary indexing: analyze the large dataset and write the output to a summary index, then point the report at the summary index so it runs faster, since it works on far less data than the raw events.
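A minimal sketch of the pattern (the index names, span, and saved-search name are placeholders). First, a nightly scheduled search with summary indexing enabled, writing pre-computed statistics to a summary index:

```
source=index error* OR fail* | sitimechart span=1h count
```

Then the report reads from the summary index instead of the raw events (summary-indexed events carry a search_name field identifying the scheduled search that wrote them):

```
index=summary search_name="nightly_error_summary" | timechart span=1h count
```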

For more info, see the Splunk documentation on summary indexing.



I tried the commands (sichart, sitimechart, sistats, sitop, and sirare), but they don't work.

Below is my SPL; I visualize it with a "single value" pivot:
source=index | search error* OR fail*

For example, I tried these:
source=index | sistats search error* OR fail*
source=index | collect search error* OR fail*

Any idea?


Ultra Champion

first search:

source=index error* OR fail* | collect index=error_index

second search:

index=error_index as_you_like

Once you create the summary index, the next time you search it, the results come back faster.

I think a scheduled report is enough for your use case.


Ultra Champion


check this.
