Splunk Search

Run savedsearch from main search using lookup for arguments

BradOH
Path Finder

Hey community, another weird question.  We have scheduled reports which use dbxlookups to enrich the data for analysis.  Unfortunately, the reports are hitting resource/timeout limits and can't be run in subsearches.

To resolve this, we thought about running the initial report and collecting its results into a results index. The report would then spawn separate reports by region (from a lookup defining those regions) to break the work into manageable chunks. For example...

initial search ... | collect index=initial-results addinfo=false
| savedsearch report-by-region host-region=[ inputlookup host-regions.csv ]

Or would we need to use the map command in this case? Based on my research, it is unclear whether the saved searches would be constrained by the resource limits of the initial search.
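For what it's worth, a subsearch won't pass arguments to savedsearch that way; map is the usual way to fan out one search per lookup row. A hedged sketch, assuming the lookup column is named host_region (rename it first if it is actually host-region) and that the saved search declares a $host_region$ token:

```
| inputlookup host-regions.csv
| map maxsearches=10 search="| savedsearch report-by-region host_region=$host_region$"
```

Note that map's maxsearches defaults to 10, and each iteration runs as its own search but still counts against the invoking user's search quota, so it may not escape the resource limits you are hitting.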

Of course, we could also run the initial search at 1am, then the saved searches at 1:15 with logic to hold until the primary search completes, but this brings its own complexity.

Am I missing an obvious solution???


yuanliu
SplunkTrust

Like @livehybrid says, there are different ways to work around such timeouts.  One suggestion about using a summary index followed by saved searches: run the collect beforehand as one saved search, not as part of the pipe that runs multiple regional searches.  Your regional searches should then run against the summary index you named initial-results.

Whether this solves your timeout (or resource) limits will depend on what is hitting those limits in your initial setup.  In your statement you used "scheduled reports" in the plural, but I cannot relate your mock code to multiple reports.  Could you explain the initial setup: why do you think the use of dbxlookup is contributing to the problem, and what symptoms are you observing?  What is the expected outcome: one report broken down by region, or multiple reports, one for each region?  Your mock code also references a lookup named host-regions.csv.  Does this relate to the dbxlookup in your first statement?
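Concretely, that two-stage setup might look like the following sketch (the base search, sourcetype, and field names are placeholders, not your actual config; your dbxlookup enrichment would go before the collect). First, one scheduled search populates the summary index:

```
index=main sourcetype=web
| collect index=initial-results addinfo=false
```

Then each regional search reads from the summary index instead of re-running the expensive enrichment:

```
index=initial-results host_region="emea"
| stats count by host
```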


isoutamo
SplunkTrust

Don’t use map unless it’s the only choice and your dataset is small enough. With any real/bigger dataset it’s not a winning story 😞

As usual, it depends. If you are not limited to a tight schedule, I like to first write the data to a summary index or KV store (depending on the case) as a scheduled search. Then, after a couple of hours or whatever your safe timeframe is, run the real searches against that temporary data. You can even run them closer to the original search, as Splunk now has the command https://help.splunk.com/en/splunk-enterprise/search/spl-search-reference/10.2/search-commands/requir... which you can use to check that the earlier phase has finished successfully. Of course, you need some “magic” to tell it that.
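The linked command is require, which makes a search fail outright when the pipeline in front of it returns no results, rather than silently producing an empty report. A minimal sketch (index, field, and time window are assumptions) of a regional search that aborts if the summary data is not there yet:

```
index=initial-results host_region="emea" earliest=-2h
| require
| stats count by host
```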


livehybrid
SplunkTrust

Hi @BradOH 

Different people may have differing opinions here, and it really depends on the load, event counts, etc., but personally I would try to split it into discrete searches which can run on their own, as there are numerous limits.conf settings which could trip you up (silently...) if it ran as one. Given you've already hit limit/resource constraints, splitting up the search(es) reduces the risk of issues.

If the timing allows, run the first part, then run the next search a reasonable time later (depending on how urgently you need the info vs. the estimated runtime of the first part).
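One way to express that staggered timing is in savedsearches.conf, as in this hedged sketch (stanza names, cron times, and searches are all illustrative, not your actual config):

```
[initial-collect]
enableSched = 1
cron_schedule = 0 1 * * *
search = index=main sourcetype=web | collect index=initial-results addinfo=false

[report-emea]
enableSched = 1
cron_schedule = 15 1 * * *
search = index=initial-results host_region="emea" | stats count by host
```

The 15-minute gap here is a guess; pad it based on the observed runtime of the first search, or combine it with a require-style check so the regional reports fail loudly if the collect has not finished.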

