
Nested Saved Searches

joshhenderson
Explorer

Hi,

I'm looking to have one saved search call another saved search (let's say search B calls search A). The reason for this is simply to format the results of saved search A in a different way (search B doesn't do much other than a stats count).

However, each time I attempt to do this, the server takes a long time trying to save it, and in most cases runs extremely slowly for a short period, but the save is never successful (before upgrading to 4.3.3 it would completely crash the server, requiring the machine to be restarted). The most common error I get (and the only one since upgrading to 4.3.3) is:

Encountered the following error while trying to save: Splunkd daemon is not responding: ('The read operation timed out',)



The following is an example of the searches I am attempting to save. I have been able to save the first one (which doesn't reference any other search), but the second one will not save:




Name: GENERIC_PROCESSES_NOT_RUNNING

host="$host$" wmi_type=LocalProcesses earliest=$timeframe$ [ | inputlookup $lookupfile$ | fields Name ] | stats count AS TimesDetected by Name | append [ | inputlookup $lookupfile$ ] | stats max(TimesDetected) as TimesDetected by Name | eval Severity=if(match(TimesDetected, "0"), "1", "0") | table Name Severity




Name: GENERIC_AREAERROR_PROCESSES_NOT_RUNNING

| savedsearch GENERIC_PROCESSES_NOT_RUNNING host="$host$" lookupfile="$lookupfile$" timeframe="$timeframe$" | eval Area=tostring("Processes") | table Area Severity | stats count(eval(Severity="1")) as Errors by Area



If anyone is interested, the lookup file looks something like:

Process   | TimesDetected
CcmExec   | 0
Idle      | 0
InoRPC    | 0
LoginUI   | 0
sqlservr  | 0
sqlwriter | 0
FakePrcs  | 0

That is just a sample one that I have used whilst testing.

I need some form of saved search so that the dashboards are less complex and I can easily create a dashboard for multiple (at least 10) servers without reusing large chunks of code; otherwise it will be a nightmare to maintain.

Thanks,
Josh.


Ayn
Legend

Not an answer to your specific problem with saved searches per se, but if one goal is to be able to manage and reuse different kinds of searches, have you considered using search macros for that?

http://docs.splunk.com/Documentation/Splunk/latest/User/CreateAndUseSearchMacros
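For example, a macro version of your first search might look something like this in macros.conf (the macro and argument names here are just placeholders I've made up):

[generic_processes_not_running(3)]
args = host, lookupfile, timeframe
definition = host="$host$" wmi_type=LocalProcesses earliest=$timeframe$ [ | inputlookup $lookupfile$ | fields Name ] | stats count AS TimesDetected by Name | append [ | inputlookup $lookupfile$ ] | stats max(TimesDetected) as TimesDetected by Name | eval Severity=if(match(TimesDetected, "0"), "1", "0") | table Name Severity

You could then call it from your dashboards or from the second search as `generic_processes_not_running(myhost, processes.csv, -24h)` (arguments are positional).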

Drainy
Champion

Well, except that this isn't a subsearch 🙂 A macro is expanded within the existing search query before execution, rather than being executed as a separate search.
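So, sticking with the hypothetical macro above, a search along the lines of

`generic_processes_not_running(myhost, processes.csv, -24h)` | eval Area="Processes" | stats count(eval(Severity="1")) as Errors by Area

just has the macro's definition text substituted in before it runs - the end result is the same single query you would get by writing it all out by hand, rather than one search dispatching another.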

dbryan
Path Finder

Good suggestion, but sometimes in cases like this the 'subsearch' itself is very expensive and it's not feasible to run it on demand.

Drainy
Champion

Have you looked through splunkd.log for any errors that occurred at around the same time? I have come across timeout errors before on machines under heavy load that aren't up to spec (although this could be something failing in splunkd, with splunkweb just left waiting for a reply that never comes).
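If your internal logs are being indexed, a quick way to check without leaving Splunk is something along the lines of

index=_internal sourcetype=splunkd (log_level=ERROR OR log_level=WARN) | stats count by component

(narrow the time range to around when you attempted the save); otherwise the file itself lives at $SPLUNK_HOME/var/log/splunk/splunkd.log.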
