
Is there a way I can ensure the rest of the search does not run when the dbxquery fails in a subsearch?

briancronrath
Contributor

I have a saved search that periodically writes a lookup file for us, and it joins on data from a subsearch that uses the dbxquery command. We recently ran into DB connection issues where search heads were intermittently unable to connect to the database and run the query; when that happened, the search essentially destroyed our lookup file, because the rest of the search kept running even though the dbxquery had failed. Is there a way I can ensure the rest of the search does not run when the dbxquery in a subsearch fails?
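
For illustration, the shape of the search is roughly this (the index, field, connection, and lookup names below are placeholders, not our actual search):

index=orders sourcetype=order_events
| join type=inner customer_id
    [| dbxquery connection="my_db_connection" query="SELECT customer_id, customer_name FROM customers"]
| stats latest(customer_name) as customer_name by customer_id
| outputlookup customer_enrichment.csv

When the dbxquery subsearch fails, the rest of the pipeline still runs against empty join results, and the final outputlookup overwrites the lookup anyway.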

1 Solution

DalJeanis
SplunkTrust

In essence, you'll need to put another step in there.

Put the dbxquery results into a staging file, then only write the results from staging into the lookup if there is something in the staging file.
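
The scheduled search that actually runs dbxquery would then write only into the staging file, never directly into the real lookup. A minimal sketch, with the connection name, query, and field names as placeholders:

| dbxquery connection="my_db_connection" query="SELECT customer_id, customer_name FROM customers"
| table customer_id customer_name
| outputlookup append=f mystagingfile.csv

A second scheduled search then promotes the staging contents into the real lookup only when the staging file actually contains rows: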

| inputlookup myrealfile.csv
| eval filenum=0
| inputlookup append=t mystagingfile.csv 
| eval filenum=coalesce(filenum,1)
| eventstats max(filenum) as maxfile
| where filenum=maxfile  
| fields - filenum maxfile
| outputlookup append=f myrealfile.csv
| appendpipe 
    [| where false() 
     | outputlookup append=f mystagingfile.csv 
     ]

...or, if you prefer this style...

| inputlookup mystagingfile.csv 
| appendpipe 
    [ | stats count as newcount 
      | eval rectype=if(newcount>0,"killme","keepme") 
      | inputlookup append=true myrealfile.csv 
      | eventstats max(rectype) as rectype
      | where isnull(newcount) AND rectype="keepme"
      | fields - rectype
    ]
| outputlookup append=f myrealfile.csv
| appendpipe 
    [| where false() 
     | outputlookup append=f mystagingfile.csv 
     ]
