Splunk Search

How to hide search results until the search is complete?

pwilson
Explorer

I want no results of a search to display until the search has completed. The search I am running displays any users who do NOT have any logs found in the search, and I also have an alert that triggers when this search displays any results. The issue I am running into is that with long-running queries, the table can briefly show a user who simply hasn't been found yet, which triggers the alert before the search finishes. The search finishes with the correct results, but by then the damage is done and the alert has already been triggered.

 

I do not have the ability to modify the alert, just the search query.

 

The simplified search query:

search string
| eval tracked_users=split("userA,userB,userC",",")
| stats values(user) as user by tracked_users
| where NOT match(user,tracked_users)
| table tracked_users
| rename tracked_users as "Users not found"

 

Is there a way to hide the table until the search is complete? Or is there a better way to structure the query so that no results are displayed until the search is complete?

0 Karma

jdunlea
Contributor

One thing to check is whether there has been a delay in some of the data getting into the index. A delay in the data coming in may result in the alert running and not finding anything, thereby triggering your email alert action, even though the data does eventually arrive in Splunk after the alert has run.

 

For example, let's say you have an alert that looks across 5 mins of data, and it runs every 5 mins.

So the alert runs at 12:00 and checks for data from 11:55 - 12:00. When the search runs at 12:00 it finds no results, and therefore triggers your email alert action. However, at 12:01, some data timestamped 11:58 arrives in Splunk. At that point it is too late for the alert, as it already ran at 12:00, checked for data from 11:55 - 12:00, found no data, and hence triggered the email alert action. Now, let's say at 12:15 you go back and manually run the same search for 11:55 - 12:00; the search DOES find the data timestamped 11:58. That is because the data did eventually arrive in Splunk, but it only arrived at 12:01, which was too late for the alert to see it.
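
If the cause does turn out to be indexing lag like this, one common mitigation (just a sketch here, since I don't know your scheduling constraints, and the index and sourcetype below are placeholders) is to shift the search's time window back so late-arriving events have a chance to be indexed before the search looks for them:

index=your_index sourcetype=your_sourcetype earliest=-10m@m latest=-5m@m
    ``` search the 5-minute window that ended 5 minutes ago, so events indexed up to ~5 minutes late are still counted ```
| stats count by user

Since the alert configuration itself can't be modified in your case, putting the earliest/latest modifiers inside the search string, as above, may be the only place you can apply that offset.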

 

The way you can check whether there is a delay in your data is to run a basic event search (no stats or timechart, etc.) over any time range, and then add the following:

 

 

| eval index_delay=_indextime-_time 
| where index_delay>10
| eval index_time=_indextime 
| convert ctime(index_time)

 

 

Now you will see only the events that arrived in Splunk more than 10 seconds late. If you want to see data that was delayed by more than 30 seconds, change index_delay>10 to index_delay>30.

 

Additionally, you will have a human-readable time field called index_time, which is the time the event was actually indexed. If an event has an index_time that is AFTER the time your alert ran, then it's likely that your alert never saw that event.
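
To make that comparison a bit easier to eyeball, here is one possible way to lay the delayed events out per user (purely a sketch; the index, sourcetype, and user field name are placeholders for whatever your real search uses):

index=your_index sourcetype=your_sourcetype (user=userA OR user=userB OR user=userC)
| eval index_delay=_indextime-_time
| where index_delay>10
| eval event_time=strftime(_time, "%Y-%m-%d %H:%M:%S")
| eval index_time=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
| table user event_time index_time index_delay
| sort - index_delay
    ``` any row whose index_time falls after the alert's run time was invisible to that run ```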

 

I hope this helps!

0 Karma

jdunlea
Contributor

Is this being done with a standard alert and alert action in Splunk? The alert action should not be triggered until the search finishes, so I am surprised to hear that it's triggering before the search completes. 
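
If it is a standard scheduled alert, one way to confirm what each scheduled run actually returned (assuming you can read index=_internal; the saved search name below is a placeholder) is the scheduler log, which records a result count for every run:

index=_internal sourcetype=scheduler savedsearch_name="Your Alert Name"
    ``` result_count shows how many results the scheduler saw when each run finished ```
| table _time savedsearch_name status result_count run_time
| sort - _time

Comparing result_count for the runs that fired the email against the runs that did not should show whether the scheduler saw any results at completion time.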

0 Karma

pwilson
Explorer

I believe so, but I am not privy to the process so I can't confirm. The result briefly flashing is currently the chief theory as to the cause, but a definitive diagnosis has not been made. The alert is set to trigger when results > 0, yet when the email alert is received it has no results. Running the query over the exact timeframe of the received alerts also produced no results, but with the aforementioned flash of results before the query completed. I'm definitely open to other theories; this is a head-scratcher, but I can't think of any other reason why the alert email is produced with no results when results are required to trigger it. Thankfully it is rare at least, happening only about 5 times in roughly 1000 runs of the alert.

0 Karma