Splunk Search

Why does the Splunk search result count diminish during a search?

JLTsx
Loves-to-Learn Lots

Hey,

When running a query, the number of results found diminishes over time. Pagination has no influence (tried 10, 50, 100), and the behavior seems to be triggered somewhere in the index. Meaning, if I search all events for today, the count climbs to a certain number of events, stalls for about 10 seconds, continues climbing, and then starts diminishing.

When setting the time range to a few hours back, this behavior also happens. Since the search takes quite long to hit this "point of no return", I assume the two time frames overlap and the same events are causing this diminishing of results.

Does this suggest there is something wrong in the index?


Br,

JLT



JLTsx
Loves-to-Learn Lots

This issue was worked around by rewriting the query.


JLTsx
Loves-to-Learn Lots

Thanks for sharing. I'm not in the habit of sharing queries. This one in particular was written by a third party, so I'm not sure I can legally share it in whole.

The mention of limits led me to test with 8-hour windows instead of 24 hours, and those queries did return their results without any being removed.

The query looks like this:

index=ind
| where (cidrmatch(....,src) OR cidrmatch(...,src)) OR (cidrmatch(...,dest) OR cidrmatch(...,dest)) AND src!="<IP>"
| stats dc(dest) as dcounter, values(dest) as dip by src, dest_port
| where dcounter >= 500
| eval nowtime = strftime(now(),"%d/%b/%Y:%H:%M:%S")
| eval tip = mvindex(dip,0)
| eval alerting = "scan" . dcounter . "port" . dest_port
| table nowtime dip tip alerting
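
One thing worth noting: SPL evaluates AND before OR, so as written the src!="<IP>" exclusion only applies to the dest branch of the where clause. If the intent was to exclude that src from all matches, the condition would need explicit parentheses. A minimal sketch of that shape, with hypothetical CIDR ranges standing in for the elided ones:

index=ind
| where (cidrmatch("10.0.0.0/8", src) OR cidrmatch("192.168.0.0/16", src) OR cidrmatch("10.0.0.0/8", dest) OR cidrmatch("192.168.0.0/16", dest)) AND src!="<IP>"
| stats dc(dest) as dcounter, values(dest) as dip by src, dest_port
| where dcounter >= 500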

While writing alternatives to this query, I found that a long "| stats" does not matter too much, but adding more keys, such as user, to the by-clause ("... as <name> by <key1>,<key2>,<key3>") does cause the same issue.
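
If a limit is what is being hit, the usual suspects for stats live in limits.conf. A sketch of the relevant stanza, with default values as I remember them (they may differ between versions):

[stats]
# maximum number of rows a stats search can return
maxresultrows = 50000
# maximum number of distinct values kept per field by values() and list()
maxvalues = 10000
# maximum size, in bytes, of any single value kept by values() and list()
maxvaluesize = 1000

Adding more by-keys multiplies the number of result rows, which would be consistent with extra keys tripping the limit sooner.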

I encountered this a few years ago but was not allowed to keep the notes on how to fix it.

ITWhisperer
SplunkTrust

You could run your search over shorter periods of time and save the results to a summary index. Then you can search the summary index across those sets of results and combine them as if the search had been run over the original index for the whole time range.
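
A minimal sketch of that approach, assuming a hypothetical summary index called my_summary and keeping per-destination rows so that dc(dest) still deduplicates correctly across windows:

Scheduled search, run e.g. hourly over the previous hour:

index=ind src!="<IP>"
| stats count by src, dest, dest_port
| collect index=my_summary

Search over the whole day:

index=my_summary
| stats dc(dest) as dcounter, values(dest) as dip by src, dest_port
| where dcounter >= 500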


JLTsx
Loves-to-Learn Lots

Thanks, I fixed it by not using eval, just stats and where clauses; no limits are hit now. The query also appears to run (much?) faster.
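
A minimal sketch of that eval-free shape, with a hypothetical CIDR range standing in for the real ones and the strftime/mvindex/alerting decoration dropped:

index=ind src!="<IP>"
| where cidrmatch("10.0.0.0/8", src) OR cidrmatch("10.0.0.0/8", dest)
| stats dc(dest) as dcounter, values(dest) as dip by src, dest_port
| where dcounter >= 500

Moving the src exclusion into the base search rather than a where clause also lets the indexers discard those events early, which would explain the speedup.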


woodcock
Esteemed Legend

You didn't even give us the search SPL. I am sure that the problem is you are hitting limits, but there is no way to help because you told us nothing useful.


ITWhisperer
SplunkTrust
SplunkTrust

Please share the search causing the issue.
