Alerting

Why doesn't Splunk> show an alert symbol when search.log contains ERROR and the results are obviously wrong?

marcokrueger
Path Finder

We use Splunk> 6.4.4 and sometimes run memory-intensive searches in the web app.
After wondering why the results were obviously wrong, I had a look at the search.log and found an error saying that the results may be incomplete:

    02-15-2017 09:56:05.815 ERROR StatsProcessor - Reached limit max_mem_usage_mb (10240 MB), results may be incomplete! Please increase the max_mem_usage_mb in limits.conf.

I think it is a disastrous strategy to suggest to the user that all is fine while Splunk> obviously knows the results are incomplete.
Why does Splunk> give me just a green sign next to the job pulldown and not an alert symbol? Can I configure this behaviour, or is it a bug?

best regards
Marco

muebel
SplunkTrust

Hi marcokrueger, as lguinn mentioned, I don't believe there is any mechanism to adjust the warning symbols. Nothing at least that I'd expect to be supported.

However, to address the initial concern: I expect that in most cases the searches are fine, and you shouldn't be worried about incomplete results. As limits.conf describes, searches that exceed this limit spill to disk. This can hurt search performance, but the results should still be complete. The exception is heavy use of the mvexpand command, which can lead to truncated results.
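For reference, here is a minimal sketch of the limits.conf change the error message suggests. The file path and the [default] stanza are how this setting is normally placed, but the value shown is purely illustrative, not a recommendation for your hardware:

    # $SPLUNK_HOME/etc/system/local/limits.conf  (illustrative sketch only)
    [default]
    # Maximum memory (in MB) a search process may use for in-memory
    # operations such as stats/eventstats before spilling or truncating.
    # 20480 is an example value, not a tuned recommendation.
    max_mem_usage_mb = 20480

A restart (or at least re-running the search) is needed for the new limit to take effect.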

Please let me know if this answers your question! 😄

0 Karma

marcokrueger
Path Finder

Hi muebel,
thank you for the answer. The affected queries don't use mvexpand, but they do use eventstats to enrich existing events with additional fields. Performance doesn't matter in these cases, and the searches do complete. The wrong results are obvious when the new fields are shown directly in the output and you can see that some of them are missing, but if the user runs further statistics over these values, the error is masked and the user may trust completely wrong data.
Without doubt, this behaviour is fully described in the documentation of eventstats, but that doesn't ease the pain of trusting wrong data, because there was no warning that the memory limit was reached. I can't expect users to dig through the search.log for errors; they just want to trust the warning symbol.
To prevent this I will increase the limit, but tomorrow the next user will come around the corner and exhaust the new limit, so I suggest that a future version of Splunk> shows a warning when eventstats runs out of memory and stops adding the requested fields to the search results.
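To make the failure mode concrete, here is a hypothetical search of the shape described above (the index, sourcetype, and field names are made up, not from my actual environment):

    index=web sourcetype=access_combined
    | eventstats avg(response_time) AS avg_rt_by_host BY host
    | stats count BY avg_rt_by_host

If eventstats hits max_mem_usage_mb, avg_rt_by_host is simply missing on some events, and the final stats silently produces numbers from incomplete data, with the job still showing green.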

Best regards
Marco

0 Karma

lguinn2
Legend

It is not a bug.

0 Karma

marcokrueger
Path Finder

So it's a feature?
My pain is having to explain to users that all their results may be corrupt or may be okay... depending on the day of the week or their luck...

0 Karma

lguinn2
Legend

I think you should explain to users that they should take note of the information button when it appears. Sadly, there is no way AFAIK to change the color or symbol.
If such large searches are common, perhaps you should make the suggested change to limits.conf.
Also, do these search errors (or related messages) appear in any logs other than the search.log? The search.log is transient and not collected into the _internal index. If a message appears in either the _internal or _audit index when this occurs, you could set up an alert to detect it...
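If such a message does turn up in _internal (that is an assumption you would need to verify in your environment, not something I have confirmed), a rough alert search could look like this:

    index=_internal sourcetype=splunkd "Reached limit max_mem_usage_mb"

Save it as a scheduled alert (e.g. every 15 minutes over the last 15 minutes) that triggers when the result count is greater than zero.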
Finally, you might post some of the offending searches on this forum. There are many people here who are experienced at optimizing searches to reduce resource usage.

0 Karma