Getting Data In

Splunk UF wineventlog monitoring is too slow

omerl
Path Finder

Hey,
I have around 30 Splunk Universal Forwarders in my environment, monitoring the local Event Log (Windows Server 2016).
Lately I noticed that a few forwarders are delayed / sending events too slowly.
I sniffed the traffic and noticed that roughly once every 20-30 seconds the forwarder sends a batch of around 3K events to the indexers, which is a very small amount of data, while the Event Log is generating far more events, far faster.

So the slow forwarding opened a gap of around 30 minutes in the data.
I tried increasing the queue sizes and setting thruput to unlimited. The server's performance seems fine - no high CPU or memory usage.

I looked at another server, which currently seems to send its events on time (and has a lot more events in its Event Log, yet is faster), and from sniffing the traffic it looks like that forwarder sends events almost every second - no ~20-second interval.
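
For reference, one way I quantify the lag per host is to compare event time with index time (a rough sketch; it assumes the events land in an index named wineventlog - adjust the index to your environment):

index=wineventlog earliest=-1h
| eval lag_sec = _indextime - _time
| stats avg(lag_sec) AS avg_lag, max(lag_sec) AS max_lag by host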

I tried forwarding to a different (test) environment, thinking the indexers might be receiving too many events from too many forwarders, but it doesn't seem to make any difference.

Also, the Splunk Universal Forwarders on these servers are all configured the same way, via a deployment server.

I wonder if any of you have had this issue, or can think of a possible cause of the problem.

Thanks!

ekbn
Loves-to-Learn

One of the issues I recently dealt with was a delay in sending Security channel logs on Active Directory domain controllers, which I finally resolved after a few days.
Here are the steps I took to fix the problem:

First, I investigated blocked queues in the different pipelines.
This search helps identify which queues are blocking, so you can fix them and reduce delays:
index=_internal host=* blocked=true
This way, you can check whether the issue is with the universal forwarder, the heavy forwarder, or a higher tier.
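
A slightly more targeted version of that search (a sketch based on the standard metrics.log queue events) breaks the blocked queues down by host and queue name:

index=_internal source=*metrics.log* group=queue blocked=true
| stats count by host, name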
I experienced this issue with both the UF and the HF. I increased the queue sizes and, along with the queue adjustment, added the following parameter in $SPLUNK_HOME/etc/system/local/server.conf:
parallelIngestionPipelines = 2
https://conf.splunk.com/files/2019/slides/FN1570.pdf
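
A minimal sketch of what that looked like in server.conf (the queue names and sizes below are example values, not a recommendation - tune them for your own environment):

# $SPLUNK_HOME/etc/system/local/server.conf
[general]
parallelIngestionPipelines = 2

# example of enlarging the in-memory queues
[queue=parsingQueue]
maxSize = 6MB

[queue=aggQueue]
maxSize = 6MB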

To remove the throughput cap on the ingestion pipeline, I modified the following parameter in limits.conf (0 means unlimited):

[thruput]
maxKBps = 0

https://community.splunk.com/t5/Getting-Data-In/How-can-we-improve-universal-forwarder-performance/m...

The final and most effective step was changing the following parameter in the UF's inputs.conf:
use_old_eventlog_api=true

If you have set evt_resolve_ad_obj=true to translate SIDs/GUIDs and the forwarder cannot perform the translation, it passes the task to the next domain controller and waits for a response before proceeding, which can cause delays. To fix this, I added:
evt_dc_name=localhost
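
Putting those inputs.conf changes together, the stanza for the Security channel looked roughly like this (a sketch; apply the same settings to whichever channels you monitor, and only keep evt_resolve_ad_obj if you actually need SID/GUID translation):

[WinEventLog://Security]
disabled = 0
# fall back to the legacy Event Log API, which was the most effective change for me
use_old_eventlog_api = true
# resolve SIDs/GUIDs against the local machine instead of waiting on remote DCs
evt_resolve_ad_obj = 1
evt_dc_name = localhost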

By implementing the above steps, logs were received and indexed in real time.
Thank you for taking the time to read this. I hope it helps you resolve similar issues.

0 Karma

harsmarvania57
Ultra Champion

Hi @omerl,

It's worth taking a look at https://answers.splunk.com/answers/686880/discrepancy-in-the-transfer-of-wineventlogsecurity.html if you are monitoring Windows Security event logs.

0 Karma

lakshman239
Influencer

This is a known issue. Please raise a case with Splunk Support to review your environment/configuration; they can suggest a solution or workaround. The link above is worth checking, but note that using use_old_eventlog_api = 1 changes the formatting of the Windows events, and the parsing/field extractions of the Splunk Add-on for Microsoft Windows don't work very well with it. You can evaluate this on your instance and share your findings with Support.

0 Karma

omerl
Path Finder

I tried updating the environment to version 7.2.3 and there is still no change. I'm trying to contact Support. In the meantime - @lakshman239, you said this is a known issue; do you know anyone who has had it / solved it / has a suggestion regarding it?

0 Karma

nvanderwalt_spl
Splunk Employee

See tcpSendBufSz in the outputs.conf spec file. Ideally, you should only adjust this setting if you are very familiar with TCP/IP, or you can ask the support person you are dealing with for a recommendation.
https://docs.splunk.com/Documentation/Splunk/7.2.3/Admin/Outputsconf
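
For illustration, tcpSendBufSz lives under the forwarder's [tcpout] stanza in outputs.conf; the value below is a placeholder in bytes, not a recommendation - get an appropriate value from Support:

# $SPLUNK_HOME/etc/system/local/outputs.conf
[tcpout]
# TCP send buffer size in bytes (placeholder value)
tcpSendBufSz = 1048576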

0 Karma

omerl
Path Finder

Setting it as suggested above made no difference.

0 Karma

omerl
Path Finder

Thanks, but the suggestions in the link don't seem to help. Is there another suggestion, or should I maybe try a different tool for forwarding? I'm trying the StreamSets edge data collector, and I wonder if you have a better tool for forwarding the Windows Event Log. Thanks!

0 Karma