Getting Data In

UF Data

Priya70
Explorer

kiran_panchavat
SplunkTrust

@Priya70  

It sounds like the UF might be hitting a resource bottleneck (CPU, memory, disk I/O, or handles) or the Windows Event Log channels may be overwhelmed. If the UF is forwarding to an indexer, intermittent network issues could also create backpressure and stall inputs.

I recommend checking $SPLUNK_HOME/var/log/splunk/splunkd.log for any warnings or errors around the time the data stops; that usually gives good clues as to whether the problem is resource-, input-, or connectivity-related.
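If the forwarder's internal logs are reaching your indexers, a search along these lines (a sketch; replace <UF> with the forwarder's host name) pulls out the warnings and errors by component, so you can see what was complaining during a stall:

index=_internal host=<UF> source=*splunkd.log* (log_level=WARN OR log_level=ERROR) | stats count by component | sort - count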

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!

sainag_splunk
Splunk Employee

Hi @Priya70, without seeing the actual splunkd.log entries during the stall periods it's hard to say for certain. However, based on your symptoms, the most likely cause is backpressure.

Why backpressure fits your pattern:

- High-volume classic logs (Application/Security/System) pause first
- Lower-volume custom channels (Cisco VPN) continue uninterrupted
- Multiple input types affected simultaneously (monitor, registry, scripted)
- Automatic recovery after queues drain

To confirm, check splunkd.log during the stall periods for (see the example search after this list):

- "queue is full" messages
- TCP connection errors to indexers
- Network timeout warnings
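A quick way to spot blocked queues is the forwarder's own metrics.log (a sketch; replace <UF> with the forwarder's host name):

index=_internal host=<UF> source=*metrics.log* group=queue blocked=true | stats count by name

If queue names show up here during the stall windows, data is backing up inside the forwarder rather than failing at the inputs.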

Other possibilities to rule out:

- Windows Event Log API resource exhaustion
- UF memory pressure (see the resource-usage search after this list)
- Windows Event Log service issues
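For memory pressure, if your UF forwards the _introspection index (included in the default forwarded-index whitelist on recent versions), a rough sketch like this shows splunkd's resource footprint over time:

index=_introspection host=<UF> component=PerProcess data.process=splunkd | timechart avg(data.pct_memory) AS pct_memory avg(data.pct_cpu) AS pct_cpu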

For a combined view of the forwarder's output activity and internal log messages, run this from the search head (replace <UF> with the forwarder's host name):

index=_internal host=<UF> (source=*metrics.log* OR source=*splunkd.log*) tcpout

Hope this helps narrow it down!

If this helps, Upvote!!!!
Together we make the Splunk Community stronger 

Priya70
Explorer

.


PickleRick
SplunkTrust

But if your data is destined for both output groups, then when one group blocks, the other one blocks as well.
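A rough illustration of why (a sketch with made-up group names, not your actual config): with a cloned-output setup like the one below, the UF pushes every event through a single output pipeline, so a stalled queue on either destination stalls delivery to both.

[tcpout]
defaultGroup = primary_indexers, dr_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

[tcpout:dr_indexers]
server = dr-idx1.example.com:9997
# Lossy workaround: drop events for this group after 30 seconds of blocking
# instead of stalling the pipeline (default is -1, i.e. block indefinitely).
# dropEventsOnQueueFull = 30

That drop-vs-block tradeoff is why the default behaviour is to stall everything until the slow group recovers.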


Priya70
Explorer

.
