Getting Data In

Data stops getting indexed for a couple of hours

Paul1896
Path Finder

Hello,

We have problems with some log files that randomly don't get indexed for a couple of hours. There is no log rotation during this time period, and sometimes indexing resumes without even a restart of the Splunk forwarder.

Output splunkd.log filtered for affected "audit_log":

02-12-2019 12:09:57.358 +0100 DEBUG ChunkedLBProcessor - Chunked Line Breaker Processing has been disabled for for sourcetype::audit_log
02-12-2019 12:09:57.358 +0100 INFO  UTF8Processor - Converting using CHARSET="UTF-8" for conf "source::/xxx/audit.log|host::xxx|Haudit_log|339419"
02-12-2019 06:33:16.783 +0100 INFO  S2SSender - Abandoning channel with code=2, conf="source::/xxx/audit.log|host::xxx|audit_log|339419", unique_id=422585, last_touched=1549948674, last_touched_asctime="Tue Feb 12 06:17:54 2019", age=922281, ttl=300000
02-12-2019 06:17:54.985 +0100 INFO  Metrics - group=per_sourcetype_thruput, series="audit_log", kbps=0.06303521503133032, eps=0.5483843194729626, kb=1.9541015625, ev=17, avg_age=0.6470588235294118, max_age=5
02-12-2019 06:17:54.503 +0100 DEBUG TcpOutputProc - Pushed eventId=61 on chanId=422585 to back of tcp client (tcp output) queue. source:source::/xxx/audit.log|host::xxx|audit_log|339419
0 Karma
1 Solution

Paul1896
Path Finder

We've probably figured out the root cause. Last week this instance indexed 20 GB within one day, and the forwarder has a configured throughput limit of 256 KB/s. So it was only delayed indexing, not an "outage" of the forwarder itself.
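A quick sanity check of that explanation (assuming the 256 KB figure is the default maxKBps = 256 throughput limit in the forwarder's limits.conf, and that GB here means 1024**2 KB -- both assumptions, not stated in the thread):

```python
# Hedged back-of-the-envelope: how long a 256 KB/s forwarder cap
# takes to drain a 20 GB backlog.
backlog_kb = 20 * 1024 * 1024   # 20 GB indexed in one day, expressed in KB
max_kbps = 256                  # default maxKBps in limits.conf [thruput]
drain_hours = backlog_kb / max_kbps / 3600
print(f"worst-case drain time: {drain_hours:.1f} hours")  # → roughly 22.8 hours
```

So a multi-hour indexing delay is exactly what that limit predicts; if the delay is unacceptable, maxKBps in the [thruput] stanza of limits.conf can be raised.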


0 Karma


woodcock
Esteemed Legend

That would definitely do it. Please click Accept here to close the question.

0 Karma

woodcock
Esteemed Legend

What version of Splunk are you using? There are some v6 releases that have big problems like this:

https://answers.splunk.com/answers/549663/splunk-661-stops-monitoring-files.html#answer-718797

0 Karma

Paul1896
Path Finder

We're using Splunk 7.1.5

0 Karma

woodcock
Esteemed Legend

If there is log rotation and you do not have some kind of housekeeping set up to delete the older files, the Splunk forwarder will get slower and slower. Once you hit thousands of files in the same directory, Splunk will seem to stop forwarding completely. You have to keep the number of dead files low.
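One hedged sketch of such housekeeping (the directory and the audit.log-* filename pattern are assumptions, not from this thread):

```shell
# Delete rotated log copies older than 7 days so the monitored
# directory never accumulates thousands of dead files.
cleanup_rotated_logs() {
    dir="$1"
    # -maxdepth 1: stay in this directory only
    # -mtime +7: files last modified more than 7 days ago
    find "$dir" -maxdepth 1 -type f -name 'audit.log-*' -mtime +7 -delete
}
```

Run from cron (e.g. daily) against the log directory, with the age threshold tuned to your rotation policy.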

0 Karma

Paul1896
Path Finder

Hey @woodcock, thanks for your reply! We have housekeeping in place, and the monitor doesn't match the older logs that have already been rotated. So in my understanding, the size and number of files in the folder itself shouldn't matter if your monitor only indexes one specified log file in it. Please correct me if I'm wrong. We also face the problem only from time to time, and after a while or a reboot the logging works without any problems.

0 Karma

ddrillic
Ultra Champion

@Paul1896 - about how many files are we talking about on this path?

0 Karma

Paul1896
Path Finder

@ddrillic We're talking about just 220 files.

0 Karma

ddrillic
Ultra Champion

If that's the case, the number of files couldn't be the issue.

0 Karma

woodcock
Esteemed Legend

It can if the depth is not limited. Are there any ... (recursive wildcards) in the monitored path? Are there hundreds of other potential path points with many files in them? This can still be the problem!

0 Karma

woodcock
Esteemed Legend

Your understanding is completely wrong. Even though the rotated files do not match the monitor stanza, they still have a deadly impact on the forwarder: Splunk still has to sort through them, and when they pile up, performance craters. Restarting Splunk makes it work for a short spurt before it goes right back to poor performance. Here is an easy way to fix it:

https://answers.splunk.com/answers/309910/how-to-monitor-a-folder-for-newest-files-only-file.html
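The linked answer boils down to limiting what the monitor has to track. A hedged inputs.conf sketch of that idea (the path, pattern, and threshold are illustrative, not from this thread; whitelist and ignoreOlderThan are documented inputs.conf settings):

```ini
# Illustrative monitor stanza -- path is hypothetical.
[monitor:///var/log/audit]
sourcetype = audit_log
# Only track the live file, not rotated copies.
whitelist = audit\.log$
# Stop tracking anything not modified in the last 2 days.
ignoreOlderThan = 2d
```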

0 Karma