Getting Data In

Windows LightFwdr NOT forwarding IIS logs !?

john_loch
Explorer

I presently have 4 Windows boxes light-forwarding to a Linux indexer. The forwarder is configured to forward IIS logs; however, it forwards the first line and then stops.

The problem is that a Windows bug causes the 'modified' timestamp not to change (i.e. it remains the same as the creation date) even though the relevant process continues writing to the log file.

This is a known issue, for which a flag (alwaysOpenFile = 1) existed in previous versions. It appears to no longer be supported... so how do I force the forwarder to keep interrogating the logs?

I need an answer to this ASAP.

Thanks.

cnk
Path Finder

The tailing processor was rewritten in 4.1.x (or 4.x.x), and it should now handle Windows IIS logs just fine. I had nothing but problems with 3.4.x forwarders, but since upgrading to 4.1.3 everything looks good.

I use 4.1.3 forwarders and the following settings:

inputs.conf

[monitor://C:\WINDOWS\system32\LogFiles\W3SVC1\ex100902.log]
sourcetype=iis
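# crcSalt=<SOURCE> folds the file's full path into Splunk's initial CRC check,
# so log files that begin with identical IIS header lines are not mistaken for
# files that have already been indexed.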
crcSalt=<SOURCE>

props.conf

[iis]
KV_MODE = none
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TZ = GMT
CHECK_FOR_HEADER = False

Yes, I only monitor the log for the current day. I use some cron foo on the deployment server to update inputs.conf shortly after midnight and then reload the deployment server to push the updated bundle to the clients; a sketch of what that nightly job might look like is below. This was done under 3.4.x to try to help with some lag issues. Now that I'm on 4.1.3 I should really try monitoring the whole folder again.
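For illustration only, here is a minimal sketch of that nightly job in Python. The deployment-app path, the Splunk install location, and the credentials are all assumptions rather than details from the thread; the ex<yymmdd>.log naming follows the stanza above.

#!/usr/bin/env python
# Hypothetical nightly cron job: rewrite the deployed inputs.conf so it monitors
# only today's IIS log, then reload the deployment server to push the change.
import datetime
import subprocess

# Assumed paths -- adjust to your own deployment app layout and Splunk install.
APP_INPUTS = "/opt/splunk/etc/deployment-apps/iis_inputs/local/inputs.conf"
SPLUNK_BIN = "/opt/splunk/bin/splunk"

def write_inputs():
    # IIS daily logs are named ex<yymmdd>.log, e.g. ex100902.log for 2010-09-02.
    today = datetime.date.today().strftime("%y%m%d")
    stanza = (
        "[monitor://C:\\WINDOWS\\system32\\LogFiles\\W3SVC1\\ex{0}.log]\n"
        "sourcetype=iis\n"
        "crcSalt=<SOURCE>\n"
    ).format(today)
    with open(APP_INPUTS, "w") as f:
        f.write(stanza)

def reload_deploy_server():
    # "splunk reload deploy-server" tells the deployment server to re-read its
    # apps and push updated bundles to clients; credentials here are placeholders.
    subprocess.call([SPLUNK_BIN, "reload", "deploy-server", "-auth", "admin:changeme"])

if __name__ == "__main__":
    write_inputs()
    reload_deploy_server()

Scheduled from cron a few minutes after midnight, this reproduces the "update inputs.conf, then reload the deployment server" flow described above.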

john_loch
Explorer

Oh, I should also point out that I am running the 4.1.2 Light Forwarder and monitoring the W3SVC1 folder. Thanks.

john_loch
Explorer

This is likely the problem I face. As the update timestamp is not changing when it should, the processor thinks the files haven't changed and all but abandons them, which lands me back where I began. How do I force the processor to interrogate the logs?

Thanks

gkanapathy
Splunk Employee

On 4.1.x, you should be fine monitoring the whole folder without trouble. The old processor would check every file for updates, so lots of files (even ones from previous days) would slow things down a lot. The new monitor backs off of files that have not been updated recently and checks them less and less often, so they don't have a significant performance impact as they age.
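For reference, a whole-folder monitor along those lines might look something like the stanza below. The path, sourcetype, and crcSalt come from the config earlier in the thread; the whitelist line is an assumption added for illustration, to keep the monitor to .log files only.

inputs.conf

[monitor://C:\WINDOWS\system32\LogFiles\W3SVC1]
sourcetype=iis
crcSalt=<SOURCE>
# whitelist is a regex matched against the full path; restricting to .log is an assumption
whitelist=\.log$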
