Hi all,
We were indexing a custom IIS web log until the end of February, when we started receiving "ERROR TailReader - Ran out of data while looking for end of header" in the forwarder's $splunk_home$\var\log\splunk\splunk.log file. We have since attempted every "fix" we could find online, and even created a new app from scratch, without success.
The UF on the web servers simply monitors a local log file. While this log file is custom, every line in it is structured the same way. In some rare instances the free text after the bracketed [ ] fields is a JSON statement over 10,000 characters long, which is why we have set TRUNCATE to 100k.
Below are the various conf files. I have left some of the attempted fixes in, commented out with #:
props.conf
[iis]
TRUNCATE = 100000
#FIELD_HEADER_REGEX = ^\[([^\]]+)\] \[(\d+)\] \[([^\]]+)\] \[([^\]]+)\] \[(\d+)\] \[([^\]]+)\] \[([^\]]+)\] \[(\d+)\] (.+)
#CHECK_FOR_HEADER = false
#SHOULD_LINEMERGE = true
inputs.conf
[monitor://D:\inetpub\wwwroot2\OMITTED\Logs\data.log]
index = ms_iis
sourcetype = iis
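For context, here is a sanitized, made-up sample of a line from this log; the real values differ, but the shape matches the commented-out FIELD_HEADER_REGEX above: eight bracketed fields followed by free text, which is occasionally a very large JSON blob.
[2023-02-27 14:05:11] [4512] [WEB01] [GET] [200] [/api/orders] [AppPoolUser] [143] {"orderId":"12345","items":[{"sku":"A1","qty":2}]}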
We still receive Windows event logs from this web server, so we know that the universal forwarder is working correctly.
Any help here would be greatly appreciated. I will also be opening a support ticket with Splunk and will post the resolution if/when I figure it out.
Thanks!

We figured out the problem with the help of Splunk Support. It turns out that sourcetype = iis was the issue: that built-in sourcetype looks for the predefined IIS headers and other information, which didn't match our custom logs. Changing the sourcetype to anything else, e.g. sourcetype = iis_custom in inputs.conf and a matching [iis_custom] stanza in props.conf, allowed the data to be parsed as is.
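For anyone following along, the adjusted stanzas end up looking roughly like this (iis_custom is just an example name; keep whatever other settings you need, such as the TRUNCATE above):
inputs.conf
[monitor://D:\inetpub\wwwroot2\OMITTED\Logs\data.log]
index = ms_iis
sourcetype = iis_custom
props.conf
[iis_custom]
TRUNCATE = 100000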
Hope this helps someone else!
Cheers

You said you have a custom log being generated. Does your custom log have header lines at the start of the file, the way a standard IIS log does?
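For reference, a standard IIS log in W3C format normally begins with header directives like the ones below (the exact #Fields list depends on the logging settings on the server):
#Software: Microsoft Internet Information Services 10.0
#Version: 1.0
#Date: 2023-02-27 00:00:00
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status time-taken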

Thank you SO MUCH! This helped, and your answer saved me a lot of time by not having to open a support case. Oddly, my IIS logs were not custom, but changing the sourcetype to iis_2 and creating a custom extraction worked.
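For anyone who lands here later: the general shape of a search-time field extraction for a custom sourcetype in props.conf is something like the sketch below. The stanza name matches the sourcetype, and the regex is only a placeholder based on the W3C field order at the end of a standard IIS line; adjust it to your own log layout.
props.conf (search time)
[iis_2]
# example only: pull the HTTP status and time-taken from the end of a W3C-style line
EXTRACT-status_time = (?<sc_status>\d{3})\s+\d+\s+\d+\s+(?<time_taken>\d+)$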
