Getting Data In

How do I troubleshoot "ERROR TailingProcessor - Ran out of data while looking for end of header"

New Member

Can anybody please tell me what I need to do about the following error?

ERROR TailingProcessor - Ran out of data while looking for end of header 

I have tried reinstalling Splunk several times, but I don't know how to resolve this error. Please advise; it's urgent.


Splunk Employee

It looks like this was bug SPL-138909, "Ran out of data while looking for end of header - archive file", which has been fixed in the following releases:
6.5.6 +
6.6.4 +
7.0.2 +
7.1.0 +
7.2.0 +
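To see whether your instance already carries the fix, compare the output of `./splunk/bin/splunk version` against the list above. A rough sketch of that comparison in shell, assuming GNU `sort -V` is available; the minimum versions are copied from the fix list, and the sketch does not handle lines newer than 7.2 (which are all fixed):

```shell
# Returns success when the given Splunk version string is at or beyond the
# earliest fixed release in its own major.minor line.
is_fixed() {
    ver="$1"
    for min in 6.5.6 6.6.4 7.0.2 7.1.0 7.2.0; do
        # Only compare within the same major.minor line.
        [ "${ver%.*}" = "${min%.*}" ] || continue
        # sort -V orders version strings; if $min sorts first (or ties),
        # $ver is at least $min.
        [ "$(printf '%s\n%s\n' "$min" "$ver" | sort -V | head -n 1)" = "$min" ] && return 0
    done
    return 1
}

is_fixed 6.3.7 && echo "6.3.7: fixed" || echo "6.3.7: not fixed"
is_fixed 7.0.2 && echo "7.0.2: fixed" || echo "7.0.2: not fixed"
```

On a real host you would feed in the version string reported by `./splunk/bin/splunk version`.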



While running the Linux universal forwarder 6.3.7, I had this issue as well. Everything was correct in my config, but it was cured by a reboot of the host.


Splunk Employee

Is the file structured data? If not, you probably have INDEXED_EXTRACTIONS defined for this source, sourcetype, or globally; this causes issues with the CRC check. While the issue manifests itself as TailingProcessor errors, btprobe behaves in the same way and makes for an easy test. You can replicate this with license_usage.log; here the setting sits outside of any stanza (not recommended!):

> head -1 ./splunk/etc/system/local/props.conf

> ./splunk/bin/splunk cmd btprobe -d ./splunk/var/lib/splunk/fishbucket/splunk_private_db/ --file ./splunk/var/log/splunk/license_usage.log
Using logging configuration at ./splunk/etc/log-cmdline.cfg.
terminate called after throwing an instance of 'FileContentException'
  what():  Ran out of data while looking for end of header
Aborted (core dumped)

btprobe crashes!
We don't index license_usage.log!

> ./splunk/bin/splunk search 'index=_internal | stats count by source'
                                source                                  count
----------------------------------------------------------------------- -----
./splunk/etc/splunk.version                                                 1
./splunk/var/log/splunk/conf.log                                            5
./splunk/var/log/splunk/first_install.log                                   4
./splunk/var/log/splunk/metrics.log                       68079
./splunk/var/log/splunk/migration.log.2016-02-18.19-59-41    12
./splunk/var/log/splunk/mongod.log                          412
./splunk/var/log/splunk/scheduler.log                        48
./splunk/var/log/splunk/splunkd-utility.log                  65
./splunk/var/log/splunk/splunkd.log                        1837
./splunk/var/log/splunk/splunkd_access.log                  166
./splunk/var/log/splunk/splunkd_stderr.log                   10
./splunk/var/log/splunk/splunkd_ui_access.log              2363
./splunk/var/log/splunk/web_access.log                      131
./splunk/var/log/splunk/web_service.log                    1101

If we rectify this, btprobe works, and we index the log:

> sed -i 's/^INDEXED_EXTRACTIONS/#INDEXED_EXTRACTIONS/g' ./splunk/etc/system/local/props.conf
> ./splunk/bin/splunk restart &>/dev/null
> ./splunk/bin/splunk search 'index=_internal | stats count by source'
                                source                                  count
----------------------------------------------------------------------- -----
./splunk/etc/splunk.version                                                 2
./splunk/var/log/splunk/conf.log                                            6
./splunk/var/log/splunk/first_install.log                                   4
./splunk/var/log/splunk/license_usage.log                 11877
./splunk/var/log/splunk/metrics.log                       69386
./splunk/var/log/splunk/migration.log.2016-02-18.19-59-41    13
./splunk/var/log/splunk/mongod.log                          631
./splunk/var/log/splunk/scheduler.log                        66
./splunk/var/log/splunk/splunkd-utility.log                 107
./splunk/var/log/splunk/splunkd.log                        3100
./splunk/var/log/splunk/splunkd_access.log                  214
./splunk/var/log/splunk/splunkd_stderr.log                   17
./splunk/var/log/splunk/splunkd_ui_access.log              2366
./splunk/var/log/splunk/web_access.log                      134
./splunk/var/log/splunk/web_service.log                    1835

> ./splunk/bin/splunk cmd btprobe -d ./splunk/var/lib/splunk/fishbucket/splunk_private_db/ --file ./splunk/var/log/splunk/license_usage.log
Using logging configuration at ./splunk/etc/log-cmdline.cfg.
key=0x96c781d205619afe scrc=0xf665afd1903bda74 sptr=4720849 fcrc=0xa3bd152f008960fc flen=0 mdtm=1455235563 wrtm=1455236118
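If the file genuinely is structured data and needs INDEXED_EXTRACTIONS, the safer fix is to scope the setting to its own sourcetype stanza rather than leaving it global. A minimal props.conf sketch; the stanza name my_csv is hypothetical:

```ini
# props.conf -- scope INDEXED_EXTRACTIONS to the structured sourcetype
# that needs it, never globally or under [default].
[my_csv]
INDEXED_EXTRACTIONS = csv
```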


Regarding the original problem:

My guess is that the CSV file is formatted in such a way that Splunk does not recognize the line endings. I would edit the CSV file with a text editor and make sure that it has the appropriate line endings for the OS.

Then try the upload again.
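One way to check and fix this on Linux is to count and strip carriage returns. A sketch using a fabricated sample.csv as a stand-in for the problem file:

```shell
# Fabricate a CRLF-terminated CSV as a stand-in for the problem file.
printf 'host,bytes\r\nweb01,1024\r\n' > sample.csv

# Count carriage returns; a Unix-style file should report zero.
tr -cd '\r' < sample.csv | wc -c

# Strip the carriage returns to get plain LF endings for a Linux indexer.
tr -d '\r' < sample.csv > sample_unix.csv
tr -cd '\r' < sample_unix.csv | wc -c
```

Where they are installed, `file sample.csv` reports unusual line terminators and `dos2unix` performs the same repair; for a Windows indexer you would go in the opposite direction.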



This error indicates that Splunk is having trouble processing one of your inputs.
Does Splunk work properly otherwise? Does the message indicate which file is the problem?
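To find out which file triggers it, you can grep splunkd.log for the message with a line of context; nearby TailingProcessor lines often name the input. A sketch against a fabricated two-line log (on a real host, point LOG at $SPLUNK_HOME/var/log/splunk/splunkd.log; the monitor path shown here is made up):

```shell
# Stand-in log; replace with $SPLUNK_HOME/var/log/splunk/splunkd.log on a real host.
LOG=./splunkd.demo.log
printf '%s\n' \
  '09-15-2015 17:05:01.100 -0700 INFO  TailingProcessor - Parsing configuration stanza: monitor:///data/report.csv.' \
  '09-15-2015 17:05:01.200 -0700 ERROR TailingProcessor - Ran out of data while looking for end of header' \
  > "$LOG"

# Show the error plus one line of leading context, which may name the input.
grep -B1 'Ran out of data while looking for end of header' "$LOG"
```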


New Member

Hi Iguinn,

Thanks for the reply. Yes, you're right: it looks like Splunk is having trouble processing one of the inputs (a .CSV file). The other inputs went well, but I need to import this particular one.


New Member

Now I see the error below:

09-15-2015 17:07:52.483 -0700 ERROR HttpListener - Handler for /en-GB/api/shelper?snippet=true&snippetEmbedJS=false&namespace=search&search=%7C+dele&useTypeahead=true&useAssistant=true&showCommandHelp=true&showCommandHistory=true&showFieldInfo=false&_=1442362045321 sent a 0 byte response after earlier claiming a Content-Length of 4629!   2015-09-15 17:07:52 L-SJN-00564120  _internal   1   ERROR   C:\Program Files\Splunk\var\log\splunk\splunkd.log  splunkd L-SJN-00564120
09-15-2015 17:07:52.482 -0700 ERROR HttpListener - Exception while processing request from for /en-GB/api/shelper?snippet=true&snippetEmbedJS=false&namespace=search&search=%7C+dele&useTypeahead=true&useAssistant=true&showCommandHelp=true&showCommandHistory=true&showFieldInfo=false&_=1442362045321: Connection closed by peer


The second error looks like a failed HTTP request, probably not related to the file input problem. If you have a lot of these, then that's a problem - but it is also a different question.
