Getting Data In

Why does a Splunk 5.0.2 universal forwarder ignore monitor stanzas for files ending in ".splunk"?

Ricapar
Communicator

As weird as this situation seems, I do believe that is exactly what is going on...

I had this stanza in inputs.conf:

[monitor:///my/files/*.splunk]
index = myindex
sourcetype = mysourcetype

I also tried another variation:

[monitor:///my/files/]
index = myindex
sourcetype = mysourcetype
whitelist = splunk$

And still nothing... I would hit the REST API endpoint on the forwarder (/services/admin/inputstatus/TailingProcessor:FileStatus), and the files simply would not show up there. Nothing reached my indexers either, and nothing in splunkd.log on the forwarder gave any indication it was even considering reading those files.

I also tried just doing a monitor on the whole directory (same as above, but without the whitelist line). The Universal Forwarder picked up a bunch of other files that are in that directory, but just passed over the .splunk ones as if they didn't exist.
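As a quick sanity check (not from the original thread), the whitelist regex itself does match such a path, so the filter is not what skips these files. This is a hedged sketch: the path `/my/files/feed1.splunk` is an invented example name, and it assumes the (documented) behavior that whitelist patterns are matched against the full file path with an unanchored regex search:

```python
import re

# Hypothetical file path in the monitored directory (illustrative name).
path = "/my/files/feed1.splunk"

# The same whitelist regex as in the stanza above. Whitelist patterns are
# applied to the full path as an unanchored regex search.
matches = bool(re.search(r"splunk$", path))
print(matches)  # True -- the regex matches, so the skip happens elsewhere
```

Since the pattern matches, the behavior must come from something other than the whitelist, which is exactly what the accepted answer explains.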

So out of frustration, I renamed all my .splunk files to .wtf, set up this stanza:

[monitor:///my/files/]
index = myindex
sourcetype = mysourcetype
whitelist = wtf$

And lo and behold, the Universal Forwarder picked them up without a problem, and I was searching them across my indexers in under a minute.

The files were given a .splunk extension because they are feed files created by an application's developer solely for the purpose of feeding data into Splunk.

Is there some hidden/undocumented internal condition in Splunk that would cause this behavior? I spent some time searching, but these terms are difficult to search for.

This is on a Universal Forwarder v5.0.2.

The files themselves are nothing special. Just a few log messages from a cron job that runs every hour. No other system process is holding a lock on them.

1 Solution

amrit
Splunk Employee

This is a very old, little-known feature... 🙂 In your case, the recommendation is to suffix the files with .splunk.log or similar instead.

When it comes to monitor inputs, files ending in .splunk have long been reserved as metadata files. The TailingProcessor (i.e., the monitor input) will ignore any such file until a corresponding file without the .splunk extension appears in the same directory. For example:

/tmp/foo.txt.splunk
/tmp/foo.txt

Splunk creates these files for the following command:

./splunk spool /var/log/foo -sourcetype bar

In the default $SPLUNK_HOME/var/spool/splunk/, this creates foo.splunk containing metadata that specifies sourcetype=bar, then copies /var/log/foo to the same destination; the TailingProcessor waits for the non-.splunk file, then reads the metadata and consumes and deletes both files.

For more info on what can be specified, see ./splunk help spool. These .splunk files can be used in any [DESTRUCTIVE!!] batch+sinkhole input.
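The pairing rule described above can be sketched as a toy model. To be clear, this is purely illustrative and is not Splunk's actual implementation; `eligible_for_processing` is an invented name for the check the TailingProcessor effectively performs:

```python
import os
import tempfile

def eligible_for_processing(path):
    """Toy model of the rule above: a file ending in .splunk is a
    metadata file and is held back until its counterpart (the same
    path minus the .splunk suffix) exists in the same directory."""
    if not path.endswith(".splunk"):
        return True  # ordinary files are monitored normally
    return os.path.exists(path[: -len(".splunk")])

spool = tempfile.mkdtemp()
meta = os.path.join(spool, "foo.txt.splunk")
open(meta, "w").close()
held_back = eligible_for_processing(meta)   # counterpart missing: held back

open(os.path.join(spool, "foo.txt"), "w").close()
ready = eligible_for_processing(meta)       # counterpart present: processed
print(held_back, ready)  # False True
```

This is why a directory full of .splunk files with no matching base files appears completely invisible to the monitor input, while renaming them to any other extension makes them indexable immediately.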

