So I am trying to monitor a file on the local indexer. I am setting it up through the Web UI to be sure it works. I get the following results in my splunkd.log
05-09-2018 16:05:44.453 -0500 INFO TailingProcessor - Parsing configuration stanza: monitor:///tmp/TaskStatus.test.log.
05-09-2018 16:05:44.453 -0500 INFO TailingProcessor - Adding watch on path: /tmp/TaskStatus.test.log.
But nothing actually shows up in the index. I've edited the file, so I know it's changing, and I was able to preview the file in the web interface and it loaded fine. The actual input itself is just not working. Any thoughts on why?
The inputs.conf that gets created:
[monitor:///tmp/TaskStatus.test.log]
disabled = false
index = tasklogs
sourcetype = _json
I made the splunk user the owner and verified it had read/write permissions on the file. If I upload the file for one time indexing it works fine.
I can't think of any reason it wouldn't work.
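One more thing worth checking, since the input uses sourcetype = _json: lines that are not valid JSON can cause parsing problems at index time. A quick standalone sanity check on the file (a sketch; check_json_lines is just an illustrative helper, not part of Splunk):

```python
import json

def check_json_lines(path):
    """Return (line_number, error_message) pairs for lines that fail to parse as JSON."""
    errors = []
    with open(path) as f:
        for n, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                json.loads(line)
            except json.JSONDecodeError as err:
                errors.append((n, str(err)))
    return errors
```

Run it against /tmp/TaskStatus.test.log; any reported line numbers point at events Splunk's JSON parsing may choke on.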
The issue was that it was stuck in the ingestion queue. I changed how it handled the file while the file was in use, in my inputs and props, and it appears to be working now.
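For anyone landing here with the same symptom: the inputs.conf setting that controls how long the tailing processor waits after the last read before closing a file that is still being written to is time_before_close. The stanza below is illustrative only; the value shown is an assumption, not necessarily the poster's actual fix.

```ini
# inputs.conf -- illustrative; time_before_close value is an assumption
[monitor:///tmp/TaskStatus.test.log]
disabled = false
index = tasklogs
sourcetype = _json
# Wait longer before closing a file that is still in use (default is 3 seconds)
time_before_close = 10
```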

There are many possible reasons:
- If timestamping is wrong, the events could be landing outside your expected search window (in the future, for example).
- Similar to the above, check MAX_DAYS_HENCE and MAX_DAYS_AGO (and associated logs).
- The retention settings/size of that index may be such that events expire just after they are indexed.
- You might have a firewall running on that indexer blocking outgoing connections to port 9997/9998.
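To rule out the timestamping case above, search the index over an open-ended window that includes the future (search sketch; the index name is taken from the original post):

```
index=tasklogs earliest=0 latest=+10y
| stats count, min(_time), max(_time)
```

If the count is nonzero here but zero in your normal time range, the events are landing outside the window you are searching.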

Try splunk show inputstatus on the CLI, as well as splunk list monitor.