Getting Data In

Data Not Showing Up In Distributed Search Environment

New Member

Environment has one search head and one search peer. Data is sent to a directory [item (1)] that is configured to be monitored and indexed by the search peer. Both the search head and the search peer have the same "indexes.conf" entry for the index [see item (2)], and the index shows up in the search head GUI. The search peer has an entry in "inputs.conf" to monitor the directory where data is being sent [see item (3)]. When a file is copied into the directory, the expected behavior is for the file to be ingested into Splunk and consequently become searchable; however, this is not occurring.

We have other indexes on this environment that do work as intended, but for some reason this particular setup is not working. Any and all help would be appreciated.

Item (1)*

Item (2)
homePath = $SPLUNK_DB/MY_in_dex
thawedPath = $SPLUNK_DB/thawedpath/MY_in_dex
coldPath = $SPLUNK_DB/coldpath/MY_in_dex

Item (3)
index = My_in_dex

*[NOTE: this traversal does start from "/" on a *nix machine]
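For reference, a complete pair of stanzas for a setup like this would look roughly as follows. The stanza headers and the monitored directory path are illustrative assumptions — the actual path and stanza names are not shown in the post — and the index name is copied from the items above:

```ini
# indexes.conf -- deployed to both the search head and the search peer
# ("MY_in_dex" is taken from the post; the stanza header must match the
# index name referenced in inputs.conf)
[MY_in_dex]
homePath = $SPLUNK_DB/MY_in_dex
coldPath = $SPLUNK_DB/coldpath/MY_in_dex
thawedPath = $SPLUNK_DB/thawedpath/MY_in_dex

# inputs.conf -- deployed to the search peer doing the monitoring
# (the monitored path below is a placeholder, not the poster's actual path)
[monitor:///path/to/monitored/dir]
index = MY_in_dex
disabled = false
```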


New Member

Thanks to everyone for their help!

It ended up being a latency issue: the data took up to an hour to be ingested.


Esteemed Legend

Your indexes.conf is highly unusual (but I don't see why it shouldn't work); why is it not like this:

homePath = $SPLUNK_DB/MY_in_dex/db
coldPath = $SPLUNK_DB/MY_in_dex/colddb
thawedPath = $SPLUNK_DB/MY_in_dex/thaweddb


What role do you have? You can check which indexes your role has access to; this index is probably not listed.




Did you check the splunkd log on the forwarder and the indexer?

The forwarder's log should state that it is trying to read the monitored file.
If not, check with btool whether it picked up the inputs.conf at all. Maybe you did not restart the forwarder after deploying the inputs.conf to it?

In which order did you create the input and the index? Sometimes, if you deploy the input before creating the index, the forwarder tries to send logs that cannot be written to the index; the fishbucket then records the file as read even though it was never indexed.

Maybe a permission issue? Check which indexes you are allowed to search.

Or is your indexer not accepting the data being sent by the forwarder? You could check _internal for tcpin connections from the forwarder to the indexer.
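The checks above can be sketched as commands. These are hedged examples that assume a standard Splunk installation; paths and the forwarder hostname are placeholders:

```shell
# On the forwarder: verify the monitor stanza was actually loaded
$SPLUNK_HOME/bin/splunk btool inputs list monitor --debug

# On the forwarder: look for TailReader/TailingProcessor messages
# about the monitored file
grep -i "tailreader\|tailingprocessor" $SPLUNK_HOME/var/log/splunk/splunkd.log

# On the search head, as a search: confirm the indexer is receiving
# data from the forwarder (replace <forwarder> with its hostname)
#   index=_internal source=*metrics.log group=tcpin_connections hostname=<forwarder>
```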



You can also elevate the log level:

On the Splunk instance that is monitoring the files, navigate to the $SPLUNK_HOME/etc directory and edit the file:

Modify the following settings, changing INFO to DEBUG:

Save the file.

Restart the Splunk instance.

Take a look at the log: $SPLUNK_HOME/var/log/splunk/splunkd.log

Look for the names of the files you were monitoring; the debug information should tell you why they were skipped.
Set the values back to INFO after you have figured out the problem.
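The specific file and settings were omitted above. As an illustration only — the filename and category names here are assumptions from memory, so verify them against your own installation — the logging configuration typically lives in $SPLUNK_HOME/etc/log.cfg, and the categories covering file monitoring look roughly like this:

```ini
# $SPLUNK_HOME/etc/log.cfg (assumed location -- verify on your instance)
# Change INFO to DEBUG for the file-monitoring categories, e.g.:
category.TailingProcessor=DEBUG
category.WatchedFile=DEBUG
```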
