Not getting the data into the Splunk indexer
I am not able to see the file content in the indexer after restarting the universal forwarder. What can be the reason?
1. Check if the Splunk process is running on the Splunk forwarder:
ps -ef | grep splunkd
OR
cd $SPLUNK_HOME/bin
./splunk status
2. Check whether the Splunk forwarder's forwarding port is open by using the command below:
netstat -an | grep 9997
If the output of the above command is blank, the port is not open and you need to open it.
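The first two checks can be combined into a quick sketch. The bracket trick in the grep pattern keeps grep from matching its own process entry:

```shell
# Quick sketch of steps 1-2 on the forwarder host.
# The [s] bracket trick stops grep from matching its own command line,
# because the literal string "[s]plunkd" does not contain "splunkd".
if ps -ef | grep -q '[s]plunkd'; then
    echo "splunkd is running"
else
    echo "splunkd is NOT running"
fi

# Look for any activity on the forwarding port.
if netstat -an 2>/dev/null | grep -q 9997; then
    echo "port 9997 shows activity"
else
    echo "no activity on port 9997"
fi
```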
3. Check on the indexer that receiving is enabled on port 9997 and that port 9997 is open on the indexer.
To check whether receiving is configured: on the indexer, go to Settings >> Forwarding and receiving >> and check whether receiving is enabled on port 9997. If not, enable it.
4. Check whether you can ping the indexer from the forwarder host:
ping <indexer name>
If you cannot ping the server, investigate the network issue.
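Ping alone does not prove the receiving port is reachable; a TCP-level check is closer to what the forwarder actually does. A minimal bash sketch (the indexer host name in the usage comment is a placeholder):

```shell
# Returns success only if a TCP connection to host $1, port $2 succeeds.
# Uses bash's built-in /dev/tcp redirection, so no extra tools are needed.
check_port() {
    (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

# Usage (placeholder indexer host):
#   if check_port idx01.example.com 9997; then echo reachable; fi
```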
5. Confirm on the indexer whether your file has already been indexed by using the search query below.
In the Splunk UI, run the following search:
index=_internal "FileInputTracker"
The output of the search query is a list of the log files that have been indexed.
6. Check whether the forwarder has completed processing the log file (i.e., the tailing process) by using the URL below:
https://<splunk forwarder server name>:8089/services/admin/inputstatus/TailingProcessor:FileStatus
In the tailing-process output you can check whether the forwarder is having an issue processing the file.
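A sketch of querying that endpoint from the command line. The host name and admin credentials in the usage comment are placeholders, and -k is only needed while the management port still uses the default self-signed certificate:

```shell
# Build the management-port URL for a given forwarder host.
# 8089 is the default Splunk management port.
tailing_status_url() {
    echo "https://$1:8089/services/admin/inputstatus/TailingProcessor:FileStatus"
}

# Usage (placeholder host and credentials):
#   curl -sk -u admin:changeme "$(tailing_status_url uf01.example.com)"
```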
7. Check the permissions on the log file you are sending to Splunk. Verify that the Splunk user has read access to the log file.
8. Verify inputs.conf and outputs.conf for proper configuration
inputs.conf
Make sure the following configuration is present, and verify that the specified path has read access.
[monitor://<absolute path of the file to onboard>]
index=<index name>
sourcetype=<sourcetype name>
outputs.conf
[tcpout:group1]
server=x.x.x.x:9997
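For completeness, a minimal outputs.conf sketch with an explicit default group; the group name and server address are placeholders to replace with your own:

```
[tcpout]
defaultGroup = group1

[tcpout:group1]
server = x.x.x.x:9997
```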
9. Verify Index Creation
Ensure the index is created on the indexer and matches the index specified in inputs.conf.
10. Check splunkd.log on the forwarder (at $SPLUNK_HOME/var/log/splunk) for any errors. Messages from 'TcpOutputProc', for example, should give you an indication of what is occurring when the forwarder tries to connect to the indexer.
11. Check disk space availability on the indexer.
12. Check the ulimit if you have installed the forwarder on Linux, and set it to unlimited or the maximum (65535, Splunk recommended).
- ulimit, set by default on Linux, limits the number of files a process can open
- Check the ulimit: ulimit -n
- Set the ulimit: ulimit -n <expected size>
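Note that ulimit -n only affects the current shell. To make the limit survive reboots on most Linux distributions, you can set it in /etc/security/limits.conf; this sketch assumes the forwarder runs as a user named splunk:

```
# /etc/security/limits.conf ("splunk" is the assumed service user)
splunk  soft  nofile  65535
splunk  hard  nofile  65535
```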

If the first case, check that your UF has the correct inputs.conf in place.
If the second, check that outputs.conf is in place.
First check those on the server side, then continue on the UF side. Check that the Splunk Forwarder service is running on it, then look in splunkd.log to see what is happening there. Depending on your environment, those conf files could be on the UF, or there could be a DS that manages all the UFs.
@loknath To ensure proper monitoring, verify that the file you wish to track grants read access to the 'splunk' user.
Verify the following details:
- Confirm whether the inputs.conf file is configured to point to the correct monitoring directory.
- Ensure that the index has been created on the indexer before sending data from the Universal Forwarder (UF).
- Check the connection between the UF and the indexer. Make sure the receiving port is enabled on the indexer.
- Review the internal logs on the Splunk UF to gather insights.
- Examine the outputs.conf file for correct configurations.
Please review these details thoroughly.
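One more way to review both files at once: Splunk's btool prints the merged configuration the UF will actually use and, with --debug, the file each setting came from. A small sketch, assuming the default universal forwarder install path:

```shell
# Print the effective, merged configuration for a given .conf file
# ("inputs", "outputs", ...) along with the file each setting came from.
# /opt/splunkforwarder is the assumed default install path.
show_effective_conf() {
    "${SPLUNK_HOME:-/opt/splunkforwarder}/bin/splunk" btool "$1" list --debug
}

# Usage:
#   show_effective_conf inputs
#   show_effective_conf outputs
```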
