Hi,
I have an issue receiving data from one of the Universal Forwarders in my environment. I checked the internal logs of the UF and found messages stating "Watched File - will begin reading from offset at xxx", but there was nothing in metrics.log or the license_usage logs either. I have checked connectivity, inputs, and outputs, and found nothing out of the ordinary.
What could be the issue? Please help me to resolve this. Thanks in advance.
Hi @siva_cg,
Can you please clarify: is the UF not reading some of the log files on that server, or is the UF reading the log file but some events from it are missing in Splunk?
If only some events are missing in Splunk, it is most likely a parsing issue. Check splunkd.log on your indexers for any line-breaking problems; it is possible that events contain multiple timestamps and Splunk is indexing them under an older or a future timestamp.
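A search along these lines on the indexers may help surface such parsing problems (the listed components are common sources of line-breaking and timestamp warnings; adjust to taste):

index=_internal sourcetype=splunkd (component=LineBreakingProcessor OR component=DateParserVerbose OR component=AggregatorMiningProcessor) (log_level=WARN OR log_level=ERROR)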
Hi @harsmarvania57 ,
I am receiving only internal logs from that UF; no other logs are reaching the indexers.
Can you run the below command on the UF and check whether the UF is reading the files you configured in your monitor stanza?
$SPLUNK_HOME/bin/splunk list inputstatus
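If you can reach the CLI, you can also filter that command's output for one specific monitored file (the path below is only a placeholder for one of your inputs):

$SPLUNK_HOME/bin/splunk list inputstatus | grep -A 5 "/var/log/yourapp.log"

The entry for the file should indicate how far the UF has read it, which tells you whether the tailing side thinks it is done.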
Hi @harsmarvania57,
I am not sure what the issue was, but once we restarted the service manually on the server, we started receiving logs from it. I don't have access to the UF CLI, so I was unable to run that command. Thank you for the help.
Looks like Splunk picked up the file. I would run an all-time search on the index in case the dates in the source file are off.
index=_internal host=ufhost component=TailingProcessor
Look at the event_message field and check whether your source path shows up in the logs... hope this helps.
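For the specific message quoted above, you could also narrow the search to the WatchedFile component (assuming the host field matches your UF's hostname):

index=_internal host=ufhost component=WatchedFile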
Hi @prakash007
Thanks for the inputs. I searched over all time but got no results. I am able to see my log source in the TailingProcessor events.
What's the workflow in your case?
UF---->Indexer----->SH
Try looking at the UF's internal logs to see whether it is talking to the indexers:
index=_internal TcpOutputProc
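To surface only the problems, you can restrict that search to warnings and errors, e.g.:

index=_internal host=ufhost TcpOutputProc (log_level=WARN OR log_level=ERROR)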
I would log in to the indexer and search the index, OR
look under /var/lib/splunk/(your index)/db/*/rawdata to see whether any data exists in the indexer's db itself, to make sure the data is getting to the indexer.
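For example (the index name here is a placeholder, and the db path depends on where your indexes are stored, i.e. the SPLUNK_DB location on that indexer):

ls -lR /var/lib/splunk/yourindex/db/ | head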