Getting Data In

Monitor input not forwarding logs to indexers

phanichintha
Path Finder

Hello Team,

We are forwarding logs from a Linux machine to the Splunk indexers via a Splunk Universal Forwarder installed on that machine. From the monitor input path "var/logs" I am getting data in the indexers, but I am not getting any data from the path "monitor:///opt/apps/mule-runtimes/mule-ee-runtime-1/logs". Please help me figure out what to do; for reference, please check the snap below.

[Attachment: Path list.png]
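
For reference, the inputs.conf monitor stanzas involved look roughly like this (a sketch only; the exact settings shown in the screenshot may differ):

[monitor:///var/logs]
disabled = false

[monitor:///opt/apps/mule-runtimes/mule-ee-runtime-1/logs]
disabled = false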


isoutamo
SplunkTrust

Hi

Does the Splunk UF user have read access to this directory? And did you restart the UF after updating the configuration?
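
One quick way to check both points (assuming here that the UF runs as user splunkfwd and is installed under /opt/splunkforwarder; adjust both to your environment):

# check that the UF user can actually read the directory
sudo -u splunkfwd ls -l /opt/apps/mule-runtimes/mule-ee-runtime-1/logs

# restart the forwarder so configuration changes take effect
/opt/splunkforwarder/bin/splunk restart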

Usually when you are monitoring a directory you should add whitelists there; the other option is to use file names in the monitor path and define sourcetypes for them at the same time (see the sketch below).
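
A minimal inputs.conf sketch of the whitelist option (the whitelist pattern and sourcetype name here are illustrative assumptions, not from the original post):

# monitor the directory, but only pick up files ending in .log
[monitor:///opt/apps/mule-runtimes/mule-ee-runtime-1/logs]
whitelist = \.log$
sourcetype = mule:runtime
disabled = false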

You can see what Splunk thinks it should read with

splunk btool inputs list monitor:///opt/apps/mule-runtimes/mule-ee-runtime-1/logs --debug

Another tool to see what it has read is 

splunk list inputstatus
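
If neither of those reveals the problem, the forwarder's own log is also worth a look; the TailingProcessor component logs which files it picks up or skips (the path below assumes a default /opt/splunkforwarder install):

grep TailingProcessor /opt/splunkforwarder/var/log/splunk/splunkd.log | grep -i mule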

r. Ismo 


phanichintha
Path Finder

Thanks for the clue @isoutamo. I made the respective changes and got the solution.
