Modular input log file and its ingestion for Appinspect

mtroianovskyi
Explorer

Our app's modular input writes its logs to $SPLUNK_HOME/var/log/$APP_NAME/$LOG_NAME.log, which conforms to the AppInspect check under Operating system standards: "Check that applications only write to the following directories."

However, when we add a default/inputs.conf with a monitor stanza to ingest the modular input logs into the _internal index, AppInspect reports the failure: "Check [fifo] or [monitor] stanza is not used in inputs.conf unless the input stanza is used to ingest data from $SPLUNK_HOME/var/log/splunk."

One check points to $SPLUNK_HOME/var/log/$APP_NAME while the other requires $SPLUNK_HOME/var/log/splunk, so it is unclear which directory should be used for a custom app's modular input logs.
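For reference, this is the kind of monitor stanza that trips the second AppInspect check, because it monitors a path outside $SPLUNK_HOME/var/log/splunk (the app name and sourcetype here are hypothetical placeholders, not from the original post):

```ini
# default/inputs.conf -- fails AppInspect: monitored path is
# outside $SPLUNK_HOME/var/log/splunk
[monitor://$SPLUNK_HOME/var/log/my_app/modinput.log]
index = _internal
sourcetype = my_app:modinput
```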

1 Solution

mtroianovskyi
Explorer

As suggested by alacercogitatus on splunk-usergroups:

you should write to var/log/splunk/<appname>/modinput.log, and include a Diag.py so that you can do splunk diag --collect app:<appname> and only get your own files, and not the whole system
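A minimal sketch of what that looks like in a modular input's logging setup, assuming a hypothetical app name `my_app` and standard Python `logging`. Note that Splunk's default system inputs.conf already monitors $SPLUNK_HOME/var/log/splunk into the _internal index, so logs written under that tree are picked up without a custom monitor stanza:

```python
import logging
import logging.handlers
import os

APP_NAME = "my_app"  # hypothetical app name

def modinput_log_path(app_name=APP_NAME):
    """Build the AppInspect-compliant log path: var/log/splunk/<appname>/modinput.log."""
    splunk_home = os.environ.get("SPLUNK_HOME", "/opt/splunk")
    return os.path.join(splunk_home, "var", "log", "splunk", app_name, "modinput.log")

def setup_logger(app_name=APP_NAME):
    """Create a rotating file logger writing under $SPLUNK_HOME/var/log/splunk/<appname>/."""
    log_path = modinput_log_path(app_name)
    os.makedirs(os.path.dirname(log_path), exist_ok=True)
    handler = logging.handlers.RotatingFileHandler(
        log_path, maxBytes=1_000_000, backupCount=5
    )
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger = logging.getLogger(app_name)
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    return logger
```

The rotating handler keeps the modular input's log files bounded, which matters since the directory is ingested into _internal by default.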
