I have a Splunk 9.1.2 server running RHEL 8 with about 50 clients. This is an air-gapped environment.
I have a bunch of Linux (RHEL and Ubuntu) UFs and have configured inputs.conf to ingest files such as /var/log/messages, /var/log/secure, /var/log/audit/audit.log, /var/log/cron, etc.
Recently, I noticed that only logs from /var/log/messages and /var/log/cron are being ingested; specifically, I don't see /var/log/secure or /var/log/audit/audit.log.
I tried restarting the splunk process on one of the UFs and checked splunkd.log, and I don't see any errors. Here is what I see for /var/log/secure in splunkd.log (looks normal); I have typed it out, as I can't copy/paste from the air-gapped machine:
TailingProcessor [xxxxxx MainTailingThread] passing configuration stanza: monitor:///var/log/secure
TailingProcessor [xxxxxx MainTailingThread] Adding watch on path:///var/log/secure
WatchedFile [xxxxxx tailreader 0] - Will begin reading at offset=xxxx for file='/var/log/secure'
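For reference, entries like these can be pulled straight out of splunkd.log with something along these lines (assuming the default UF install path /opt/splunkforwarder):
# filter the tailing-related lines for the file in question
grep -E 'TailingProcessor|WatchedFile' /opt/splunkforwarder/var/log/splunk/splunkd.log | grep -i secure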
Here is my inputs.conf
[default]
host = <indexer>
index = linux
[monitor:///var/log/secure]
disabled = false
[monitor:///var/log/messages]
disabled = false
[monitor:///var/log/audit/audit.log]
disabled = false
[monitor:///var/log/syslog]
disabled = false
File permissions seem to be fine for all of those files. Please note that SELinux is enabled, but the permissions still look fine; initially, I did have to run "setfacl -R -m u:splunkfwd:rX /var/log" for Splunk to get access to those files and send the logs to the indexer. btool also showed that I am using the correct inputs.conf.
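For anyone checking the same things, commands along these lines can confirm the ACLs and look for SELinux denials (splunkfwd as the UF user is based on the setfacl above; adjust to your own setup):
getfacl /var/log/secure /var/log/audit/audit.log    # confirm the splunkfwd ACL entry is present
ls -Z /var/log/secure /var/log/audit/audit.log      # show the SELinux contexts on the monitored files
ausearch -m avc -ts recent | grep -i splunk         # check for recent AVC denials involving splunkd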
Any idea what's misconfigured?
Thank you all for your help. I found the problem with my inputs.conf; it was right in front of me but I just didn't see it. For some reason, I had "host = <indexer-name>" set under the [default] stanza. So all logs were getting to the indexer, but under the indexer's host name, except for /var/log/messages and cron; hence I wasn't "seeing" them. I still need to check why those files (messages and cron) were coming in under the real UF name; probably because they have the host name in the log events themselves. The good part is that I learnt a few new troubleshooting tips; thanks to you all, appreciate your help.
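For anyone hitting the same thing, a corrected inputs.conf would simply drop the host override from [default] so each UF defaults to its own hostname; this is a sketch, not my exact file:
[default]
index = linux
# no host override here; the UF falls back to its own hostname

[monitor:///var/log/secure]
disabled = false

[monitor:///var/log/messages]
disabled = false

[monitor:///var/log/audit/audit.log]
disabled = false

A quick way to spot this kind of issue from the search head is a search that breaks the events down by host, for example:
index=linux (source="/var/log/secure" OR source="/var/log/audit/audit.log") | stats count by host, source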
@PickleRick Thanks, and I think I had run them before, but I tried again to verify, and the splunk list monitor output matches the splunk list inputstatus output. I will try btool next.
Well, it's not only whether they match but whether they actually contain the files you want ingested. If "list monitor" doesn't show the files you want to read, you're not gonna get them.
Sorry, I should have been more clear.
I do see the files I am troubleshooting (/var/log/cron and /var/log/audit/audit.log) when I run "splunk list monitor", and the output matches what I get from "splunk list inputstatus".
The "inputstatus" command shows:
/var/log/share
file position=xxxxxx
size=<same-as-above>
percent=100
type=finished reading the file
/var/log/audit/audit.log
file position=xxxxxx
size=<same-as-above>
percent=100
type=open file
You could try to check the current tailing status: "Solved: Is there some way to see the current tailing statu..." - Splunk Community
The management port (8089 by default) must be opened temporarily on the UFs.
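If you go that route, the same tailing status is exposed over REST on the management port, so something like this should work (credentials and port are placeholders for your environment):
curl -k -u admin:changeme https://localhost:8089/services/admin/inputstatus/TailingProcessor:FileStatus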
Thanks @PaulPanther, I checked the link from my SH but I'm not sure what exactly I am looking for. I did search for the missing logs (secure and audit.log) and didn't see anything, but at the same time I didn't see any mention of the logs that are being ingested, like messages and cron.
Thanks for your help.
The link you were pointed to is a very old thread. Nowadays the same functionality is exposed through CLI commands, so you can run (on your UFs, not on your SH!):
splunk list monitor
and
splunk list inputstatus
The first command will show you the effective configuration of your monitor inputs. The second one will give you the state of your inputs.
Of course you can additionally verify your combined config using
splunk btool inputs list monitor --debug
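Since --debug also prints which file each setting comes from, piping the output through grep is a quick way to spot an unexpected override, for instance (just an illustration):
splunk btool inputs list monitor --debug | grep -i host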