
Reporting and Management for OSSEC: Why did Splunk stop indexing file /var/ossec/logs/alerts/alerts.log?

att35
Builder

Hi,

On one of our indexers, Splunk has stopped indexing data from one of the OSSEC log files. We have the Splunk for OSSEC app installed, and it is indexing data from two files on the same machine. [screenshot: the two monitored files under Data Inputs]

For sourcetype ossec_log, data is still getting indexed, but not for sourcetype ossec_alerts. This appears to have started after we installed two add-ons: one for "Linux Auditd" and the "Splunk Add-on for OSSEC". Could either of these have caused it?

I still see the file entries under "Data Inputs", as shown in the screenshot above, and the file /var/ossec/logs/alerts/alerts.log is still receiving data. I have since removed the Splunk Add-on for OSSEC and restarted Splunk, but that did not fix the issue.
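(For anyone checking the same thing: one way to confirm the monitor stanza is still loaded after removing an add-on, assuming a default $SPLUNK_HOME, is btool plus the monitor list:)

# Show which app each inputs.conf stanza comes from
$SPLUNK_HOME/bin/splunk btool inputs list --debug | grep -i ossec

# List the files Splunk believes it is currently monitoring
$SPLUNK_HOME/bin/splunk list monitor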

Update:

I used inputstatus to compare this indexer with one where alerts.log is being indexed correctly.

[screenshot: inputstatus output from the two indexers side by side]
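(For reference, the inputstatus data can be pulled either from the CLI or from the REST endpoint; the host and credentials below are placeholders:)

# CLI: dump the tailing processor's view of every monitored file
$SPLUNK_HOME/bin/splunk list inputstatus

# REST equivalent (adjust host and credentials to your environment)
curl -k -u admin:changeme https://localhost:8089/services/admin/inputstatus/TailingProcessor:FileStatus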

Kindly assist.

Thanks,

~ Abhi

1 Solution

att35
Builder

It seems Splunk just needed more time to work through all the tracked files. One of the directories we monitor contains roughly 8,000 small files, all tracked for changes, which delayed Splunk from reaching the OSSEC alerts.log file. Once I temporarily disabled monitoring of that directory, Splunk started ingesting the OSSEC logs immediately.

The following blog post proved immensely helpful while troubleshooting:
http://blogs.splunk.com/2011/01/02/did-i-miss-christmas-2/
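(A possible mitigation for the same symptom, sketched here under the assumption that the stale files in the busy directory do not need to be re-read: tell the tailing processor to skip old files with ignoreOlderThan in inputs.conf. The stanza path below is hypothetical.)

# inputs.conf -- hypothetical stanza for the directory with ~8K small files
[monitor:///var/log/busy_directory]
# Skip files not modified in the last 7 days so the tailing processor
# spends its time on actively written logs such as alerts.log
ignoreOlderThan = 7d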

Thanks,

~ Abhi



att35
Builder

Additional results from splunkd.log

06-05-2016 00:00:06.932 -0400 INFO  WatchedFile - Will begin reading at offset=0 for file='/var/ossec/logs/alerts/alerts.log'.

After restart:

06-05-2016 16:37:51.586 -0400 INFO  TailingProcessor - Adding watch on path: /var/ossec/logs/alerts.
06-05-2016 16:58:26.619 -0400 INFO  TailingProcessor - Parsing configuration stanza: monitor:///var/ossec/logs/alerts/alerts*.
06-05-2016 16:58:26.620 -0400 INFO  TailingProcessor - Adding watch on path: /var/ossec/logs/alerts.

The offset is correctly listed for ossec.log:

06-05-2016 16:58:26.852 -0400 INFO  WatchedFile - Will begin reading at offset=577851121 for file='/var/ossec/logs/ossec.log'.

I do not see a "WatchedFile" entry for "alerts.log" after the restart. Does this mean that Splunk needs more time to traverse all the monitored directories before it reaches that file?
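(One way to check, assuming a default /opt/splunk install path, is to grep splunkd.log for the tailing-related events around the restart:)

# Did the tailer ever open alerts.log, and at what offset?
grep 'WatchedFile' /opt/splunk/var/log/splunk/splunkd.log | grep 'alerts.log'

# Review the watches and monitor stanzas registered at startup
grep 'TailingProcessor' /opt/splunk/var/log/splunk/splunkd.log | tail -50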

Thanks,

~ Abhi


att35
Builder

Is it possible to drill down into what's causing this behavior? It happened again on the same indexer, this time while configuring props.conf for the TA_linux-audit app. After the restart, Splunk started monitoring all the files except /var/ossec/logs/alerts/alerts.log.

Even though the props configuration was for sourcetype "ossec_alerts", the same sourcetype this file is tagged with, I don't see why an edit to an app's props.conf would affect whether a file is read or not. Under inputstatus, the input directory is still listed as defined under "Data Inputs", but the actual file is not.

Last time it started reading the file after a few restarts, but we were unable to find out what caused it to stop in the first place. Does the file's size (~300 MB) have anything to do with how long it takes to read and start indexing?
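(To watch whether the tailer is making progress through the large file, the per-file details in the inputstatus output can be checked repeatedly; the grep context size below is a rough guess at the output layout:)

# Check how far the tailer has read into the ~300 MB file;
# each file in the output is followed by its position/size details
$SPLUNK_HOME/bin/splunk list inputstatus | grep -A 4 'alerts/alerts.log'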

Thanks,

~ Abhi
