Splunk Enterprise Security

Why are we missing cisco:ios sourcetype data between 12am-4am nightly?

AlbintEIG
Engager

We are collecting syslog with a syslog collector and dumping it to text files. Splunk ingests those text files from the drive using the Splunk Universal Forwarder, and everything works perfectly for all syslog events except the switch data with sourcetype cisco:ios. Every night there is a gap in that data from 12am-4am. Meanwhile, all other syslog data is indexed and reporting properly with nothing missing. Every sourcetype uses the same method and the same source syslog server; it's only this cisco:ios sourcetype during these hours. At 4:00am everything resumes like nothing ever happened. The text files contain data straight through the night, so it's not the syslog server or the data collection.
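For reference, this is roughly the search I use to see the gap (our index name won't match yours; "network" here is just a placeholder):

index=network sourcetype=cisco:ios earliest=-24h
| timechart span=10m count

The count drops to zero in that window every night.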

I am completely stumped.

Backups don't run at those times.

Has anyone ever seen anything like this? I feel like my sanity is being tested 🙂

1 Solution

Jeremiah
Motivator

Couple of things that could be causing this:

  1. You have a timestamp issue: the data is being indexed, but the timestamp is interpreted incorrectly, so it only appears that the data isn't getting indexed. Check the _internal metrics for your cisco:ios sourcetype to see whether the ingestion rate really drops to 0 (example searches right after this list).
  2. There is a misconfiguration or other issue with your syslog setup and file rotation that causes Splunk to lose track of the file during that time period, or leaves it unable to read the file (see the splunkd.log search below).
  3. You haven't really explained your file structure or monitor configuration, but if you're writing data in some sort of time-based structure, your inputs may be missing the files timestamped in that range. (Far-fetched, but possible.)
  4. Your forwarder can't keep up with the data ingestion rate, falls behind, and then "catches up" when the file rotates. Check splunkd.log on the forwarder for queue errors or max-throughput warnings (queue and throughput searches at the end of this post).
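For #1, something like this against the indexer's _internal metrics should show whether cisco:ios data is really arriving (field names are the standard metrics.log ones; adjust the span to taste):

index=_internal source=*metrics.log* group=per_sourcetype_thruput series=cisco:ios
| timechart span=5m sum(kb) AS kb_indexed

To rule out a misparsed timestamp, search by index time and bucket on _indextime (again, "network" is a placeholder for your index):

index=network sourcetype=cisco:ios _index_earliest=-24h _index_latest=now
| eval index_hour=strftime(_indextime, "%Y-%m-%d %H:00")
| stats count BY index_hour

If events show up in the 00:00-04:00 buckets here but not in a normal search, it's a timestamp problem, not a collection problem.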

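For #2, the forwarder's splunkd.log (which lands in _internal if you forward internal logs) will usually complain when it can't read or loses track of a monitored file. The component names below are the usual file-monitoring ones; treat them as a starting point, and swap in your forwarder's hostname for my-forwarder:

index=_internal host=my-forwarder source=*splunkd.log* component IN (TailReader, WatchedFile, TailingProcessor) (log_level=WARN OR log_level=ERROR)
| stats count BY component, log_level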
If none of that helps, it would be good to have more information about your setup, OS, syslog service and configuration, forwarder configuration, and so on.
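And for #4, queue blocking and throughput throttling also show up in the forwarder's metrics.log; these are the usual searches (same hostname placeholder as above):

index=_internal host=my-forwarder source=*metrics.log* group=queue blocked=true
| timechart span=5m count BY name

index=_internal host=my-forwarder source=*metrics.log* group=thruput name=thruput
| timechart span=5m avg(instantaneous_kbps) AS kbps

If the queue search returns hits, or the kbps line flattens out at your configured maxKBps limit right around midnight, that's your answer.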


AlbintEIG
Engager

Thank you for the suggestions. I'll work through them and let you know.
