Monitoring Splunk

Has anyone seen this error message: Monotonic time source didn't increase; is it stuck?


Since we upgraded to 7.0, we've been seeing this particular error show up in the logs:

10-17-2017 11:30:30.772 -0600 ERROR PipelineComponent - Monotonic time source didn't increase; is it stuck?

We weren't able to find much information regarding this error online and wanted to poll the audience to see if anyone has encountered this as well.


Got the same error

Splunk Enterprise
Version: 8.0.2
Build: a7f645ddaf91

06-01-2020 13:04:41.446 -0400 ERROR PipelineComponent - Monotonic time source didn't increase; is it stuck?



I had the same question and I opened a Splunk case. This is the response:

"This is an error we have come across with some of our Windows customers, and seems more common of virtualized instances. The splunk process will periodically check the time of the OS system and will show this error if there is a difference (~15 ms) as an indication of the time progress internally. This is really an internal ERROR that should not be reported.

Reference: GetTickCount64 function

This issue is currently fixed in version 8.0.0, and if you would like to stop this error from occurring, you will need to look into upgrading to 8.0, otherwise, you can ignore this error message.​"
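The check described in that response can be sketched in a few lines. This is an illustrative assumption based on the support answer, not Splunk's actual implementation: we assume the error fires when a monotonic tick source (such as GetTickCount64 on Windows) fails to advance by roughly 15 ms between polls. Tick values here are simulated integers in milliseconds.

```python
STALL_THRESHOLD_MS = 15  # assumed threshold, taken from the support response above

def monotonic_advanced(last_tick_ms: int, now_tick_ms: int,
                       threshold_ms: int = STALL_THRESHOLD_MS) -> bool:
    """Return True if the monotonic tick source advanced by at least threshold_ms."""
    return (now_tick_ms - last_tick_ms) >= threshold_ms

# Simulated readings from a monotonic tick source (e.g. GetTickCount64 on Windows)
last = 1000
print(monotonic_advanced(last, 1020))  # True  -> advanced 20 ms, healthy
print(monotonic_advanced(last, 1005))  # False -> only 5 ms; this is the case that
                                       #          would log "is it stuck?"
```

A hypervisor stealing CPU time from a guest can easily delay the poll or the tick update long enough to trip a check like this, which would be consistent with the support comment that virtualized Windows instances see it more often.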

Splunk Employee

What does the timestamp look like for that data source? Typically this happens with a timestamp that isn't recognized, or something else wrong with the data source...



I see this error on one heavy forwarder but not the other, even though both pull the same configurations. The data sources are the same in each environment, but only one throws this message, followed by:

WARN TcpOutputProc - Tcpout Processor: The TCP output processor has paused the data flow. Forwarding to output group splunkcloud has been blocked for 39750 seconds. This will probably stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.
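One way to quantify how long forwarding has been blocked is to pull the duration out of those WARN lines in splunkd.log. A minimal sketch; the regex is an assumption based on the exact message wording shown above, so adjust it if your version phrases the warning differently:

```python
import re

# Pattern keyed to the TcpOutputProc warning text quoted above (assumed stable wording)
BLOCKED_RE = re.compile(
    r"Forwarding to output group (?P<group>\S+) has been blocked for (?P<secs>\d+) seconds"
)

def parse_blocked(line: str):
    """Extract (output_group, blocked_seconds) from a TcpOutputProc WARN line, or None."""
    m = BLOCKED_RE.search(line)
    if m is None:
        return None
    return m.group("group"), int(m.group("secs"))

line = ("WARN TcpOutputProc - Tcpout Processor: The TCP output processor has paused "
        "the data flow. Forwarding to output group splunkcloud has been blocked for "
        "39750 seconds.")
group, secs = parse_blocked(line)
print(group, secs, round(secs / 3600, 1))  # splunkcloud 39750 11.0
```

At 39,750 seconds the output has been blocked for roughly 11 hours, which points at the receiving side (the splunkcloud indexers) rather than at the forwarder's inputs.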


Path Finder

How do I go about identifying which source is having an issue? Looking at splunkd.log, it isn't obvious.

Thank you in advance,



The timestamp shown in the error I posted is directly from the splunkd.log file.
