Alerting

Why have all my scheduled searches stopped running for no apparent reason?

splunkIT
Splunk Employee

I have a Splunk search head with a bunch of scheduled searches that run every minute or so. However, since 5/15, not one scheduled search has run, and no alert emails have been received.

I looked at scheduler.log, and it seems to confirm this: there have been no new log entries since 5/15. No errors were found either. We enabled debug on the scheduler and restarted splunkd, but the problem persists; scheduler.log still has no new entries since 5/15. At this point, I am not sure where else to look to troubleshoot this issue further.

I enabled DEBUG by manually editing $SPLUNK_HOME/etc/log.cfg, flipping the following parameter from INFO to DEBUG:

Before:
category.SavedSplunker=INFO,scheduler

After:
category.SavedSplunker=DEBUG,scheduler

I restarted splunkd, but there are still no new entries in scheduler.log.
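As a cross-check, scheduler.log is normally indexed into _internal, so a search along these lines (a minimal sketch; adjust the time range as needed) should also come back empty if the scheduler really is producing no events:

index=_internal source=*scheduler.log earliest=-24h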

1 Solution

mzax
Splunk Employee

If the scheduler is not scheduling jobs, it is probably disabled.

The SplunkLightForwarder app disables the scheduler processor.
Make sure that SplunkLightForwarder is disabled so the scheduler processor can be active.

An additional sign that the processor is disabled can be found in splunkd.log:
WARN pipeline - Empty pipeline (no processors): scheduler, exiting pipeline
Find it using the following search:
index=_internal source=*splunkd.log scheduler pipeline
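
If that warning shows up, one way to confirm and correct the cause (a sketch assuming CLI access on the search head; the app can also be disabled from the UI) is:

$SPLUNK_HOME/bin/splunk display app SplunkLightForwarder
$SPLUNK_HOME/bin/splunk disable app SplunkLightForwarder
$SPLUNK_HOME/bin/splunk restart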


splunkIT
Splunk Employee

More evidence that enabling the SplunkLightForwarder app disables scheduler functionality:

cat $SPLUNK_HOME/etc/apps/SplunkLightForwarder/local/app.conf
[install]
state = enabled

....which means the app is enabled, and its default-mode.conf in turn disables the scheduler pipeline:

cat $SPLUNK_HOME/etc/apps/SplunkLightForwarder/default/default-mode.conf
(...)
-- do not start the scheduler if in lwf mode
[pipeline:scheduler]
disabled_processors = LiveSplunks

Disabling this app seems to have done the trick. Thanks, mzax, for the tips.
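
For reference, disabling the app on disk amounts to flipping the same [install] stanza shown above in the app's local app.conf (followed by a splunkd restart), roughly:

cat $SPLUNK_HOME/etc/apps/SplunkLightForwarder/local/app.conf
[install]
state = disabled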


splunkIT
Splunk Employee

+1 for mzax. I took another look at splunkd.log and noticed the following messages:

INFO PipelineComponent - Pipeline typing disabled in default-mode.conf file
INFO PipelineComponent - Launching the pipelines.
WARN pipeline - Empty pipeline (no processors): scheduler, exiting pipeline

I also noticed the following error in splunkd.log:

ERROR TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.
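
That error is just the (still-enabled) forwarder app complaining that no forwarding destination was configured, and it should presumably stop once the app is disabled. If this box were actually meant to forward data, outputs.conf would need a tcpout stanza along these lines (the group name, hostname, and port are placeholders):

[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = indexer.example.com:9997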

Lucas_K
Motivator

Have you tried turning on debug via Splunk Web? Note: it will reset upon splunkd restart.
There are also a couple of other categories that might help (SchedulerLauncherProcessor, SearchScheduler).

Can you manually run the job from the Splunk Web interface? Are there any errors (missing indexes, etc.)?

Also check splunkd.log for any other issues that may affect how the jobs run (bundle replication issues to indexers, etc.).

Local disk space issues? I.e., is there plenty of disk space and are there enough free inodes in the dispatch dir? A quick way to check is shown below.
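
A quick check of the dispatch directory (paths assume a default $SPLUNK_HOME layout):

df -h $SPLUNK_HOME/var/run/splunk/dispatch
df -i $SPLUNK_HOME/var/run/splunk/dispatch
ls $SPLUNK_HOME/var/run/splunk/dispatch | wc -l

The first two show free space and free inodes on the underlying filesystem; the last counts how many dispatch artifacts have accumulated.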
