
How to find queued jobs from audit.log?

arpit_arora
Explorer

Hello, is there a way to find out whether a particular job was queued by looking at the audit logs?
I never see a status of "queued" in the info field of audit.log, so how can I find the search_ids of all the jobs that were queued?


arpit_arora
Explorer

I was able to find a solution.

Look for messages of the following type in splunkd.log:

02-05-2018 12:06:53.070 +0000 WARN DispatchManager - enforceQuotas: username=z-splunk, search_id=1517832412.267986_D9651116-58B0-4F3D-95A8-1560E30C7C62 - QUEUED (reason='The maximum number of concurrent historical searches on this instance has been reached. concurrency_limit=26')

This will give the search IDs of all the search jobs which were queued before they were allowed to run.
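
For example, a search along these lines over the _internal index should pull those search IDs out. This is just a sketch, assuming the default setup where splunkd.log is indexed into _internal with sourcetype=splunkd; the field names queued_sid, queued_user, and queue_reason are arbitrary, and the rex patterns are based on the message format above.

index=_internal sourcetype=splunkd DispatchManager enforceQuotas QUEUED
| rex "search_id=(?<queued_sid>\S+) - QUEUED"
| rex "username=(?<queued_user>[^,\s]+)"
| rex "reason='(?<queue_reason>[^']+)'"
| table _time queued_user queued_sid queue_reason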


somesoni2
Revered Legend

I believe queued jobs do not have an entry in the audit logs. You can use the REST endpoint for search jobs to find the jobs that are currently queued.

| rest /services/search/jobs | where dispatchState="QUEUED" | table sid dispatchState eai:acl.owner normalizedSearch

arpit_arora
Explorer

Yes, I am aware of the REST endpoint, but it only shows the real-time status.

If I want to go back to yesterday's audit.log, or any other logs for that matter, how can I find the jobs that were queued?
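
For a historical view, the splunkd.log approach above works over any time range, because those WARN messages are indexed in _internal. If you also want the audit.log details for the queued jobs, one option is to cross-reference the queued search IDs against the _audit index. A rough sketch, assuming the usual audit fields (action=search, info=completed) and that search_id appears single-quoted in the audit events:

index=_internal sourcetype=splunkd DispatchManager enforceQuotas QUEUED earliest=-1d@d latest=@d
| rex "search_id=(?<queued_sid>\S+) - QUEUED"
| join type=left queued_sid
    [ search index=_audit action=search info=completed earliest=-1d@d latest=now
      | rex "search_id='(?<queued_sid>[^']+)'"
      | fields queued_sid user total_run_time ]
| table _time queued_sid user total_run_time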
