Hello, is there a way to find out whether a particular job was queued by looking at the audit logs?
I never see a status of "queued" in the info field of audit.log, so how can I find the search_ids of all the jobs that were queued?
I was able to find a solution: look for messages of the following type in splunkd.log.
02-05-2018 12:06:53.070 +0000 WARN DispatchManager - enforceQuotas: username=z-splunk, search_id=1517832412.267986_D9651116-58B0-4F3D-95A8-1560E30C7C62 - QUEUED (reason='The maximum number of concurrent historical searches on this instance has been reached. concurrency_limit=26')
This gives the search IDs of all the search jobs that were queued before they were allowed to run.
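For example, a search along these lines over the _internal index (where splunkd.log is indexed by default) should pull out those messages and extract the queued search IDs. This is a minimal sketch: the rex field names (queued_user, queued_sid) are just illustrative, and it assumes the component and log_level fields are auto-extracted from the splunkd sourcetype, as they normally are.

index=_internal sourcetype=splunkd component=DispatchManager log_level=WARN "enforceQuotas" "QUEUED" | rex "search_id=(?<queued_sid>\S+)" | rex "username=(?<queued_user>[^,]+)" | table _time queued_user queued_sid

Because _internal is time-indexed like any other index, you can run this over yesterday's time range (or any earlier range within your retention period) to find historically queued jobs.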
I believe queued jobs do not have an entry in the audit logs. You can use the REST endpoint for search jobs to find jobs that are currently queued.
| rest /services/search/jobs | where dispatchState="QUEUED" | table sid dispatchState eai:acl.owner normalizedSearch
Yes, I am aware of the REST endpoint, but that only shows the real-time status.
If I want to go back to yesterday's audit.log, or any other logs for that matter, how can I find the jobs that were queued?