Is there a delay in the Splunk API server 'seeing' events?

sonamchauhan
Engager

Is there a delay in the Splunk API server 'seeing' events that are already indexed?

I use the Splunk API to query logs for some test cases. I can submit a job to the API server (`POST https://<SERVER>:8089/services/search/jobs`), and that works fine. But intermittently the search job returns no results: `GET https://<SERVER>:8089/services/search/jobs/<JOB_ID>/results` returns an HTTP 204 No Content status and no response body.
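For context, the two calls look roughly like this. This is only a minimal sketch, assuming Python with the `requests` library and basic auth; the host, credentials and search string are placeholders:

```python
# Minimal sketch of the two REST calls described above.
# Assumes Python + requests, basic auth, and a self-signed cert (verify=False).
# <SERVER>, USER, PASSWORD and the search string are placeholders.
import requests

BASE = "https://<SERVER>:8089"
AUTH = ("USER", "PASSWORD")

# 1. Submit the search job; output_mode=json makes the sid easy to parse.
resp = requests.post(
    f"{BASE}/services/search/jobs",
    auth=AUTH,
    verify=False,
    data={"search": "search index=main sourcetype=my_logs", "output_mode": "json"},
)
resp.raise_for_status()
sid = resp.json()["sid"]

# 2. Fetch results. If the job hasn't finished yet, Splunk answers
#    204 No Content with an empty body -- the symptom described above.
results = requests.get(
    f"{BASE}/services/search/jobs/{sid}/results",
    auth=AUTH,
    verify=False,
    params={"output_mode": "json"},
)
print(results.status_code)  # 204 while the job is still running
if results.status_code == 200:
    print(results.json()["results"])
```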

I checked whether there was an indexing delay using the search below. Apparently there was not: the relevant logs were ingested and indexed well in time. It's just the Splunk API server that intermittently returns no results.

<SPLUNK QUERY> | eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S")
Any pointers on how I can dig into this further? I'm just a dev, not a Splunk admin, so guidance on what to do next is much appreciated.

1 Solution

sonamchauhan
Engager

OK, I may have solved my own problem (it was caused by my lack of knowledge of how Splunk API search jobs work).

Basically, I had too short a delay between creating the job (the POST) and fetching its results (the GET). I've increased the delay from 3 to 10 seconds and it seems to be behaving better now.
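For anyone hitting the same thing: rather than a fixed sleep, another option is to poll the job's status endpoint and only fetch results once it reports the job is done. This is a rough sketch, assuming Python with `requests` and reusing the `BASE`, `AUTH` and `sid` placeholders from the snippet above; the `isDone` flag comes from the job entry returned by `GET /services/search/jobs/<sid>`:

```python
# Sketch: poll the search job until Splunk reports it finished, then fetch results.
# Assumes BASE, AUTH and sid from the job-creation snippet above; timings are arbitrary.
import time
import requests

def wait_for_job(base, sid, auth, timeout=60, interval=2):
    """Poll GET /services/search/jobs/<sid> until the job's isDone flag is set."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = requests.get(
            f"{base}/services/search/jobs/{sid}",
            auth=auth,
            verify=False,
            params={"output_mode": "json"},
        )
        status.raise_for_status()
        content = status.json()["entry"][0]["content"]
        # isDone is reported once the job has finished dispatching.
        if content.get("isDone") in (True, "1", 1):
            return True
        time.sleep(interval)
    raise TimeoutError(f"search job {sid} did not finish within {timeout}s")

# Only ask for results once the job is done; this avoids the intermittent 204s.
wait_for_job(BASE, sid, AUTH)
results = requests.get(
    f"{BASE}/services/search/jobs/{sid}/results",
    auth=AUTH,
    verify=False,
    params={"output_mode": "json"},
)
print(results.json()["results"])
```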
