Splunk Search

Randomly inconsistent search results

Salim_Uddin
Engager

Hi,

I am executing a search on Splunk through my java application. The search query is executed through the following steps -

import com.splunk.Args;
import com.splunk.Job;
import com.splunk.JobCollection;

Args jobArgs = new Args();
jobArgs.put("exec_mode", "blocking");    // create() returns only once the search has finished
jobArgs.put("earliest_time", startTime);
jobArgs.put("latest_time", endTime);

JobCollection jobs = service.getJobs();
Job job = jobs.create(searchQuery, jobArgs);

where searchQuery is "index= * ind.* | search ( DeviceId = ABC* ) | stats values(DeviceId)"
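
For reference, a minimal sketch of how the rows of such a blocking job are typically read back with the Splunk Java SDK. The printResults helper is hypothetical and not from my application code; "values(DeviceId)" is the field name produced by the stats command in the query above.

import com.splunk.Job;
import com.splunk.ResultsReaderXml;
import java.io.InputStream;
import java.util.HashMap;

// Hypothetical helper: iterate over the rows of a finished blocking job.
static void printResults(Job job) throws Exception {
    InputStream resultsStream = job.getResults();
    ResultsReaderXml resultsReader = new ResultsReaderXml(resultsStream);
    HashMap<String, String> event;
    while ((event = resultsReader.getNextEvent()) != null) {
        // "values(DeviceId)" is the field produced by the stats command in the query
        System.out.println(event.get("values(DeviceId)"));
    }
    resultsReader.close();
}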

The time interval is configured as 10-second intervals.

I have seen that at random times, the above piece of code misses a record in the result set. When the same query is executed on the Splunk server with the same time interval configured in the filter, all the records are returned (including the missing one).

Please share some suggestions on why that may be happening.

Thanks and regards


Salim_Uddin
Engager

Hi,
Thanks for the response.

The timings are -

Start time       2013-10-09T17:50:59.914 
End time         2013-10-09T17:52:37.196
Time of record   2013-10-09T17:52:36.000

It seems there is only a 1 second and 196 millisecond gap between the record's timestamp and the end of the search window.

Approximately how much latency should we expect?

Thanks and regards


dwaddle
SplunkTrust

How close to "now" are your values for earliest_time and latest_time? There is always some latency between an event being produced at its source and it becoming searchable in the index. If you are searching very close to the current time, you may not be allowing enough time for all events to be indexed.
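
One way to test this (a sketch of a possible experiment, not something from the thread) is to keep the 10-second window but push it an assumed 60 seconds behind the current time using Splunk's relative time modifiers, then check whether the missing records reappear. The 60-second figure is an arbitrary assumption; replace it with the latency you actually measure.

// Sketch only: a 10-second window held an assumed 60 seconds behind "now",
// leaving room for indexing latency.
Args jobArgs = new Args();
jobArgs.put("exec_mode", "blocking");
jobArgs.put("earliest_time", "-70s@s");   // assumed: window start, 70 s ago
jobArgs.put("latest_time", "-60s@s");     // assumed: window end, 60 s ago
Job job = service.getJobs().create(searchQuery, jobArgs);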


lukejadamec
Super Champion

Index latency varies from enterprise to enterprise, so you should measure it for this data stream in your own environment to get useful numbers. Here is a post about collecting index-time information:
http://answers.splunk.com/answers/42646/showing-indexed-time
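
As a rough illustration of the idea (not the exact search from the linked post), indexing lag can be measured by comparing each event's _indextime with its _time, run through the same Java SDK pattern as above. The DeviceId=ABC* filter is borrowed from the question; the index filter and the 15-minute lookback are placeholders to adjust for your data.

// Sketch: surface indexing lag for this data stream.
String latencyQuery =
      "search index=* DeviceId=ABC* "
    + "| eval lag=_indextime - _time "
    + "| stats max(lag) AS max_lag_seconds avg(lag) AS avg_lag_seconds";

Args latencyArgs = new Args();
latencyArgs.put("exec_mode", "blocking");
latencyArgs.put("earliest_time", "-15m");   // assumed lookback window
latencyArgs.put("latest_time", "now");
Job latencyJob = service.getJobs().create(latencyQuery, latencyArgs);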
