Splunk Search

Limited results when running searches via REST API

karan1337
Path Finder

Hi,

I am trying to run a search and get the results back via the REST API using Python. This is how I fetch the results:

# query: searchquery = 'search index=indexname sourcetype=sourcetypename earliest=-5h'

do....the....query

services_search_results_str = '/services/search/jobs/%s/results?output_mode=raw&count=0' % sid
searchresults = myhttp.request(base_url + services_search_results_str, 'GET')[1]

where base_url points to our Splunk instance on a server. However, irrespective of the earliest time I specify, I always end up getting only the first 50,000 results, even though I have more than 600,000 events per hour and roughly five times that in 5 hours. Am I missing something? Is there a way to get the complete set of events?
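For context, the elided steps between submitting the job and fetching the results can be sketched as follows. Here `myhttp` and `base_url` come from the snippet above, the job-creation and polling calls are only outlined in comments, and `build_results_url` is an illustrative helper, not part of any Splunk SDK:

```python
# Illustrative helper: build the results URL for a given search job ID (sid).
# count=0 asks for all rows, but the server still caps the response at
# limits.conf's maxresultrows (50,000 by default).
def build_results_url(base_url, sid, count=0, offset=0):
    return '%s/services/search/jobs/%s/results?output_mode=raw&count=%d&offset=%d' % (
        base_url, sid, count, offset)

# Against a live instance (hypothetical; mirrors the httplib2-style client above):
# resp, body = myhttp.request(base_url + '/services/search/jobs', 'POST',
#                             body='search=' + urllib.parse.quote(searchquery))
# sid = ...parse <sid> from body...
# ...poll /services/search/jobs/<sid> until dispatchState is DONE...
# searchresults = myhttp.request(build_results_url(base_url, sid), 'GET')[1]
```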


tom_frotscher
Builder

Hi,

this is from limits.conf:

[restapi]
maxresultrows = <integer>
* Maximum result rows to be returned by /events or /results getters from REST API.
* Defaults to 50000.

As you can see, there is a limit configured.

You have two options now:

1) Raise the limit to a value that suits you.
2) I think the better option is to repeat your call with a different offset each time; this way you split your request into pages. Take a look at the answer in this post:

http://answers.splunk.com/answers/25411/upper-limit-for-rest-api-limits-conf-maxresultrows.html
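For option 1, the change would go in a local limits.conf on the search head; the value below is purely illustrative, and raising it increases memory use for large result sets:

```
# $SPLUNK_HOME/etc/system/local/limits.conf
[restapi]
maxresultrows = 200000
```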

Greetings,

Tom


fdi01
Motivator

To limit the results you display, pipe to the head command, like: ... | head <integer>
ex:

| rest /services/search/jobs count=0 splunk_server=local | head 1

To get more results, set count (the maximum number of entries to return; set it to zero to get all available entries) and offset (which defaults to 0).
ex:

| rest /services/search/jobs/<sid>/results count=0 offset=0

or, in the REST call itself:

services_search_results_str = '/services/search/jobs/%s/results?output_mode=raw&count=0&offset=0' % sid
searchresults = myhttp.request(base_url + services_search_results_str, 'GET')


karan1337
Path Finder

Hi Tom,

Thanks for the reply. So if I choose to repeat my calls, is there a way to check the total number of records before processing them? That way I would know exactly how many splits to make and could set the offsets accordingly.

A follow-up question: if I provide an offset that is larger than the total number of records, what does the REST API return?


tom_frotscher
Builder

Hi,

from my point of view, I would just do the calls with a static page size, for example 10,000. After every call, I would check the number of events returned for that page. As soon as there are fewer than 10,000 results, I do not need to make an additional call. That way, you do not have to calculate the number of splits up front.
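The loop described above can be sketched like this. `fetch_page` is a stand-in for the actual REST call (an assumption, not a real Splunk helper), so only the paging logic is shown:

```python
PAGE_SIZE = 10000  # events requested per call; pick a value below maxresultrows

def fetch_all(fetch_page, page_size=PAGE_SIZE):
    """Repeatedly call fetch_page(offset, count) until a short page signals the end."""
    events = []
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        events.extend(page)
        if len(page) < page_size:  # fewer results than requested: we are done
            return events
        offset += page_size
```

In the poster's setup, fetch_page would issue a GET to /services/search/jobs/&lt;sid&gt;/results with the given offset and count, and parse the response into a list of events.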

For your last question: I would just test it, since I do not know how the API responds in this case 😛

Greetings

Tom


karan1337
Path Finder

Hi Tom,

I am facing another, similar issue. Can you please look at: http://answers.splunk.com/answers/252047/rest-api-search-and-gui-search-are-inconsistent.html

Thanks.


karan1337
Path Finder

Hi Tom,

Thanks. In my case, whenever the offset is larger than the total number of records, I get back a string of length 0.
