
[subsearch]: Subsearch produced 1313901 results, truncating to maxout 50000

pgadhari
Builder

I am getting a subsearch error while using the join command in my search. I have to use join to combine two sources and show the result. I have tried setting the following parameters in limits.conf on my search heads and indexers, but I am still getting this error for the query:

index=yammer sourcetype=yammer_messages replied_to_id=null
| eval _time=strftime(_time,"%d-%m-%Y")
| join sender_id
    [search index=yammer sourcetype=yammer_references type=user
    | rename id as sender_id
    | table sender_id full_name email]
| rename _time as date
| table date, email, id
| dedup id
| eval point = 10

[subsearch]: Subsearch produced 1313901 results, truncating to maxout 50000.

I have configured the settings below in limits.conf on the indexers and the search head and restarted Splunk, but the issue is still not resolved. Please help me resolve it:

[subsearch]
# Maximum number of results to return from a subsearch.
maxout = 3000000

[join]
# Maximum number of result rows the join subsearch may return.
subsearch_maxout = 3000000
# Maximum time (seconds) the join subsearch may run before being finalized.
subsearch_maxtime = 60
# Maximum time (seconds) to wait for the subsearch to fully finish.
subsearch_timeout = 120
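
To double-check that these values are actually being applied, the effective settings can be listed with btool on the search head (assuming CLI access):

splunk btool limits list subsearch --debug
splunk btool limits list join --debug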

woodcock
Esteemed Legend

Skip the subsearch and the join entirely; try this:

index="yammer" AND ((sourcetype="yammer_messages" AND replied_to_id="null") OR (sourcetype="yammer_references" type="user
"))
| eval sender_id = if(sourcetype="yammer_references", id, sender_id)
| fields _time email id sender_id
| stats min(_time) AS date values(*) AS * BY sender_id
| fieldformat date=strftime(date,"%d-%m-%Y")
| table date, email, id
| dedup id
| eval point = 10

pgadhari
Builder

Sure, I will try it out. But I am facing a weird issue with the search: sometimes events seem to get skipped.

When I run the search the first time, the event count shows correctly, e.g. 5, but when I run the same search again the count shows 3, with the same time range in both runs. Have you faced this issue before?


pgadhari
Builder

Sure, I will try this search as well. Thanks.


codebuilder
Influencer

Are you running this in real time, or as a saved search?


pgadhari
Builder

This is a saved search.


to4kawa
Ultra Champion
index=yammer (sourcetype=yammer_messages replied_to_id=null) OR (sourcetype=yammer_references type=user)
| eval id=coalesce(id,sender_id)
| eval date=strftime(_time,"%d-%m-%Y") 
| stats values(date) as date values(email) as email by id 
| eval point = 10

Hi, @pgadhari
There is no need for join or a subsearch here; searching both sourcetypes together and correlating them with stats avoids the subsearch result limit entirely.
By the way, is full_name required?


pgadhari
Builder

Sure, I will try this query and revert. Yes, full_name is required; it needs to be displayed on the dashboard.


to4kawa
Ultra Champion
| stats values(date) as date values(email) as email by id
⇨
| stats values(date) as date values(email) as email values(full_name) as full_name by id
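
For clarity, the complete search with that change applied would be:

index=yammer (sourcetype=yammer_messages replied_to_id=null) OR (sourcetype=yammer_references type=user)
| eval id=coalesce(id,sender_id)
| eval date=strftime(_time,"%d-%m-%Y")
| stats values(date) as date values(email) as email values(full_name) as full_name by id
| eval point = 10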
