Monitoring Splunk

How to solve "Events might not be returned in sub-second order due to search memory limits" without increasing limit

LearningGuy
Motivator

Hello,

How can I solve the "Events might not be returned in sub-second order due to search memory limits" warning without increasing the value of the following limits.conf setting: [search] max_rawsize_perchunk?

I got this message after scheduling a query that moves more than 150k rows into a summary index.

I appreciate your help. Thank you


marnall
Motivator

Can you split your query into a set of smaller queries that index those rows into a summary index?

LearningGuy
Motivator


How do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time?

| dbxquery query="SELECT * FROM Table_Test"



The scheduled report for the summary index will add something like this:
summaryindex  spool=t  uselb=t  addtime=t  index="summary" file="test_file" name="test" marker="hostname=\"https://testcom/\",report=\"test\""

Technically, I don't really need _time because it is static data, but it needs to be updated every day.
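If a stable daily timestamp matters more than event order, the scheduled search can set _time explicitly before the results are written. A minimal sketch, assuming the `collect` command is used in place of the auto-appended `summaryindex` line, and that snapping every run to midnight is acceptable (the index name and marker are taken from the report above):

```
| dbxquery query="SELECT * FROM Table_Test"
| eval _time=relative_time(now(), "@d")
| collect index="summary" marker="report=\"test\""
```

Because every daily run then carries the same midnight timestamp, the sub-second ordering the warning refers to becomes irrelevant to the summary data.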

Thanks


bowesmana
SplunkTrust

Do you actually care what order the data is returned in? You are simply adding it to the summary index. The _time written to the summary will be whatever you want it to be, so just ignore the message; I don't believe it will affect the data in the summary.

LearningGuy
Motivator

Hi,

Thanks for your help.
No, I do not care about the order. I am afraid that if I split the data and re-combine it, there will be duplicate or missing rows, because the data doesn't have a unique identifier.

Also, I don't know how to split the data while keeping the same _time. Please help answer this: how do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time? Thanks


bowesmana
SplunkTrust

Sorry, I don't understand what you mean by splitting your data - what is being split with the dbxquery?

LearningGuy
Motivator

Hi,

What I meant by splitting the data is splitting the number of rows: if my query returns 200k rows, splitting it in two gives 100k rows each. Or, as @marnall suggested, splitting it into smaller queries. I'm not sure that's possible, since I have a large query involving multiple DBs.

I don't know how to do this in a scheduled report and write everything into the same summary index with the same _time setting. Please suggest.
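One way to split the rows is to page at the SQL layer: run two scheduled searches, each pulling a fixed slice of the table with a deterministic ORDER BY, and snap _time in both to the same daily boundary so the chunks land in the summary index with an identical timestamp. A sketch, assuming a hypothetical unique `id` column and a database that supports OFFSET/FETCH (paging syntax varies by vendor):

```
| dbxquery query="SELECT * FROM Table_Test ORDER BY id OFFSET 0 ROWS FETCH NEXT 100000 ROWS ONLY"
| eval _time=relative_time(now(), "@d")
| collect index="summary" marker="chunk=\"1\""
```

The second search would use `OFFSET 100000 ROWS` and `chunk=\"2\"`. Note that without a unique sort key the database gives no ordering guarantee between pages, which is exactly how duplicate or missing rows can creep in, so this only works if such a column exists.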

Thanks


bowesmana
SplunkTrust

I don't see why it needs to be split - the events not coming back in sub-second order does not matter to you, so why not just add the 200k rows in one go? Is that causing a problem?

LearningGuy
Motivator

Hi
So far, I haven't seen any missing data, so it's not causing a problem apart from the warning message. I am not sure why Splunk has to throw it.

Thank you for your help.
