Monitoring Splunk

How to solve "Events might not be returned in sub-second order due to search memory limits" without increasing the limit

LearningGuy
Builder

Hello,

How can I solve "Events might not be returned in sub-second order due to search memory limits" without increasing the value of the limits.conf setting [search] max_rawsize_perchunk?

I got this message after scheduling a query that moves more than 150k rows into a summary index.

I appreciate your help. Thank you


marnall
Builder

Can you split your query into a set of smaller queries that index those rows into a summary index?

LearningGuy
Builder


How do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time?

| dbxquery query="SELECT * FROM Table_Test"



The scheduled report for the summary index will append something like this:
summaryindex  spool=t  uselb=t  addtime=t  index="summary" file="test_file" name="test" marker="hostname=\"https://testcom/\",report=\"test\""
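
For reference, a hand-written equivalent of what that scheduled search does is the documented `collect` command (a sketch; the index, file, and marker values are taken from the line above, and the auto-appended `summaryindex` variant may differ slightly from `collect`):

```
| dbxquery query="SELECT * FROM Table_Test"
| collect index=summary file="test_file" marker="report=\"test\""
```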

Technically, I don't really need _time because it is static data, but it needs to be updated every day.

Thanks


bowesmana
SplunkTrust

Do you actually care what order the data is returned in? You are simply adding it to the summary index. The _time written to the summary will be whatever you want it to be, so just ignore the message; I don't believe it will affect the data in the summary.
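
That suggestion can be sketched as stamping every row with an explicit _time before collecting, so the sub-second ordering of the raw results becomes irrelevant (a sketch; stamping all rows with midnight of the current day is an assumption about what timestamp you want):

```
| dbxquery query="SELECT * FROM Table_Test"
| eval _time=relative_time(now(), "@d")
| collect index=summary
```

Because every row gets the same _time, the order in which dbxquery returns them has no effect on what lands in the summary index.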

LearningGuy
Builder

Hi,

Thanks for your help.
No, I do not care about the order. I am afraid that if I split the data and re-combine it, there will be duplicate or missing rows, as the data doesn't have a unique identifier.

Also, I don't know how to split the data while keeping the same _time. How do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time? Please help answer this. Thanks


bowesmana
SplunkTrust

Sorry, I don't understand what you mean about splitting your data - what is being split with the dbxquery?

LearningGuy
Builder

Hi,

What I meant by splitting the data is splitting the number of rows: if my query returns 200k rows, splitting it in two gives 100k rows each. @marnall also suggested splitting it into smaller queries, but I'm not sure that's possible, since I have a large query involving multiple databases.

I don't know how to do this in scheduled reports and write the results into the same summary index with the same _time. Please suggest an approach.
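
If splitting does turn out to be necessary, one approach is to page through the table in the SQL itself rather than in SPL (a sketch, assuming the database supports OFFSET/FETCH pagination and that `id` is a stable column to order by; both are assumptions about your schema):

```
| dbxquery query="SELECT * FROM Table_Test ORDER BY id OFFSET 0 ROWS FETCH NEXT 100000 ROWS ONLY"
| eval _time=relative_time(now(), "@d")
| collect index=summary
```

A second scheduled search would use OFFSET 100000. Giving both searches the same relative_time() expression keeps the _time values identical across the chunks, and a deterministic ORDER BY avoids duplicate or missing rows between the pages.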

Thanks


bowesmana
SplunkTrust

I don't see why it needs to be split. The events not coming back in sub-second order does not matter to you, so why not just add the 200k rows in one go? Is that causing a problem?

LearningGuy
Builder

Hi,
So far I haven't seen any missing data, so it's not causing a problem, apart from the message itself.
I am not sure why Splunk has to throw the warning.

Thank you for your help.
