Monitoring Splunk

How to solve "Events might not be returned in sub-second order due to search memory limits" without increasing the limit

LearningGuy
Builder

Hello,

How can I solve "Events might not be returned in sub-second order due to search memory limits" without increasing the value of the limits.conf setting [search] max_rawsize_perchunk?

I got this message after I scheduled a query that moves more than 150k rows into a summary index.

I appreciate your help. Thank you
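For reference, the setting the message points at lives in limits.conf. A sketch of the stanza (the default shown is what recent limits.conf.spec files document; confirm against your version):

[search]
# Maximum size, in bytes, of a chunk of raw events a search processes
# at once; per the spec, the sub-second-order message can appear when
# a search exceeds this per-chunk limit.
max_rawsize_perchunk = 100000000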


marnall
Builder

Can you split your query into a set of smaller queries that index those rows into a summary index?
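A minimal sketch of that idea, assuming the table has a numeric key column to slice on (the id column and the ranges are hypothetical): schedule two reports, each collecting one slice into the same index.

| dbxquery query="SELECT * FROM Table_Test WHERE id BETWEEN 1 AND 100000"
| collect index=summary file="test_file"

| dbxquery query="SELECT * FROM Table_Test WHERE id BETWEEN 100001 AND 200000"
| collect index=summary file="test_file"

Each slice is then less likely to trip the per-chunk memory limit that triggers the message.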

LearningGuy
Builder


How do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time?

| dbxquery query="SELECT * FROM Table_Test"



The scheduled report for the summary index will add something like this:
summaryindex spool=t uselb=t addtime=t index="summary" file="test_file" name="test" marker="hostname=\"https://testcom/\",report=\"test\""

Technically, I don't really need _time because it is static data, but it needs to get updated every day.

Thanks
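Since the data is static and only refreshed daily, one option (a sketch, not necessarily the only approach) is to stamp the rows yourself and use an explicit collect instead of the auto-generated summaryindex command:

| dbxquery query="SELECT * FROM Table_Test"
| eval _time=now()
| collect index=summary file="test_file" marker="report=\"test\""

Every row from a given run then shares that run's timestamp. Note that collect's addtime option also affects which timestamp is written, so check the collect documentation for the exact precedence in your version.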


bowesmana
SplunkTrust

Do you actually care what order the data is returned in? You are simply adding it to the summary index. The _time written to the summary will be whatever you want it to be, so just ignore the message; I don't believe it will affect the data in the summary.

LearningGuy
Builder

Hi,

Thanks for your help.
No, I do not care about the order. I am afraid that if I split the data and re-combine it, I will end up with duplicate or missing data, as it doesn't have a unique identifier.

Also, I don't know how to split the data while keeping the same _time. How do I split my query from dbxquery (e.g. 200k rows) and push it into a summary index at the same time? Please help answer this. Thanks.
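On the duplicate/missing worry: if the rows have no natural key, one sketch is to synthesize one by hashing each row's fields with the sha256 eval function and deduplicating on it (host_name, ip_address, and port are hypothetical column names; use whichever columns define a row for you):

| dbxquery query="SELECT * FROM Table_Test"
| eval row_key=sha256(host_name . "|" . ip_address . "|" . port)
| dedup row_key
| collect index=summary file="test_file"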


bowesmana
SplunkTrust

Sorry, I don't understand what you are talking about regarding splitting your data. What is being split with the dbxquery?

LearningGuy
Builder

Hi,

What I meant by splitting the data is splitting the number of rows. So, if my query returns 200k rows, splitting it into 2 gives 100k rows each. Or, as @marnall suggested, splitting it into smaller queries; I am not sure that is possible, since I have a large query involving multiple DBs.

I don't know how to do this in a scheduled report and write both halves into the same summary index with the same _time. Please suggest.

Thanks
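If there is no key column to slice on, SQL pagination is one sketch (OFFSET/FETCH syntax varies by database, and some_col and the offsets are assumptions): schedule two reports, each taking half the rows, and pin both halves to the same _time, here midnight of the run day, so they line up in the summary.

| dbxquery query="SELECT * FROM Table_Test ORDER BY some_col OFFSET 0 ROWS FETCH NEXT 100000 ROWS ONLY"
| eval _time=relative_time(now(), "@d")
| collect index=summary file="test_file"

The second report is identical except for OFFSET 100000 ROWS. Because both runs evaluate relative_time(now(), "@d") on the same day, both halves carry the same timestamp.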


bowesmana
SplunkTrust

I don't see why it needs to be split. The events not coming back in sub-second order does not matter to you, so why not just add the 200k rows in one go? Is that causing a problem?

LearningGuy
Builder

Hi
So far, I haven't seen any missing data, so it's not causing a problem, apart from the message itself. I am just not sure why Splunk has to throw the message.

Thank you for your help.
