Hello Splunk enjoyers!
I loaded some data (10,000,000 events) with the fields updated_time, info, user, and description into my new index "data_tmp".
When I search, I get this error: Error in 'IndexScopedSearch': The search failed. More than 1000000 events found at time 1677582000.
My search: I tried to extract the timestamp from updated_time like this:
index=data_tmp
| eval _time = strftime(updated_time, "%Y-%m-%d %H:%M:%S.%3N")
| convert ctime(_time)
| fieldformat _time = strftime(updated_time, "%Y-%m-%d %H:%M:%S.%3N")
but nothing works.
Can somebody help me with that?
Thank you!
Did you fix it?
This is a fundamental problem with how the data was ingested into Splunk.
Splunk returns results in reverse chronological order, so it needs to be able to sort them based on the _original_ value of the _time field (the _time field can be rewritten later in the search pipeline, but that won't affect the result order). If you have several hundred thousand events indexed at the same point in time, Splunk cannot sort them due to memory constraints.
It's not a problem with the search as such; it's a problem with the data - fix your data onboarding.
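For example - a minimal sketch, assuming your raw events contain the updated_time value in a form like updated_time=2023-02-28 12:00:00.000 and that your sourcetype is called my_sourcetype (both are assumptions, adjust them to your data) - you could tell Splunk where the timestamp lives at index time in props.conf:

[my_sourcetype]
# Assumption: raw events contain e.g. updated_time=2023-02-28 12:00:00.000
TIME_PREFIX = updated_time=
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 30
TZ = UTC

Then re-ingest the data so each event gets its own _time. Search-time eval/fieldformat tricks won't help here, because the sort that hits the memory limit happens on the indexed _time value.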