New dataset: a 70 MB log file. The log's timestamp was based on the number of seconds the device had been operating, not a traditional month/day/year format. When I ingested the file into Splunk, the events were assigned the default timestamp of the ingest time, since there was no discernible timestamp. I want to round the device time to the nearest tenth of a second, since finer granularity is not needed at this point:

index=main source="03182019.csv"
| eval appTime=round(time, 1)
| stats count by appTime
| sort appTime
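For reference, round(X, Y) in SPL rounds X to Y decimal places, which is how the search above gets tenths of a second. The behavior can be sanity-checked without touching the index using the makeresults generating command (the 12345.678 value below is just an illustrative device-seconds reading, not from my data):

| makeresults
| eval time=12345.678
| eval appTime=round(time, 1)

This returns appTime=12345.7, i.e. the device time rounded to the nearest tenth of a second.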
When I run this search, I receive this error: Error in 'IndexScopedSearch': The search failed. More than 1000000 events found at time 1553486400.
Understandably, this is a lot of events, but is there a way to increase the limit so searches like this can run? Currently the search returns only chunks of the data, and large amounts of it are missing.