I need to know which of these methods is better for this scenario:
I have a large event log that indexes about 2.5 million events every day. The log is raw text that requires a complex regular expression to extract the fields and values. I have about 10 dashboards feeding from this log; one of them is a report view where my team and I search events with multiple filters that are chosen dynamically from tokens.
These reports take too much time when the time range is seven days or more; it's very hard to generate a report of the top 10 events or the distribution of errors.
The problem is that the selected time range is very unpredictable: one day we need a report for today, the next we need one for three months ago or for a specific day. I need a method to optimize these reports and reduce the duration of the jobs.
I have tried making the whole dashboard run a single base search and then post-processing the results in each panel, but this didn't reduce the duration.
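For context, this is the base-search pattern I tried. As far as I know, post-processing only helps when the base search is a transforming search (one that ends in a command like stats), so each panel reuses a small aggregated result instead of re-reading raw events; a raw-event base search is also capped in how many events it passes to panels. A minimal sketch, where the index name, field names, and the regex are placeholders for my real ones:

```spl
index=my_raw_logs
| rex field=_raw "level=(?<level>\w+)\s+event=(?<event_name>\w+)"
| stats count by event_name, level
```

And a panel post-process search on top of that base:

```spl
| stats sum(count) AS total BY event_name
| sort - total
| head 10
```

Even structured like this, the base search still has to scan and rex all the raw events for the selected time range, which is where the time goes.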
So, what do you recommend: a saved search, a summary index, or a data model?
Keep in mind that the selected time range is highly variable.
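In case it helps clarify what I'm comparing: my understanding of the summary-index option is a scheduled search that pre-aggregates each hour into a summary index, something like this sketch (the index names, source name, field names, and regex are hypothetical, not my actual config):

```spl
index=my_raw_logs earliest=-1h@h latest=@h
| rex field=_raw "level=(?<level>\w+)\s+event=(?<event_name>\w+)"
| stats count BY event_name, level
| collect index=my_summary source="hourly_event_counts"
```

Then the report searches would run over the small summary instead of the raw log, for whatever time range is selected:

```spl
index=my_summary source="hourly_event_counts"
| stats sum(count) AS total BY event_name
| sort - total
| head 10
```

Is this the right direction given that the filters are chosen dynamically from tokens, or would an accelerated data model handle arbitrary time ranges better?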