Hi, I'm working with a large amount of data. I wrote a main report that extracts all events (let's call them events A, B, C, D) from the last 30 days and does some field manipulations. I then wrote 5 reports that filter the main saved report by event type and keep only the fields relevant to each event: for example, the report for event A contains all fields relevant to event A, the report for event B contains all fields relevant to event B, and so on. My dashboard has 5 tabs, one per event (tab 1 for report A, tab 2 for report B, ...), and each tab triggers the relevant saved search (report A/B/C, ...). A rough sketch of my current searches, and of what I imagine a summary-index version might look like, is at the end of this post.

Problem: all the reports run very slowly.

My questions:

1. How can I read only the delta data each time? In other words, how do I avoid re-reading the full 30 days on every run? If the query already ran today and I execute it again, it should read only the new data and reuse the historical data that was already read in the previous run.

2. I read a bit about summary indexes. My reports extract all fields and do not aggregate data. How do I build my 6 reports (the main one plus the 5 others) with a summary index? As I said, I use the table command and not functions like top, count, etc. - my reports just extract the relevant fields with some renaming.

* If you would recommend using a summary index, I would appreciate example code, because I have 6 reports and I'm not sure how to work with a summary index.

thanks,
Maayan
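For reference, here is roughly what my current searches look like. The index, sourcetype, and field names (my_index, my_sourcetype, event_type, user_name, src_ip, field1, "my_main_report", ...) are simplified placeholders, not my real ones.

Main report, over the last 30 days:

index=my_index sourcetype=my_sourcetype earliest=-30d (event_type=A OR event_type=B OR event_type=C OR event_type=D)
| eval user=coalesce(user_name, user_id)
| rename src_ip AS source_ip
| table _time event_type user source_ip field1 field2 field3

Per-event report (report A, for example), which filters the main saved report and keeps only the fields relevant to event A:

| savedsearch "my_main_report"
| search event_type=A
| table _time user source_ip field1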
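And this is what I imagine the summary-index version might look like, based on what I read. The schedule, the index name my_summary, and the use of collect are just my guesses, so please correct me if this is not the right approach for non-aggregated data.

A scheduled search that runs every hour over only the last hour (so each run reads only the delta) and writes the extracted fields to a summary index:

index=my_index sourcetype=my_sourcetype earliest=-1h@h latest=@h (event_type=A OR event_type=B OR event_type=C OR event_type=D)
| eval user=coalesce(user_name, user_id)
| rename src_ip AS source_ip
| table _time event_type user source_ip field1 field2 field3
| collect index=my_summary

Then each per-event report would search the summary index over 30 days instead of the raw data, for example report A:

index=my_summary earliest=-30d event_type=A
| table _time user source_ip field1

Is that roughly how it should work with 6 reports, or am I misunderstanding how the summary index is meant to be used?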