So I have been working on migrating use cases from an on-prem Splunk ES to Splunk Cloud for a client. They had around 760+ correlation searches created for similar use cases and field extractions, so I created a lookup and used a lookup definition to consolidate them into a single correlation search.
The client is pointing out that this might affect notable creation when there is a knowledge bundle replication failure. They also highlighted that it could lead to skipped searches.
I have built similar cases in other environments with much larger lookups searching over much larger data volumes. My present lookup has 764 entries, and the correlation search runs on a 15-minute cron, looking back over the last 1 hour of data.
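For context, the consolidated search looks roughly like the sketch below. This is a minimal illustration, not my exact search: the lookup definition name (`detection_rules`) and its fields (`match_value`, `rule_name`, `severity`) are hypothetical placeholders, as is the index.

```spl
index=main earliest=-60m
| lookup detection_rules match_value AS field_of_interest OUTPUT rule_name, severity
| where isnotnull(rule_name)
| stats count BY rule_name, severity, host
```

So instead of 760 scheduled searches each hard-coding one pattern, one search enriches events against the lookup and fires per matched `rule_name`.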
So I would like the experts' help to answer:
1. Is running 760 separate correlation searches more efficient, or is 1 search driven by the lookup more efficient, and why?
2. In case of a knowledge bundle replication failure, will the lookup-based search be affected and miss alerting?
3. Will it cause skipped searches if a run overshoots its schedule?
P.S. - Since the search looks back over the last 1 hour of data every 15 minutes, a missed run should anyway be covered by the next trigger.
Please help on this matter.