I have a large data set (about 20 million records) dating back to 2015, and it keeps growing. In my case I am supposed to use a lookup, and I found that a KV Store is the best fit, since the records in the index get updated while the _key (ORDER_KEY) stays constant, so my lookup will be updated as well. With this huge, growing data set, will I run into performance issues?
I thought of splitting it into multiple KV Store lookups broken down by month, based on the ORDER_KEY creation time: events from Nov 2015 would go to kvlookup_nov2015, events from Dec 2015 to kvlookup_dec2015, and so on. All the collections and the transforms.conf entries for the lookup definitions would be created in advance. However, I am not able to select the right collection at run time in the search with |outputlookup.
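For reference, this is roughly how I pre-created the monthly collections and lookup definitions (a sketch; the field names besides ORDER_KEY are placeholders):

```ini
# collections.conf -- one collection per month
[kvlookup_nov2015]
[kvlookup_dec2015]

# transforms.conf -- one lookup definition per collection
[kvlookup_nov2015]
external_type = kvstore
collection    = kvlookup_nov2015
fields_list   = _key, ORDER_KEY, status

[kvlookup_dec2015]
external_type = kvstore
collection    = kvlookup_dec2015
fields_list   = _key, ORDER_KEY, status
```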
I tried the macro approach with an eval-based macro definition: |outputlookup `filename(ORDER_KEY)`.
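This is roughly what I put in macros.conf (a sketch; the timestamp format of ORDER_KEY here is just an example). My understanding is that this fails because macros are expanded when the search is parsed, before any events are processed, so $orderkey$ is substituted with the literal text typed into the search rather than each event's field value:

```ini
# macros.conf -- eval-based macro building the collection name
[filename(1)]
args       = orderkey
definition = "kvlookup_" . lower(strftime(strptime($orderkey$, "%d%m%Y"), "%b%Y"))
iseval     = 1
```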