I'm adding ~2k rows to a KVStore table with 14 fields and ~2 million rows. The outputlookup command takes nearly 2 hours.
The query is like this:
index=myindex earliest=-1d
| fields id,a,b,c,d,e,f,g,h,i,j,k,l,m
| inputlookup append=true kvtable | dedup id
| outputlookup kvtable
The initial part of the query typically returns a couple thousand rows.
From the job inspector:
duration (s)  component             invocations  input count  output count
0.00          command.addinfo       6            2,138        2,138
2.50          command.dedup         49           2,216,938    2,214,898
0.00          command.fields        10           4,276        4,276
17.30         command.inputlookup   1            150,538      2,450,000
6,338.78      command.outputlookup  1            2,414,800    2,414,800
Is this normal? If not, can you suggest some troubleshooting steps?
If you're just adding to the table, have you considered setting an appropriate _key value and using | outputlookup append=true? Then you wouldn't need to overwrite the entire collection on every update.
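As a sketch, the search could look something like the following. This assumes id is unique per record and that kvtable is a lookup definition backed by the KV Store collection; with _key set, append=true updates any existing record whose _key matches instead of inserting a duplicate, so the dedup and the full inputlookup/rewrite cycle go away:

```
index=myindex earliest=-1d
| fields id,a,b,c,d,e,f,g,h,i,j,k,l,m
| eval _key=id
| outputlookup append=true kvtable
```

That way only the ~2k new/changed rows are written, rather than all ~2.4 million.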
Additionally, what does your collections.conf look like, especially accelerated fields for this collection?
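For reference, a minimal collections.conf stanza with an accelerated field on the key column might look like this (field names here are illustrative, matching the id field from your search; adjust types to your actual data):

```
[kvtable]
field.id = string
accelerated_fields.id_accel = {"id": 1}
```

An accelerated field on id lets the KV Store build an index on that field, which matters for lookup performance once the collection is in the millions of rows.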
Thanks for your help. I was finally able to implement this and it's a big improvement.