Let's say I'm building an app, and I have data which needs to be loaded into the KV store once the app is installed. I also want to be careful about anything happening automatically, because a reinstall or upgrade will clobber any data already in there.
Question #1: How can I automate the initial load as much as possible?
Question #2: How can I prevent overwriting existing data in an upgrade scenario?
What are some good techniques?
Extending @halr9000's answer:
You can run a lookup against the collection itself and exclude the results that are already in the lookup_name lookup:
| inputlookup filename.csv | lookup lookup_name identity_column OUTPUT identity_column as found_id | search NOT found_id=* | fields - found_id | outputlookup append=true lookup_name
The | search NOT found_id=* keeps only the records that did not get a found_id back from the lookup, i.e. the ones that are not already in the collection, and append=true makes outputlookup add those records without replacing the existing ones.
This has the downside that if your users delete records from the lookup, you'll re-add them.
Another option would be to ship versioned seed files, e.g. filename_v1.csv and filename_v2.csv, and do a similar thing with each of them to seed the lookup (see the sketch below).
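For example, a sketch of the v2 pass, assuming filename_v2.csv ships the records introduced in version 2 and that lookup_name and identity_column are the same as above:

| inputlookup filename_v2.csv
| lookup lookup_name identity_column OUTPUT identity_column as found_id
| search NOT found_id=*
| fields - found_id
| outputlookup append=true lookup_name

If each versioned file only carries the records that are new in that version, records your users deleted from an earlier seed won't be re-added by a later one.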
One way would be to ship your initial data as a CSV file, and then import the data using inputlookup and outputlookup. Here is a sample search from the "migrate to KV store" page:
| inputlookup filename.csv | outputlookup lookup_name
Care would need to be taken to ensure that the operation isn't performed a second time (for example on an upgrade or reinstall), or existing data would be overwritten.
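One way to guard against re-runs (a sketch, not from the page above) is to seed only when the collection is still empty. The subsearch returns * when lookup_name has no records, and otherwise a hypothetical sentinel term (seed_guard=already_seeded) that matches nothing; append=true means a run that produces no rows writes nothing. This assumes the KV store lookup is already defined by the app:

| inputlookup filename.csv
| search [| inputlookup lookup_name
           | stats count
           | eval query=if(count==0, "*", "seed_guard=already_seeded")
           | return $query]
| outputlookup append=true lookup_name

To automate the initial load, a search like this could be shipped in the app's savedsearches.conf as a scheduled search (enableSched = 1 plus a cron_schedule); because it is safe to re-run, it can simply fire periodically and will only ever populate an empty collection.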