Greetings,
I regularly update a KV Store with new IP addresses/websites to monitor for in my network traffic.
Sometimes I get redundant information and end up entering the same IPs/websites multiple times.
How can I prevent duplicates in the KV Store?
Hi @dteo827,
Use the dedup command while updating the lookup. To make sure your lookup doesn't have duplicate values at any point in time, set a key (a primary key, which cannot be duplicated) in the KV Store (Reference).
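As a sketch of the primary-key approach, assuming the collection backs a lookup named ip_watchlist with an ip field (both names are illustrative): KV Store records with a matching _key are updated in place rather than duplicated, so deriving _key deterministically from the IP gives you upsert behaviour:

```
<your search producing new entries>
| dedup ip
| eval _key=ip
| outputlookup ip_watchlist append=true
```

With append=true, rows whose _key already exists in the collection are overwritten instead of added again.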
This might work, but to play devil's advocate here:

Suppose the list has an entry for the host example.com with IP 192.168.0.1. At some point that key-value pair goes stale, perhaps because the IP has changed. Later, the IP 192.168.0.2 resolves to example.com, which then should be put in the KV Store. At this point, without timestamps and additional logic, you can't be certain that deduplicating on hostname (for example) removes the correct entry from the KV Store.
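One way to add that timestamp logic, sketched here under the assumption that whatever writes to the collection also maintains a last_seen epoch field (a hypothetical field name) on every record: periodically rewrite the collection keeping only the newest row per host, so the stale IP is the one that gets dropped:

```
| inputlookup ip_watchlist
| sort 0 - last_seen
| dedup host
| outputlookup ip_watchlist
```

sort 0 - last_seen orders all rows newest-first, and dedup host then keeps only the most recent entry for each hostname.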
I've had huge problems with the KV Store functionality: inputlookup is great at providing data on a row-by-row basis, making it easy to discern duplicates, but it has the requirement of being the first command in the pipeline. lookup, on the other hand, can be anywhere in a search, but does not provide a way to separate colliding entries (the output will be similar to that of doing | stats values(*) by x).
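To illustrate the collision behaviour, assuming a lookup named ip_watchlist (hypothetical) that holds two rows for the same host: lookup flattens every matching row into one multivalue field rather than returning separate rows:

```
| makeresults
| eval host="example.com"
| lookup ip_watchlist host OUTPUT ip
| eval ip_count=mvcount(ip)
```

If the collection holds both 192.168.0.1 and 192.168.0.2 for that host, ip comes back as a single multivalue field containing both, which is why separating colliding entries is awkward compared to the row-per-record output of inputlookup.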
If you are populating the KV Store with a search, then you can check for the existence of IPs and store only those which are not already inside, something like
<your search terms> | search NOT [| inputlookup lookup_name | fields IP]
Note that inputlookup has to be the first command of a search, which is why it appears inside the subsearch with a leading pipe.
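Putting it together, a sketch of a full update pipeline under the same assumptions (lookup_name and the IP field name are placeholders): filter out IPs already in the collection, then append only the new ones:

```
<your search terms>
| dedup IP
| search NOT [| inputlookup lookup_name | fields IP]
| outputlookup lookup_name append=true
```

One caveat: the subsearch is subject to Splunk's subsearch result limits, so for very large collections this filter can silently miss entries.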