I have a CSV lookup table file. Every now and then I have to add a couple of domains to it, along with a hard-coded "1" (which I use as a flag).
I keep a copy of the CSV on my desktop, where I edit it manually, then delete the old version in Splunk and create a new lookup from the edited CSV.
Is there a more efficient way to update it?
Thank you
There are a couple of alternatives to re-uploading, if the number of entries is small:
| inputlookup mylookup.csv | append [|makeresults | eval domain="abc" | eval flag="1" | table domain flag] | outputlookup mylookup.csv
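If you need to add several domains at once, the same pattern can be extended with split and mvexpand. This is just a sketch based on the search above; the field names domain and flag come from the question, and the example domains are placeholders:

| inputlookup mylookup.csv
| append
    [| makeresults
    | eval domain=split("example.com,example.org", ","), flag="1"
    | mvexpand domain
    | table domain flag]
| outputlookup mylookup.csv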
You can install the Lookup Editor app, or you can just fix it in SPL with something like this:
Your Search For New/Replacement Data here
| appendpipe [|inputlookup YourLookup.csv]
| dedup YourKeyFieldHere
| outputlookup YourLookup.csv
The dedup will cause any new data to supersede any existing data (since the new results come first), and the merged set is then written back out.
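Applied to the domain lookup from the question, the merge pattern might look like this. It is a sketch that assumes domain is the key field and flag is the hard-coded value; "newdomain.com" is a placeholder:

| makeresults
| eval domain="newdomain.com", flag="1"
| table domain flag
| appendpipe [| inputlookup mylookup.csv]
| dedup domain
| outputlookup mylookup.csv

Because dedup keeps the first occurrence of each domain, a new row for an existing domain replaces the old one, while untouched rows pass through unchanged.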
Thank you for your reply; I will keep this in mind. However, I have to use the approach above for my situation.
Your code works great. I initially could not get it to work because of a 1D10T error (a typo on my part). Thank you!