Thought I'd add to this post regarding using a curl command to push a lookup file to a Splunk instance, as other Splunk users may find it useful. It's not a replacement for @mthcht's excellent Python scripts, but curl commands are often handy when testing and validating things. Here's a worked example that creates a simple lookup file (tested against a Cloud stack and Lookup Editor v4.0.4):

curl -sk --request POST https://localhost:8089/services/data/lookup_edit/lookup_contents \
-H "Authorization: Splunk $MYTOKEN" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d timeout=10 \
-d namespace=search \
-d lookup_file=lookupfilename.csv \
-d 'contents=[["field1","field2"],["value1","value2"]]' \
-d owner=nobody

N.B. 'owner' is only needed when creating new lookups: specifying a user name creates the new lookup file with private permissions, whereas 'nobody' results in it being shared globally. Note that the 'contents' value must be a 2D JSON array. To make this easier, 'contents' can also be supplied from a file, like this:

$ cat <<EOF > myLocalLookup.json
contents=[["field1","field2"],["value1","value2"]]
EOF
$ curl -sk --request POST https://localhost:8089/services/data/lookup_edit/lookup_contents \
-H "Authorization: Splunk $MYTOKEN" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d timeout=10 \
-d namespace=search \
-d lookup_file=lookupfilename.csv \
-d @myLocalLookup.json \
-d owner=nobody

Now, to really make this useful, existing CSV files need to be formatted as JSON. There are multiple ways this could be done, but here is a simple Python one-liner (*nix tested) that reads a CSV file on stdin and outputs it as JSON:

(python -c $'import sys;import csv;import json;\nwith sys.stdin as f: csv_array=list(csv.reader(f)); print("contents="+json.dumps(csv_array))' > myLocalLookup.json) < myLocalLookup.csv

Hopefully, others may find this useful too.
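For anyone who'd rather do the whole thing from Python, the same request can be assembled with just the standard library. This is a minimal sketch, not the Lookup Editor's documented client: the token value, lookup file name, and host are the same placeholders as in the curl examples above, csv_to_contents mirrors the one-liner, and the final urlopen call is left commented out so nothing is actually sent while experimenting:

```python
import csv
import io
import json
import urllib.parse
import urllib.request

def csv_to_contents(csv_text):
    """Convert raw CSV text into the 2D-JSON-array string that the
    lookup_edit endpoint expects for its 'contents' parameter."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    return json.dumps(rows)

# Placeholder values - substitute your own token, file name, and host.
token = "MYTOKEN"
payload = urllib.parse.urlencode({
    "timeout": 10,
    "namespace": "search",
    "lookup_file": "lookupfilename.csv",
    "owner": "nobody",  # only needed when creating a new lookup
    "contents": csv_to_contents("field1,field2\nvalue1,value2\n"),
}).encode()

req = urllib.request.Request(
    "https://localhost:8089/services/data/lookup_edit/lookup_contents",
    data=payload,
    headers={
        "Authorization": "Splunk " + token,
        "Content-Type": "application/x-www-form-urlencoded",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

Handy when the CSV lives on the same box as the script, since it avoids the intermediate myLocalLookup.json file entirely.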