Does anybody have experience building an automation to import a CSV file from a GitHub location into a Splunk lookup? The CSV files are constantly changing, and I need to automate daily updates.
Hi @Zoe_
You may find the Webtools Add-on helpful here. You can use its custom curl command to request your data, parse the response into a table, and then use outputlookup to save it.
Here is an example I have used previously. The SPL for this is:
| curl uri=https://raw.githubusercontent.com/livehybrid/TA-aws-trusted-advisor/refs/heads/main/package/lookups/trusted_advisor_checks.csv
| rex field=curl_message max_match=1000 "(?<data>.+)\n?"
| mvexpand data
| fields data
| rex field=data "^(?<id>[^,]+),(?<name>\"[^\"]+\"|[^,]+),(?<category>\"[^\"]+\"|[^,]+),(?<description>\".*\"|[^,]+)$"
| fields - data
@livehybrid - This curl command sounds useful.
And @Zoe_, you just need to add | outputlookup <your-lookup-name> at the end of @livehybrid's query.
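If you prefer to automate this outside of Splunk's search pipeline, here is a minimal sketch of the same daily-update idea in Python: fetch the raw CSV from GitHub and overwrite the lookup file on disk, then run it from cron (or a scripted input). The lookup path and the update_lookup helper are assumptions for illustration, not part of the Webtools approach above.

```python
# Hypothetical sketch: download a CSV from a raw GitHub URL and rewrite a
# Splunk lookup file on disk. URL and lookup path are assumptions -- adjust
# for your own app and environment.
import csv
import io
import urllib.request

CSV_URL = ("https://raw.githubusercontent.com/livehybrid/TA-aws-trusted-advisor/"
           "refs/heads/main/package/lookups/trusted_advisor_checks.csv")
# Typical lookup location; replace <app> with your app's directory name.
LOOKUP_PATH = "/opt/splunk/etc/apps/<app>/lookups/trusted_advisor_checks.csv"

def update_lookup(url: str, path: str) -> int:
    """Fetch the CSV at `url`, rewrite the lookup file at `path`,
    and return the number of rows written (including the header)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        text = resp.read().decode("utf-8")
    # Round-trip through the csv module so malformed rows fail loudly
    # rather than silently corrupting the lookup.
    rows = list(csv.reader(io.StringIO(text)))
    with open(path, "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)
    return len(rows)
```

A cron entry such as `0 6 * * * /usr/bin/python3 /opt/scripts/update_lookup.py` would refresh the lookup daily; note that editing the file directly bypasses Splunk's lookup replication, so on a search head cluster the outputlookup approach above is the safer option.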