Splunk Search

How do I use a non-local CSV file as a lookup?

pbourit
New Member

Hi,

I use a CSV file as a lookup in a search like this:

sourcetype="airmantool" | rex ".\s(?[A-Z]+)\s+[(?\w+)]|(?.)" | sort _time | lookup AirmanTool_Lookup.csv AirmanTool_message AS AirmanTool_message OUTPUT Commentary AS AirmanTool_Explanation Procedure AS AirmanTool_Procedure

The CSV file currently sits on the local server (in the lookups folder), but I want to use a CSV that lives on a remote server, not locally. The CSV is updated frequently.

How can I do this?


reswob4
Builder

I agree with @changux: if you can find a way to move the CSV to the local server periodically, in an automated fashion, that will make life easier.

I had a very similar issue. A PowerShell script ran daily and created a CSV file on a Windows server that was also configured as a heavy forwarder. Splunk monitored the output folder and read in each new file, forwarding it to one of the indexers. The search head then ran a scheduled search and created a new local lookup file with outputcsv.
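
A sketch of that kind of scheduled search, using outputlookup instead of outputcsv (outputlookup writes straight into the app's lookups folder, while outputcsv writes to $SPLUNK_HOME/var/run/splunk/csv); the sourcetype and field list here are placeholders, not the exact ones I used:

sourcetype="airmantool_lookup_feed" | table AirmanTool_message Commentary Procedure | outputlookup AirmanTool_Lookup.csv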

That pipeline turned out to be inconsistent. It was far simpler to have the PowerShell script save its output directly to the search head's lookups folder (there are multiple ways to do this).
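
If you go that route, per-app lookup files live under $SPLUNK_HOME/etc/apps/<app>/lookups on the search head (on Windows, typically C:\Program Files\Splunk\etc\apps\search\lookups for the Search app). Overwriting the CSV there is picked up the next time a search uses the lookup.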

0 Karma

changux
Builder

Hi. Check this related answer:

http://answers.splunk.com/answers/124999/get-data-lookup-from-other-remote-peer.html

My opinion: I prefer to set up a cron job that copies the remote file to the local server every X minutes.
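
For example, a crontab entry on the search head along these lines (host and paths are placeholders, and it assumes passwordless SSH keys are in place) pulls the file into the Search app's lookups folder every 15 minutes:

*/15 * * * * scp user@remotehost:/data/AirmanTool_Lookup.csv /opt/splunk/etc/apps/search/lookups/AirmanTool_Lookup.csv

rsync over SSH works just as well.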
