Splunk Search

LookUp Files Issues

zacksoft
Contributor

I have a lookup file called PriceFactor.csv. I have defined this lookup table, and in my query I use
| inputlookup PriceFactor.csv
to get my data.

The thing is, the content of PriceFactor.csv changes twice a day, so each time I have to upload/define the new lookup in Splunk, or else the query shows me stale data.

Is there any way to make Splunk keep reading the lookup file or update it dynamically? Or any other suggestion?

1 Solution

gcusello
SplunkTrust

Hi @zacksoft,
you have to schedule a search to update your lookup automatically, e.g. twice a day.
You can do this by scheduling the lookup update search (the one ending with | outputlookup PriceFactor.csv) as a scheduled search or alert (running e.g. at 7:00 and 13:00), so your lookup is refreshed automatically.
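
A minimal sketch of such a scheduled search, assuming the new price data is already searchable somewhere in Splunk; the index, sourcetype, and field names below are placeholders, not taken from your environment:

    index=main sourcetype=price_feed earliest=-12h
    | table product price factor
    | outputlookup PriceFactor.csv

Scheduled at 7:00 and 13:00, this rewrites PriceFactor.csv after each external update, so | inputlookup PriceFactor.csv always returns the latest data.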

If you have many records in your lookup, you could also consider using a summary index instead of a lookup, updated in the same way.
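
For the summary-index variant, the same scheduled search would end with collect instead of outputlookup (the summary index name here is an assumption):

    index=main sourcetype=price_feed earliest=-12h
    | table product price factor
    | collect index=price_summary

Your searches would then read index=price_summary directly instead of calling inputlookup.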

Ciao.
Giuseppe

zacksoft
Contributor

The lookup contents are updated externally by another program; I don't have control over it. The lookup file sits in a folder on a Windows drive. What I am looking for is a way to read the lookup automatically so that I always get the updated contents.

gcusello
SplunkTrust

Hi @zacksoft,
as I said, you have to:

  • read the CSV file with a Universal Forwarder (if it's on a different server than the indexer) or with the monitor function of the indexer itself (if it's on the indexer); see the sketch below this list;
  • schedule a search to create your lookup.
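
A minimal sketch of the monitor input, assuming the file sits on a Windows host running a Universal Forwarder; the path, index, and sourcetype names are placeholders:

    # inputs.conf on the forwarder
    [monitor://D:\data\PriceFactor.csv]
    index = pricefactor
    sourcetype = pricefactor_csv

    # props.conf, so the CSV header row is used for field names
    [pricefactor_csv]
    INDEXED_EXTRACTIONS = csv

With this in place, updates to the file are indexed as new events in the pricefactor index.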

Just one question: when you read the content of the CSV, do you add records to the lookup or overwrite it?
Because, if you overwrite it, you could skip the lookup altogether, store the CSV in an index, and run a simple search on that index: this way you'll always have up-to-date data.
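
A minimal sketch of that direct search, assuming the monitor input above and a hypothetical product key field; dedup keeps only the most recent event per product, i.e. the values from the latest file update:

    index=pricefactor sourcetype=pricefactor_csv earliest=-24h
    | dedup product
    | table product price factor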

Ciao.
Giuseppe

zacksoft
Contributor

Thanks @gcusello. It overwrites; it doesn't add records.

gcusello
SplunkTrust

Hi @zacksoft,
check the execution time and the number of results: if the search isn't heavy and you have fewer than 50,000 results, you can use it directly in your searches.
In any case, you can schedule the search to populate the lookup.
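
For example, to use those results directly in another search (index, sourcetype, and field names are assumptions):

    index=web sourcetype=orders
    | join type=left product
        [ search index=pricefactor sourcetype=pricefactor_csv earliest=-13h
          | dedup product
          | table product price factor ]

Alternatively, the same subsearch body ending with | outputlookup PriceFactor.csv is what you would save as the scheduled search that repopulates the lookup twice a day.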

Ciao.
Giuseppe
