Splunk Enterprise Security

Can Splunk read a CSV file and automatically upload it as a lookup?

siv
Loves-to-Learn Lots

Can Splunk read a CSV file located on a remote server using a forwarder and automatically upload it as a lookup?
From what I know, there are two options: upload the CSV as a lookup, or read the file line by line as a log.


splunkmarroko
Engager

A Splunk UF can read a CSV file, but not as a lookup. The UF treats it as a regular log file, and no, Splunk does not automatically upload it as a lookup.

You might consider using a Splunk UF to monitor the CSV file. Create a monitoring stanza in inputs.conf:

[monitor://path...]
sourcetype = <your_sourcetype>
index = <your_index>

Then add this setting in props.conf (note: for a Universal Forwarder, INDEXED_EXTRACTIONS is applied on the forwarder itself, so deploy this stanza to the forwarder rather than the indexer; the stanza name must match the sourcetype you set in inputs.conf):

[theCSVfile]
INDEXED_EXTRACTIONS = csv
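Putting both pieces together, a minimal sketch might look like this (the path, sourcetype, and index names here are hypothetical examples, not from the original post):

inputs.conf on the forwarder:

[monitor:///var/data/feeds/users.csv]
sourcetype = users_csv
index = staging

props.conf, also on the forwarder (INDEXED_EXTRACTIONS is evaluated there):

[users_csv]
INDEXED_EXTRACTIONS = csv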


livehybrid
Super Champion

Hi @siv 

If you have a CSV on a forwarder that you want to become a lookup in Splunk, the best way to achieve this is probably to monitor the file (using monitor:// in inputs.conf) and send it to a dedicated index on your Splunk indexers.

Then create a scheduled search that reads that index, retrieves the forwarded data, and writes it to a lookup (using the | outputlookup command). Exactly what the search looks like depends on how and when the CSV is updated, but ultimately this should be a viable solution.
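Such a scheduled search might look like this sketch (the index, field, and lookup names below are placeholder assumptions, not from the original post):

index=csv_staging earliest=-24h
| dedup user
| table user department location
| outputlookup forwarded_users.csv

The earliest=-24h window assumes the search runs at least daily; adjust it to match the schedule and how often the source CSV changes.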

There may be other solutions, but they would require significantly more engineering effort.

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing


PickleRick
SplunkTrust

No. You cannot read a lookup's contents directly using a forwarder. If you want that functionality (I needed it once so that users could "edit" one particular lookup but not any others), you need to read the CSV file contents as events into a temporary index and create a scheduled search which reads those events and does | outputlookup at the end.

It's a bit complicated because you have to keep track of when you last updated the lookup so you don't overwrite it each time.
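One way to sketch that bookkeeping is to pick up only recent events and merge them with the existing lookup rather than replacing it outright (all index, field, and lookup names below are placeholders):

index=lookup_staging earliest=-24h
| table key value
| inputlookup append=true my_editable_lookup.csv
| dedup key
| outputlookup my_editable_lookup.csv

Because the freshly ingested events come first in the pipeline, | dedup key keeps the newest value for each key while preserving untouched rows appended from the existing lookup.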


kiran_panchavat
Influencer

@siv 

There are two methods of ingesting: 

  1. Upload with Splunk Web: This is a one-time process done manually by the user. (Note that uploading via Splunk Web has a 500 MB limit on file size.)
  2. Monitor from a filesystem with a UF or other forwarder: This method is for ongoing ingestion over a period of time and may not require any manual intervention by the user once set up.

You will need to create an app with an inputs.conf that specifies the file or path to monitor.

[monitor:///opt/test/data/internal_export_local.csv]
sourcetype=mycsvsourcetype
index=test

Create an accompanying props.conf file:

[mycsvsourcetype]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
FIELD_NAMES = host,source,sourcetype,component
 
Either create the app directly on the system ingesting the file, or create it on the Deployment Server and deploy it to the system ingesting the file, whether that’s Splunk Enterprise or a system with the Splunk Universal Forwarder installed. Once Splunkd is restarted on that system, Splunk will begin to ingest the new file.
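Once events are arriving, a quick search using the index and sourcetype from the example above can confirm the fields were extracted:

index=test sourcetype=mycsvsourcetype
| table host source sourcetype component

If the fields come back populated, the same search can be scheduled with | outputlookup appended to materialize the data as a lookup, as described in the earlier answers.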
 
 
 
Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!