Splunk Search

Help with CSV - Special Case

Path Finder

Hello Splunkers!


We have a situation here and need your help and experience. We are looking for the best practice for working with large CSV files (at least 1 million rows) to produce fast searches and fast dashboards.


The case is also special because these CSV files are updated daily in the following manner:

  • It's a daily generated report from another system, and this is the only way to send the data to Splunk.
  • It can be modified in several ways (new rows with new data, modified values in existing rows, or full removal of some rows).

So, we need to update Splunk daily with the changes to the files.


The only way I can see is to remove the indexed data and re-index the CSV files every day!


I actually don't know how to do that, whether we would need to automate the whole process, or whether there is a better practice than this approach.


Appreciate your help.😊



Thanks @venkatasri for your help. I will try to configure it and let you know how it works 😄



Hi @Muwafi 

  • CSV lookups are intended for small sets of data; 1 million rows is definitely huge.
  • The KV store is the right choice for your case.
  • As you mentioned, CSV is the only way you get the details out of your systems on a scheduled basis, so you should research how to import a CSV into the KV store.
  • KV store lookups are relatively fast and accept large data sets, which helps your dashboards/queries load faster.
  • A KV store lookup requires at least one KV store collection, which is a database that stores data in key/value pairs. The docs cover how to set one up if one does not already exist in your environment.
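As a rough sketch of the setup above (the collection, lookup, and field names here are placeholders I made up for illustration, not anything from this thread): define the collection in collections.conf, expose it as a lookup in transforms.conf, then schedule a search that reloads the CSV into it each day.

```
# collections.conf (in your app)
[daily_report_collection]

# transforms.conf
[daily_report_kv]
external_type = kvstore
collection    = daily_report_collection
fields_list   = _key, id, name, status

# Scheduled search (savedsearches.conf or Splunk Web) that
# reloads the collection from the daily CSV:
#   | inputlookup daily_report.csv
#   | outputlookup daily_report_kv
```

Because outputlookup replaces the lookup's contents by default (append=false), a daily run of that search handles new rows, modified rows, and removed rows in one full refresh, with no need to delete and re-index anything.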

The following links would be a good starting point to read about them:

About lookups - Splunk Documentation
Define a KV Store lookup in Splunk Web - Splunk Documentation


An upvote would be appreciated if it helps!
