Getting Data In

Input multiple csv files

darioapis
Explorer

Hi, I am trying to load multiple CSV files into my search results. I know this can be done using the append command, but I am wondering whether it can be done with something like this: inputcsv some*.csv. Thanks
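
The append approach I mentioned looks roughly like this (just a sketch; the file names are placeholders):

| inputcsv file1.csv
| append [| inputcsv file2.csv ]
| append [| inputcsv file3.csv ]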


sbaror11
Explorer

I need to read CSV files *dynamically*. That is, every week a new file is added, so every week I need to read one additional file. Is there a way to build an automated, dynamic command?
I can get the list of all the files with | rest /servicesNS/-/-/data/lookup-table-files | search title="file*.csv".
But what's next?

My current workaround is to build the command from the search above, then copy it into the SPL line and run it. This is not automatic, of course.

All the examples above are for a static list of files.
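
Could something like the map command work here, to run one inputlookup per file returned by the REST call? A rough, untested sketch (it assumes the file name comes back in the title field, and that maxsearches is set high enough to cover all matching files):

| rest /servicesNS/-/-/data/lookup-table-files
| search title="file*.csv"
| fields title
| map maxsearches=100 search="| inputlookup $title$"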


woodcock
Esteemed Legend

If your files have more than 50,000 lines, other methods may truncate your events because of subsearch limits. The only way that I have found to pull in multiple CSVs without truncation, on any version of Splunk, is this:

| inputlookup file1.csv
| appendpipe [| inputlookup file2.csv ]
| appendpipe [| inputlookup file3.csv ]
...
| appendpipe [| inputlookup fileN.csv ]

richgalloway
SplunkTrust

There could be many ways to do that, but it depends on your use case and data. More detail would be helpful.

One option: | inputlookup file1.csv | inputlookup append=t file2.csv | ....
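
Spelled out over several files, that same pattern is (file names are placeholders):

| inputlookup file1.csv
| inputlookup append=t file2.csv
| inputlookup append=t file3.csv

The append=t option tells inputlookup to append its rows to the existing results instead of replacing them.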

---
If this reply helps you, an upvote would be appreciated.