Getting Data In
Highlighted

Can't use inputcsv on a csv file just created with outputcsv - does inputcsv correctly handle embedded newlines?

New Member

I have a long-running query (7 minutes or so) that I want to speed up, because I am running variations of it over and over again. So I broke out the part that was common to all of them and wrote it to a CSV file. Then I tried to use that file as the source, and I get nothing: the event count is 0.

If I had to venture a guess, I would suppose it has to do with newlines embedded in the _raw log entries. Can Splunk properly parse a CSV file if some of the strings in the file contain newlines?

If the answer is yes, does anybody have any other ideas about what might cause a file just written with outputcsv to be unreadable as an input source via inputcsv?
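For context, the round trip I'm attempting looks roughly like this (the filename and search fragments are placeholders, not my actual query):

```
<expensive base search common to all variations>
| outputcsv common_subset

| inputcsv common_subset
| <variation-specific filtering and stats>
```

The second search is the one that returns 0 events.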


Re: Can't use inputcsv on a csv file just created with outputcsv - does inputcsv correctly handle embedded newlines?

Communicator

I always use outputlookup for things like this. Lately I've moved on to tscollect and tstats. I use the NetApp and VMware apps, and these commands are used there to speed up performance queries.

Not an answer to your problem, but another path to try if you need one.

http://docs.splunk.com/Documentation/Splunk/6.2.1/SearchReference/Tscollect
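A rough sketch of the lookup route (the lookup filename is just an example):

```
<expensive base search>
| outputlookup common_subset.csv

| inputlookup common_subset.csv
| <rest of your query>
```

outputlookup writes the file to the app's lookups directory, rather than var/run/splunk/csv where outputcsv puts it.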


Re: Can't use inputcsv on a csv file just created with outputcsv - does inputcsv correctly handle embedded newlines?

New Member

Unfortunately, I need more than time-series data; I need the subset of records that I am searching over. The point is to do less work overall by skipping the processing of 2 million records already established as irrelevant to my query.


Re: Can't use inputcsv on a csv file just created with outputcsv - does inputcsv correctly handle embedded newlines?

SplunkTrust

Depending on what you're actually trying to do, take a look at | loadjob: http://docs.splunk.com/Documentation/Splunk/6.2.1/SearchReference/loadjob
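Rough usage (the sid and saved search name below are placeholders for your own):

```
| loadjob 1423603200.1234
| <further search commands>

| loadjob savedsearch="admin:search:My Expensive Search"
```

loadjob replays the cached results of a previously completed search job, so the expensive part only has to run once while the job's results are still retained.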
