Hi,
I am trying to add a new lookup table using the GUI and get the above error.
I looked at the file with a hex editor and it looks like a plain ASCII file.
Any ideas?
Thx
Yona
Hi,
I've encountered the same problem today. Just open your CSV with Windows Notepad, use "Save As...", and make sure to select UTF-8 in the "Encoding" dropdown menu.
Simple as that, now Splunk recognizes the file 🙂
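If you'd rather script the same fix than click through Notepad, something along these lines should do it (a rough sketch; the file names are placeholders and cp1252 is only a guess at the original Windows encoding, adjust to your setup):

# Sketch: re-read the CSV in its original Windows encoding and rewrite it as UTF-8.
# "lookup.csv", "lookup_utf8.csv" and the cp1252 source encoding are assumptions.
with open("lookup.csv", "r", encoding="cp1252", newline="") as src:
    data = src.read()
with open("lookup_utf8.csv", "w", encoding="utf-8", newline="") as dst:
    dst.write(data)

After that, upload lookup_utf8.csv through the GUI as before.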
I've used this solution previously, but now the UTF-8 file isn't keeping the commas, so the lookup isn't finding the field names. Any ideas?
Figured it out; I wasn't following the instructions carefully. @landen99's comment helped: you must open the CSV file with Notepad, not copy-paste the data into Notepad, to keep the commas. Hope this helps someone else.
Based on the comments and upvotes, this answer has been accepted as the correct one.
cheers, MuS
This should be the accepted answer! Worked first time for me.
great solution, thanks
Worked for me as well. As a test, before making the encoding change, I selected everything in the file in Notepad with Ctrl-A, cut, and pasted it back, but that did not work. Save as UTF-8 was the solution for me.
I'm generating my .csv file using a Python library for CSV. I have tried changing my line endings from \r to \n to \r\n.
I have also tried using unix2dos and dos2unix to change the line endings. I don't see any special characters. I have tried the strings trick; it removed the ^M from the end of each line, but that file would also not upload.
BTW, I'm using OS X and am wondering if the Mac line endings might be causing the trouble.
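For reference, this is roughly how I would write the file from Python so the encoding and line endings are explicit (a sketch only; the file name and the example columns/rows are made-up placeholders, not your actual data):

import csv

# Sketch: write the lookup with explicit UTF-8 encoding and plain \n line endings.
# "lookup.csv" and the example columns/rows are placeholders.
rows = [
    {"host": "web01", "status": "up"},
    {"host": "web02", "status": "down"},
]
with open("lookup.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["host", "status"], lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)

With newline="" on the open() call and lineterminator="\n", the file ends up with plain Unix line endings regardless of the OS you run it on.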
Yes, Mac line endings are not accepted. Use Linux line endings instead. I thought dos2unix would take care of this, but I guess not.
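If dos2unix doesn't catch the bare-CR (old Mac) endings, a small script along these lines should normalise them (untested sketch; the file names are placeholders):

# Sketch: convert old Mac (\r) or Windows (\r\n) line endings to Unix (\n).
# "orig.csv" and "fixed.csv" are placeholder names.
with open("orig.csv", "rb") as f:
    data = f.read()
data = data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
with open("fixed.csv", "wb") as f:
    f.write(data)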
I use Komodo Edit (free version) on my Mac and I save my CSV files using its "UNIX line-endings preference". That seems to work for me.
I used Notepad++ to fix this problem.
Search for [^\x00-\x7F] in Regex Mode.
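If you don't have Notepad++ handy, the same check can be scripted (a quick sketch; "lookup.csv" is a placeholder name):

import re

# Sketch: report non-ASCII bytes line by line, using the same [^\x00-\x7F] pattern.
pattern = re.compile(rb"[^\x00-\x7F]")
with open("lookup.csv", "rb") as f:
    for lineno, raw in enumerate(f, start=1):
        for match in pattern.finditer(raw):
            print("line %d, column %d: %r" % (lineno, match.start() + 1, match.group()))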
Thanks everyone for the kind help. Just a quick update: I used the same file on another system and things worked just fine. I still don't understand why it does not work on my machine. I looked carefully with a hex editor and could not find any special character... I will put more time into it next week and will try to post an update if I find anything interesting.
I had the same problem until I figured out that Splunk fails on reading CSV files containing special characters like the German umlaut characters "ä", "ö", "ü" or "ß". I suppose the problem would be the same with other special characters in French ("á") and Spanish ("ñ").
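A quick way to check whether such characters are the issue is to see if the file decodes cleanly as UTF-8 (a sketch; "lookup.csv" is a placeholder name):

# Sketch: a file that fails this check contains bytes that are not valid UTF-8,
# which seems to match the failures described in this thread.
try:
    with open("lookup.csv", encoding="utf-8") as f:
        f.read()
    print("file decodes as UTF-8")
except UnicodeDecodeError as err:
    print("not valid UTF-8:", err)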
I am using a windows 7 machine, the lookup file is the following:
Process,in
LtMSSp,1
Kerberos,1
I looked with a hex editor and could not see any character except for CR and LF at the end of each line in addition to the text.
Could there be a spurious NULL or ^Z at the end of the file? Did you try cutting and pasting the file into a new text file in Notepad and saving it from there?
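One quick way to check the tail of the file for that kind of stray byte (rough sketch; the file name is a placeholder):

# Sketch: print the last few bytes so a stray NUL (0x00) or Ctrl-Z (0x1a)
# at the end of "lookup.csv" would be visible.
with open("lookup.csv", "rb") as f:
    data = f.read()
print(data[-16:])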
Maybe there is a special character in the CSV file. If you are on a Unix host, you could try the following:
strings orig.csv >new.csv
diff orig.csv new.csv
If the two files are the same, then they probably are plain text.
Need more information. Could you let us know your configuration settings, your OS, and the contents of the lookup file?