Hi everyone,
I have produced a search which formats events into a table with a couple of columns. The data and the column names use Cyrillic words, and in the GUI they look just fine. However, when I try to export the table as CSV (via the "Export To" option), the data and column names are encoded incorrectly and are not readable.
Is there a setting which I can change so that this problem is fixed?
I've searched the other topics here in Communities, but didn't find an answer, e.g.:
Any help is appreciated,
Thanks!
Hello @scelikok,
I tested the solution you provided in your post and it worked for me. Thank you for the time you spent researching the matter. It saved me a lot of time.
I appreciate your willingness to help. Have a great day.
Best regards,
Ivan
Hi @vanvan,
If the exported CSV file is readable in Notepad++/Sublime but not in Excel, the fix below should work for you. Excel needs a BOM (byte order mark) at the beginning of the file to recognize UTF-8.
Please update the readall function as below and restart the Splunk service. This is for Splunk 8.0 and later.
$SPLUNK_HOME/lib/python3.7/site-packages/splunk/rest/__init__.py
def readall(self, blocksize=32768):
    """
    Returns a generator reading blocks of data from the response
    until all data has been read
    """
    response = self.response
    import codecs
    counter = 0
    while True:
        data = response.read(blocksize)
        if not data:
            break
        if counter == 0:
            # Prepend the UTF-8 BOM to the first block so Excel
            # detects the encoding correctly.
            data = b"".join((codecs.BOM_UTF8, data))
        counter += 1
        yield data
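You can verify the effect of the BOM with a small standalone script, independent of Splunk. This is just an illustrative sketch: Python's "utf-8-sig" codec prepends the same codecs.BOM_UTF8 bytes automatically, which is why Excel then opens the file with Cyrillic text intact.

```python
import codecs
import csv
import io

# Write a tiny CSV with Cyrillic column names into memory, using
# "utf-8-sig" so the UTF-8 BOM is prepended automatically.
buf = io.BytesIO()
text = io.TextIOWrapper(buf, encoding="utf-8-sig", newline="")
writer = csv.writer(text)
writer.writerow(["имя", "значение"])  # Cyrillic headers
writer.writerow(["пример", "42"])
text.flush()

raw = buf.getvalue()
# The file now starts with the BOM bytes EF BB BF, which is exactly
# what the patched readall() prepends to the first block.
assert raw.startswith(codecs.BOM_UTF8)
```

Saving such a file to disk and opening it in Excel shows the headers correctly, whereas the same content written with plain "utf-8" typically displays as mojibake.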
Thanks!
This is interesting, I'll try it.