encoding of exported CSV

Path Finder

Hi everyone,

I have produced a search which formats events into a table with a couple of columns. The data and column names use Cyrillic words, and in the GUI they look just fine. However, when I try to export the table as CSV (via the "Export To" option), the data and column names are encoded incorrectly and are not readable.

Is there a setting which I can change so that this problem is fixed?


I've searched the other topics here in Communities, but didn't find an answer.

Any help is appreciated,




Hi @vanvan,

If the exported CSV file is readable in Notepad++/Sublime but not in Excel, the fix below should work for you. Excel needs a byte order mark (BOM) at the beginning of the file to recognize UTF-8.
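As background, the UTF-8 BOM is just three bytes (EF BB BF) prepended to the file. A quick way to see the effect outside of Splunk is Python's "utf-8-sig" codec, which adds the BOM automatically (file name here is just for illustration):

```python
import codecs

# Write a small CSV with Cyrillic headers using the "utf-8-sig" codec,
# which prepends the UTF-8 byte order mark (EF BB BF) to the file.
with open("report.csv", "w", encoding="utf-8-sig", newline="") as f:
    f.write("Имя,Город\n")
    f.write("Иван,Москва\n")

# The file now starts with the BOM, which Excel uses to detect UTF-8.
with open("report.csv", "rb") as f:
    head = f.read(3)
print(head == codecs.BOM_UTF8)  # True
```

A file written this way opens in Excel with the Cyrillic text intact, while the same content without the BOM is typically mangled.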

Please update the readall function as below and restart the Splunk service. This applies to Splunk 8.0 and later.


def readall(self, blocksize=32768):
    """Returns a generator reading blocks of data from the response
    until all data has been read."""
    import codecs
    response = self.response
    counter = 0
    while True:
        data = response.read(blocksize)
        if not data:
            break
        if counter == 0:
            # Prepend the UTF-8 BOM to the first block so Excel detects UTF-8.
            data = b"".join((codecs.BOM_UTF8, data))
            counter += 1
        yield data
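If you'd rather not patch Splunk's own Python code, a workaround is to prepend the BOM to the file after export. This is my own sketch, not part of Splunk, and the function name is hypothetical:

```python
import codecs

def add_bom(path):
    """Prepend the UTF-8 BOM to an existing file, unless it is already there."""
    with open(path, "rb") as f:
        data = f.read()
    if not data.startswith(codecs.BOM_UTF8):
        with open(path, "wb") as f:
            f.write(codecs.BOM_UTF8 + data)
```

Running this once on an exported CSV (e.g. `add_bom("export.csv")`) makes it open correctly in Excel; running it again is harmless, since the BOM is only added if missing.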


If this reply helps you, an upvote is appreciated.



This is interesting, I'll try it.
