I'm extracting a CSV and sending it over email. The extracted CSV sometimes contains lines longer than 990 characters (including commas, quotes, etc.).
In the received CSV, lines longer than 990 characters are broken: a "!\r\n" (exclamation mark followed by a line break) is inserted at character #991.
This renders the CSV unparsable by, say, Excel.
Has anyone run into a similar problem and found a solution for it?
I've gone through limits.conf and I'm not able to locate any relevant setting in it.
PS: I do not have an option to reduce the number of characters in a line.
Did you already find a solution? Here is what I figured out:
This might have something to do with how the email client (Splunk's sendemail.py) hands the message to the mail server. SMTP limits message lines to 998 characters (RFC 5322), and some MTAs, notably sendmail, split longer lines at 990 characters and mark the break with a "!".
Sometimes it makes sense to encode attachments in base64, as recommended here:
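For what it's worth, the reason base64 helps is that the encoder wraps its output at 76 characters per line, so even very long CSV rows stay far below the SMTP line limit. A quick illustration in plain Python (not taken from sendemail.py):

import base64

# A 2000-character CSV row, well over the ~1000 character SMTP line limit.
long_row = ("x" * 2000 + "\r\n").encode("ascii")

# encodebytes() inserts a newline after every 76 characters of output.
encoded = base64.encodebytes(long_row)
print(max(len(line) for line in encoded.splitlines()))  # prints 76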
I found the place where Splunk encodes the attachment when sending it by email:
$SPLUNK_HOME/etc/apps/search/bin/sendemail.py
if not len(results) == 0 and len(''.join(results[0].keys())) > EMAIL_CSV_HEADER_CHAR_LIMIT:
    Encoders.encode_base64(csvAttachment)
It seems that Splunk only encodes in base64 when the CSV header is longer than 900 characters (see the condition len(''.join(results[0].keys())) > EMAIL_CSV_HEADER_CHAR_LIMIT).
When I modify the script as below (relaxing the condition), the CSV file attached to the email no longer contains inserted line breaks after 990 characters:
if not len(results) == 0:
    Encoders.encode_base64(csvAttachment)
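For reference, here is a minimal standalone sketch of the same idea using Python's standard email package; this is not Splunk's actual sendemail.py code, and the file name and addresses are made up. Base64-encoding the attachment unconditionally means the mail server never sees an over-long line:

from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart

msg = MIMEMultipart()
msg["Subject"] = "Splunk report"
msg["From"] = "splunk@example.com"
msg["To"] = "me@example.com"

# Attach the CSV and encode it unconditionally, mirroring the modified condition.
with open("results.csv", "rb") as f:
    csv_attachment = MIMEBase("text", "csv")
    csv_attachment.set_payload(f.read())
encoders.encode_base64(csv_attachment)  # payload re-wrapped at 76 chars per line
csv_attachment.add_header("Content-Disposition", "attachment", filename="results.csv")
msg.attach(csv_attachment)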
Yes, I have the same problem and I'm looking for a solution.