Reporting

ERROR:root:(552, '5.3.4 Message size exceeds fixed limit'...) while sending mail

mlevsh
Builder

Hi,
Sometimes we get the following error while running reports and sending alert emails when an alert is triggered:

ERROR ScriptRunner - stderr from '/export/opt/splunk/bin/python /export/opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://splunkderver:8443/app/splunk_iseca/@go?sid=scheduler__username_c3BsdW5rX2lzZWNh" "ssname=Alert name" "graceful=True" "trigger_time=1539637210" results_file="/export/opt/splunk/var/run/splunk/dispatch/scheduler__username_c3BsdW5rX2lzZWNh/results.csv.gz"':  ERROR:root:(552, '5.3.4 Message size exceeds fixed limit', u'splunk@splunkserver') while sending mail to: firstname.lastname@ourdomain.com

1) Contacted our messaging admins. They mentioned a 20 MB limit on attachments.
Our attachments are much smaller, about 28 K.
They also checked whether there were any delivery attempts for the emails in question - no indications of them at all.

Any advice on which direction we should troubleshoot?
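In case it helps others hitting the same 552: one direction worth checking is the size of the assembled message rather than the size of the attachment on disk. The gzipped results.csv.gz can be deceptively small, because the same rows embedded inline (uncompressed, and often rendered as a table) are far larger. A rough sketch with made-up data:

```python
import gzip

# Illustrative, made-up rows roughly shaped like an alert's results.csv.
# Real alert results will differ; this only shows the scale of the gap.
rows = ["host-%04d,GET /index.html,200,%d" % (i, 1000 + i) for i in range(5000)]
csv_text = "event_host,request,status,bytes\n" + "\n".join(rows)

inline_size = len(csv_text.encode("utf-8"))                   # bytes if embedded inline
attached_size = len(gzip.compress(csv_text.encode("utf-8")))  # bytes as a .csv.gz attachment

print(inline_size, attached_size)
```

The exact ratio depends on how repetitive the rows are, but a small gzipped attachment can easily correspond to hundreds of kilobytes of inline content, which is what the mail server actually measures against its limit.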

1 Solution

mlevsh
Builder

We figured it out. The alert's search produced results with about 5,000 events. Users had checked the options to include the results in the email both as a CSV file and "inline". Un-checking "inline" solved the issue.
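For reference, those checkboxes in the alert's email action correspond to settings in savedsearches.conf. A sketch of the relevant options (the stanza name and recipient here are made up; the setting names are standard Splunk ones):

```
[My Alert Name]
action.email = 1
action.email.to = firstname.lastname@ourdomain.com
# attach the results as a CSV file
action.email.sendcsv = 1
# 0 = do not embed the results in the message body
action.email.inline = 0
```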

