Reporting

Why are reports not emailing more than ~70 events on SUSE Linux Enterprise?

hugohctint
Loves-to-Learn Lots

Reports and sendemail are not sending an email when the results contain more than about 70 events. The exact maximum varies depending on how much data each event contains. What I can see is that sendemail.py is failing.

Running sendemail in the search line does the same thing, and the error is: External search command 'sendemail' returned error code 1.

I tried increasing several parameters in the .conf files, as indicated in the post below, but it made no difference:
https://answers.splunk.com/answers/542862/how-to-overcome-csv-max-results-to-email.html
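
For example, I tried raising limits along these lines (just a sketch to show the kind of change, reconstructed from memory rather than my exact stanzas):

# alert_actions.conf
[email]
maxresults = 10000

# limits.conf
[searchresults]
maxresultrows = 100000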

Here are example splunkd.log entries, filtered for the scheduler and sendemail events:

Scheduler:

04-18-2018 15:00:00.780 -0300 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://quebec:8000/app/search/@go?sid=scheduler__admin__search__RMD5305669048a8da3b1_at_1524074400_13" "ssname=R4_TEST" "graceful=True" "trigger_time=1524074400" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler__admin__search__RMD5305669048a8da3b1_at_1524074400_13/results.csv.gz"':      for line in csvr:
04-18-2018 15:00:00.780 -0300 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://quebec:8000/app/search/@go?sid=scheduler__admin__search__RMD5305669048a8da3b1_at_1524074400_13" "ssname=R4_TEST" "graceful=True" "trigger_time=1524074400" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler__admin__search__RMD5305669048a8da3b1_at_1524074400_13/results.csv.gz"':  _csv.Error: line contains NULL byte

Sendemail:

04-12-2018 12:00:05.484 -0300 ERROR script - sid:scheduler__admin__search__RMD5bf6f3132e2acfda9_at_1523545200_31 External search command 'sendemail' returned error code 1.
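
Based on the "line contains NULL byte" error, here is a quick check I can run against the dispatch results file to confirm the CSV really does contain NUL bytes (a minimal sketch only; the path below is copied from the scheduler error above, and the sid changes on every run):

import gzip

# Dispatch results file taken from the scheduler error above; adjust the sid for the run being checked
RESULTS = "/opt/splunk/var/run/splunk/dispatch/scheduler__admin__search__RMD5305669048a8da3b1_at_1524074400_13/results.csv.gz"

with gzip.open(RESULTS, "rb") as f:
    raw = f.read()

# Python's csv module rejects any line containing a NUL byte, which matches the stderr above
print("NUL bytes in results file:", raw.count(b"\x00"))

# Report the first offending line, if any
for lineno, line in enumerate(raw.splitlines(), 1):
    if b"\x00" in line:
        print("first line containing a NUL byte:", lineno)
        break

If that count comes back non-zero, it would explain why the csv reader in sendemail.py gives up before the email is built, and the next question would be which field in the results is carrying the NUL bytes.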

I appreciate your help, thanks.
