Hello,
I am trying to troubleshoot sendemail.py because after an upgrade to Red Hat 9 our Splunk stopped sending emails.
I understand the command to use the Splunk Python interpreter in the CLI is:
splunk cmd python /opt/splunk/etc/apps/search/bin/sendemail.py
However, how do I combine the above with the _internal search results below so I can see what feedback the interpreter would provide (such as errors)?
_raw results of a sendemail:
subject="old: : $: server-prod - AlertLog_Check - 4 Log(s) ", encoded_subject="old: : $: server-prod - AlertLog_Check - 4 Log(s) ", results_link="https://MyWebsite:8080/app/search/@go?sid=scheduler__nobody__search__RMD50fd7c7e5334fc616_at_1712993040_1213", recipients="['sysadmin@MyWebsite.com']", server="localhost"
Any examples would be greatly appreciated. Thanks,
A totally blind Splunker with a mission 🙂
Hi @alfredoh14,
The best way to test sendemail.py is using a search:
| sendemail to="test@example.com" subject="Test Message"
The script reads configuration from alert_actions.conf using the Splunk REST API, and testing the script from the command line isn't straightforward. (See the sendEmail function in sendemail.py.)
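For example, you can view the same effective [email] settings the script reads over REST; this is a sketch assuming the default management port 8089 and placeholder admin credentials:
curl -k -u admin:yourpassword https://localhost:8089/services/configs/conf-alert_actions/email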
Messages will be logged to $SPLUNK_HOME/var/log/splunk/python.log. Errors will also be logged to search.log and displayed in Splunk Web.
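To watch for sendemail activity from a shell while running the search above (the path assumes the default /opt/splunk install from your post):
tail -f /opt/splunk/var/log/splunk/python.log | grep -i sendemail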
The default log level is specified in $SPLUNK_HOME/etc/log.cfg:
[python]
splunk = INFO
You can change the log level in $SPLUNK_HOME/etc/log-local.cfg:
[python]
splunk = DEBUG
Restart Splunk after modifying log-local.cfg. Other Splunk Python scripts will produce verbose debug output. I recommend returning the log level to INFO when you're finished debugging.
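A minimal sequence, assuming log-local.cfg doesn't exist yet:
# copy the defaults, set splunk = DEBUG under [python] in log-local.cfg, then restart
cp $SPLUNK_HOME/etc/log.cfg $SPLUNK_HOME/etc/log-local.cfg
$SPLUNK_HOME/bin/splunk restart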
You can search python.log directly from Splunk:
index=_internal source=*python.log* sendemail
I also recommend opening a support case. If you find a compatibility issue between sendemail.py and a specific RHEL 9 configuration in the latest maintenance release of a supported version of Splunk, either sendemail.py can be fixed or Splunk documentation can be updated.
That was awesome help. It really solidified my assumption that Splunk was sending the emails to the postfix server and they were getting dropped by it.
My next step was to add debugging on postfix, and I found out that the "from" field was splunk@prod, not splunk@prod.mydomain.com.
The server name (hostname) on Linux is prod,
but when I access the Splunk web UI, it is https://prod.mydomain.com:8080/en-US.
When I added the Python debugging, it only showed that the sending user was splunk and the hostname was (based on the sendemail debug logs):
'from': 'splunk', 'hostname': 'prod.mydomain.com'
This did not show me that sendemail.py would construct the from field as only splunk@prod (I had to look at the postfix logs for that info).
My stanza in /opt/splunk/etc/system/local/alert_actions.conf:
[email]
hostname = prod.mydomain.com
Again, we can send emails from the CLI using mailx, but Splunk cannot using sendemail.py because it is not constructing the from field correctly, so although postfix sends it, the SMTP server which receives it drops it.
So, do you have any idea where I have to set a setting so that the from field is constructed correctly by the sendemail.py script?
Hi @alfredoh14,
By default, the From: address is splunk. Are you using default Splunk email settings and a local instance of postfix? Because no domain is specified, postfix likely appends the host name to the user name, i.e. splunk@prod, before forwarding the message to either an upstream relay or the recipient's mail server.
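To confirm this on the postfix side, postconf will show how unqualified addresses are completed; a quick check, assuming postfix runs locally on the Splunk host:
postconf myorigin mydomain myhostname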
You can set the From: address in Splunk Web from Settings > Server settings > Email settings in the "Send emails as" setting. This will update $SPLUNK_HOME/etc/system/local/alert_actions.conf.
For example, using no-reply@mydomain.com:
[email]
from = no-reply@mydomain.com
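You can also test the new address per search before changing the global setting, since the sendemail command accepts a from option:
| sendemail to="test@example.com" from="no-reply@mydomain.com" subject="From Test"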
Hi @alfredoh14,
If you really do need to test from a shell, enable debug logging as previously noted, and then from the shell (Bash in this example) on the Splunk host, run:
# this script assumes your management port is 8089
$SPLUNK_HOME/bin/splunk login
$SPLUNK_HOME/bin/splunk cmd python $SPLUNK_HOME/etc/apps/search/bin/sendemail.py 'to="test@example.com" subject="Test Message"' << EOF
authString:$(echo -n $(cat ~/.splunk/authToken_splunk_8089))
sessionKey:$(echo -n $(sed -re 's/.*<sessionkey>(.*)<\/sessionkey>.*/\1/' ~/.splunk/authToken_splunk_8089))
owner:$(echo -n $(sed -re 's/.*<username>(.*)<\/username>.*/\1/' ~/.splunk/authToken_splunk_8089))
namespace:search
sid:

_time,_raw
"1713023419","This is the first event/row."
"1713023420","This is the second event/row."
EOF
$SPLUNK_HOME/bin/splunk logout
Note that the empty line between sid: and _time,_raw is mandatory. The empty line indicates to Intersplunk that CSV-formatted search results follow. The setting:value entries before the empty line represent the Intersplunk header. sendemail.py makes several Splunk REST API calls and requires a session key and app context to work correctly.
The Splunk login command will create a new session and cache your username, session key, etc. in ~/.splunk/authToken_splunk_8089. The Splunk logout command will invalidate the session and remove ~/.splunk/authToken_splunk_8089.
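For reference, the cached token file is a small XML document; based on the sed patterns above, it looks roughly like this (the outer element name and values here are illustrative):
<auth><username>admin</username><sessionkey>abc123</sessionkey></auth>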