Alerting

Anyone know why my email alert is truncated in the middle of my search string?

Super Champion

I created an alert that will email me any errors that come from my various scripted inputs. The search itself seems to work fine when I run it interactively, so I scheduled it. But when I get the email alert, it seems to be truncated in a very odd way. At first I thought it was my BlackBerry refusing to download the entire message, but I've ruled that out...

The search: (reformatted for readability)

index=_internal sourcetype=splunkd component=ExecProcessor "message from" NOT (splunk-regmon OR splunk-wmi)
| rex " - message from \"(?<script>[^\"]+)\""
| rex mode=sed "s/^.* - message from \"[^\"]+\" (.*)$/\1/"
| transaction fields="host,source,script" maxpause=45s
| dedup host, script, _raw
| fields host, script
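(As a quick sanity check outside of Splunk: rex uses PCRE-style named groups, which Python spells `(?P<name>...)`. A minimal sketch of the same extraction against a made-up sample log line:)

```python
import re

# Hypothetical ExecProcessor log line; the path is invented for illustration.
sample = ('01-01-2011 12:00:00.000 ERROR ExecProcessor - message from '
          '"/opt/scripts/my_input.py" Traceback (most recent call last):')

# Same pattern as the rex command, with Python's (?P<...>) group syntax.
match = re.search(r' - message from "(?P<script>[^"]+)"', sample)
print(match.group("script"))  # /opt/scripts/my_input.py
```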

My email alert looks like this:

To: splunkadmin@example.com
From:  no-reply@splunk.example.com
Subject: [Splunk Alert] Splunk Alert - Scripted input errors

Saved search results.

Name: 'Splunk Alert - Scripted input errors'
Query Terms: 'index=_internal sourcetype=splunkd component=ExecProcessor "message from" NOT (splunk-regmon OR splunk-wmi) | rex " - message from \"(?

Notice that the search is truncated and that I'm missing the actual search results.

Any ideas?


I'm running Splunk 4.1.5, on Ubuntu 8.04 (32 bit), with the postfix mailer, and HTML results.

Background:

I recently had a situation where I lost a bunch of events because of a stupid typo in one of my scripted inputs, so I figured I should set up an alert to avoid this in the future. I found out that any messages your script writes to stderr (the "standard error" output stream) get logged in Splunk's internal logs under the ExecProcessor component. (I'm using a Python input script, and Python writes any unhandled exceptions to stderr by default, just like most programming languages.) From there it was simply a matter of filtering out errors created by Splunk's built-in scripted inputs, since I don't care about those, so that I get an email with just the stuff I want to see.
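(In other words, the split looks roughly like this; a minimal hypothetical scripted input, not any particular real one:)

```python
import sys

# Hypothetical scripted input. Splunk indexes whatever the script prints
# to stdout; anything written to stderr (including Python's default
# traceback for an unhandled exception) is logged by splunkd under
# component=ExecProcessor as a "message from" event.
sys.stdout.write("2011-01-01 12:00:00 status=ok\n")
sys.stderr.write("Traceback (most recent call last): ...\n")
```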


Super Champion

I did a little looking around in the code, and it seems like my specific issue could be fixed pretty easily. I made the following modification starting at line 122 (version 4.1.5):

This code:

    query = argvals.get("ssquery", None)
    if query != None:
        intro += "Query Terms: \'" + query + "\'\n"

Replace with:

    query = argvals.get("ssquery", None)
    if query != None:
        if not plainText:
            # escape <, > and & so the query renders as text in HTML emails
            query = saxutils.escape(query)
        intro += "Query Terms: \'" + query + "\'\n"

Keep in mind that this script is replaced every time you upgrade Splunk (or whenever the "search" app is updated or replaced). So this is more of a temporary "hotfix" for anyone else dealing with this issue. (This was reported to Splunk support as SPL-34741, so a real fix should be forthcoming.)
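(For reference, a quick standalone check of what that escape does to a query string like the one above, independent of Splunk:)

```python
from xml.sax import saxutils

# The rex expression embeds "(?<script>...)", which an HTML-format email
# treats as an opening <script> tag. saxutils.escape turns <, > and &
# into entities so the query renders as plain text.
query = r'rex " - message from \"(?<script>[^\"]+)\""'
print(saxutils.escape(query))
# rex " - message from \"(?&lt;script&gt;[^\"]+)\""
```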


BTW, it would appear that there are quite a number of different fields that are not properly HTML escaped, like the query field discussed above.

Actually, you can have a little fun with this: Here's an example that changes the color of the "host" column to red.

* | head 1 | fields host | rename "host" as "<font color=red>host</font>" | sendemail to=your.email@example.com subject="a colorful email from splunk" format=html sendresults=true

I guess you could pass this off as a feature, but it seems more like a hack, IMHO.

Splunk Employee

hmm, here is what I get when I run your saved search:

    Name: 'lowells'
    Query Terms: 'index=_internal sourcetype=splunkd component=ExecProcessor "message from" NOT (splunk-regmon OR splunk-wmi)| rex " - message from \"(?[^\"]+)\""| rex mode=sed "s/^.* - message from \"[^\"]+\" (.*)$/\1/"| transaction fields="host,source,script" maxpause=45s| dedup host, script, _raw| fields host, script'


Splunk Employee

Lowell,

I tried your scenario, but it doesn't seem to behave exactly like yours, i.e., I do not get a truncation like you describe.

Here is a simplified version of your search:

Saved search results. 

 Name: 'lowells2' 
 Query Terms: 'index=_internal sourcetype=splunkd component=ExecProcessor "message from" NOT (splunk-regmon OR splunk-wmi) | rex " - message from \"(?[^\"]+)\"" ' 

Note: the search query is not truncated like yours (I just shortened the search).
However, also note that it is not the correct query either:

| rex " - message from \"(?[^\"]+)\"" vs | rex " - message from \"(?<script>[^\"]+)\""


Super Champion

Thanks for checking into this. BTW, I've come up with a temporary workaround and posted it as a separate answer.


Super Champion

Looks like the problem was in my rex command:

| rex " - message from \"(?<script>[^\"]+)\""

Note that it contains <script>, which is an HTML tag. Somewhere along the way, either in Splunk or in a mail server, this tag causes the body of the email to be truncated. I don't know whether this behavior is intentional, but it's easy enough to work around. I'm guessing there are security implications here.


This search works fine:

index=_internal sourcetype=splunkd component=ExecProcessor "message from" NOT (splunk-regmon OR splunk-wmi OR /opt/splunk/etc/apps/unix/bin/*.sh)
| rex " - message from \"(?<inputscript>[^\"]+)\""
| rename inputscript as script
| rex mode=sed "s/^.* - message from \"[^\"]+\" (.*)$/\1/"
| transaction fields="host,source,script" maxpause=45s
| dedup host, script, _raw
| fields host, script

Side note: the email still contains the wrong literal search string. The rex command that extracts the "inputscript" field shows up in my email as: rex " - message from \"(?[^\"]+)\"". So don't expect to copy-and-paste the search from the email and have it work. I looked around in sendemail.py, and it looks like these fields don't get passed through saxutils.escape() the way the actual result values do. Whoops.

I hope this helps other people. Avoid extracting a field called "script" using rex in your search if you want to email it...
