Splunk Search

How do I format an alert inline table to control which fields are displayed?

Explorer

I am testing an alert which sends out an email when members are added to an Active Directory group. It works fine, but when there are mass changes, we get email floods.

I set the alert from per incident to a rolling window. This works, but the details are lost. So I added an inline table. I only want two fields in the table, but no matter what I change, I get seemingly all the fields.

Is there a way to control what fields are in the inline table?

My alert search is:

sourcetype="WinEventLog:Security" eventtype=wineventlog_security (EventCode=4728 OR EventCode=4756 OR EventCode=4732)

I tried adding pipes to "fields" and "table", but neither changed the email.
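For example, one variant I tried (a sketch using the two fields I actually want, _time and Message):

```
sourcetype="WinEventLog:Security" eventtype=wineventlog_security (EventCode=4728 OR EventCode=4756 OR EventCode=4732)
| table _time Message
```

Run manually, this returns only the two columns; the inline table in the alert email still shows everything.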

1 Solution

Explorer

I FOUND THE ANSWER/BUG!!

Short version: after lots of trials, code reading, and updates, I came across a post about cleaning up the format. The poster explains how to edit alert_actions.conf to change the command= line. I tried out his code, but nothing changed. However, upon restarting Splunk I took note of the output (I had normally ignored it, since the alert has run fine this way for years). Lo and behold, stanza errors in two files:

Invalid key in stanza [email] in /opt/splunk/etc/apps/search/local/alert_actions.conf, line 2: reportServerEnabled  (value:  0).
Invalid key in stanza [email] in /opt/splunk/etc/apps/search/local/alert_actions.conf, line 3: reportServerURL  (value:  ).
Invalid key in stanza [email] in /opt/splunk/etc/system/local/alert_actions.conf, line 2: reportServerEnabled  (value:  0).
Invalid key in stanza [email] in /opt/splunk/etc/system/local/alert_actions.conf, line 3: reportServerURL  (value:  ).

Once I commented out these lines and restarted Splunk (now with no errors), the inline fields option worked correctly. I'm not sure which key in which file caused the issue.
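For anyone who hits the same thing, the cleaned-up stanza looked roughly like this (a sketch; your other [email] settings will differ):

```
[email]
# reportServerEnabled = 0
# reportServerURL =
```

You can check where each key is coming from with splunk btool alert_actions list email --debug, which prints the file that supplies every setting.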


Esteemed Legend

Try this as a workaround until there is a bugfix:

.... | table _time Message
| outputcsv TempFile.csv
| fields - Account_Domain Account_Name CategoryString ComputerName EventCode Group_Domain Group_Name Logon_ID Security_ID SourceName Target_Server_Name _raw action app change_type dest_nt_host host index linecount name object_category product result signature signature_id source sourcetype splunk_server
| where ThisFieldDoesNotExist="So This Will Drop All Events"
| appendpipe [| inputcsv TempFile.csv]

Explorer

Even this did not work. As a test, I ran a manual search and ended it with:

| table _time Message | sendemail sendresults=true inline=true

The email that was sent was correct: it included only the two fields. I get the feeling that either sendemail is a different code path, or the alert process is not passing the piped commands through to the email process.

Since "| sendemail" works, is there a way to run a scheduled alert that runs sendemail only if the result count is greater than zero? I don't want "per result"; an hourly or sub-hourly email report of group additions since the last interval would do, with no email if nothing was added.
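What I have in mind would look roughly like this in savedsearches.conf (a sketch; the stanza name, schedule, and address are placeholders):

```
[AD Group Addition Digest]
search = sourcetype="WinEventLog:Security" eventtype=wineventlog_security (EventCode=4728 OR EventCode=4756 OR EventCode=4732) | table _time Message
enableSched = 1
cron_schedule = 0 * * * *
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = admins@example.com
action.email.inline = 1
action.email.sendresults = 1
```

With counttype, relation, and quantity set this way, the email action should only fire when the search returns at least one result, so quiet intervals generate no mail.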

Thanks for all the help, btw.


Esteemed Legend

None of these gymnastics should be necessary, but try this:

.... | fields - Account_Domain Account_Name CategoryString ComputerName EventCode Group_Domain Group_Name Logon_ID Security_ID SourceName Target_Server_Name _raw action app change_type dest_nt_host host index linecount name object_category product result signature signature_id source sourcetype splunk_server

Explorer

Seemingly the same result:

Account_Domain Account_Name CategoryString ComputerName EventCode Group_Domain Group_Name Logon_ID Message Security_ID SourceName Target_Server_Name _raw _time action app change_type dest_nt_host host index linecount name object_category product result signature signature_id source sourcetype splunk_server subject tag::app vendor vendor_privilege

I also watched the search in real time, and the search results showed only the limited fields. It seems to be an issue with the send-email alert action's "inline table" function. I'm not aware of any modifications to sendemail.py, which seems to be a common place to change send-email behavior. Also note, I am running 6.3.2 on CentOS 6.4 or 6.5. We are looking to upgrade to 6.4 soon; perhaps this is a bug of some kind? I don't want to upgrade blindly just to see if it helps, but this may accelerate the upgrade, or decelerate this alert change.


Esteemed Legend

It is clearly a bug, so I would open a support case. I had hoped there would be a workaround, though.


Esteemed Legend

Show the entire search. Adding ... | table field1 field2 definitely should have done the trick.


Explorer

When I add | table _time Message, I get the following fields when inline table is selected:

Account_Domain Account_Name CategoryString ComputerName EventCode Group_Domain Group_Name Logon_ID Message Security_ID SourceName Target_Server_Name _raw _time action app change_type dest_nt_host host index linecount name object_category product result signature signature_id source sourcetype splunk_server
