Alerting

How do I write fields from a custom alert action to local disk?

redacted
Explorer

When an alert fires, I want to retrieve the available fields documented under EmailNotificationTokens:
http://docs.splunk.com/Documentation/Splunk/6.5.1/Alert/EmailNotificationTokens
and write them to a file on the localhost in CSV or JSON.

I have it writing to another index right now and that is working fine; however, writing to an index just recreates the original problem. I need the data written to a file in near real time so it can be sent to another server over scheduled scp connections.

I am close, but I just can't seem to find how to write to disk locally.
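For reference, a custom (modular) alert action script receives its payload as JSON on stdin when Splunk invokes it with --execute; the payload's "result" key holds the triggering result's fields. A minimal sketch of writing those fields to local disk as JSON lines (the output path and file name are illustrative, not from this thread):

```python
import json
import sys

def write_alert_fields(payload_json, out_path):
    """Append the alert's result fields from a modular alert action
    payload (JSON delivered on stdin) to a JSON-lines file on local disk."""
    payload = json.loads(payload_json)
    # "result" carries the first triggering result's fields;
    # "search_name" / "sid" identify the alert and its search job.
    record = {
        "search_name": payload.get("search_name"),
        "sid": payload.get("sid"),
        "fields": payload.get("result", {}),
    }
    with open(out_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

if __name__ == "__main__":
    # Splunk runs the alert action script as: script.py --execute
    if "--execute" in sys.argv:
        write_alert_fields(sys.stdin.read(), "/opt/alerts/alert_fields.jsonl")
```

Each triggered alert appends one line, so the file can be shipped and truncated on a schedule without locking concerns beyond the append itself.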

Thanks

1 Solution

gcusello
SplunkTrust
SplunkTrust

Hi redacted,
Do you need to write to local disk the fields from your EmailNotificationTokens, or the search results?
If it's the search results, you could add all the information you need to your search and then use the outputcsv command; the only limitation is that Splunk writes CSVs only to a fixed folder, $SPLUNK_HOME/var/run/splunk/csv.
If outputcsv solves your need, you can customize your output file name using something like this:

...| outputcsv append=true create_empty=true [search * | head 1 | eval query="monitor_splunk_".strftime(now(),"%Y_%m_%d") | fields query | format "" "" "" "" "" ""] singlefile=1
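Since outputcsv always lands under $SPLUNK_HOME/var/run/splunk/csv, the scheduled scp transfer the asker mentions could be a small cron-driven script along these lines (a sketch only; the destination host, path, and ".sent" marker convention are illustrative — the copy command is parameterized so the loop can be exercised locally):

```python
import glob
import os
import subprocess

def ship_csvs(src_dir, dest, copy_cmd=("scp", "-q")):
    """Copy finished outputcsv files to another host, then rename each
    shipped file with a .sent suffix so it is not transferred twice."""
    shipped = []
    for path in glob.glob(os.path.join(src_dir, "monitor_splunk_*.csv")):
        # copy_cmd defaults to scp; pass ("cp",) to test against a local dir
        if subprocess.call(list(copy_cmd) + [path, dest]) == 0:
            os.rename(path, path + ".sent")
            shipped.append(path)
    return shipped

# Intended cron usage (destination is illustrative):
# ship_csvs("/opt/splunk/var/run/splunk/csv", "user@archive-host:/data/exports/")
```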

If, instead, you need to write information about the triggered alerts to a file, you could use something like this:

index=_internal source="/opt/splunk/var/log/splunk/scheduler.log" result_count>0
| table _time thread_id app savedsearch_name result_count
| join savedsearch_name
    [| rest /services/saved/searches
     | dedup search
     | table author eai:acl.app title alert.severity is_scheduled id qualifiedSearch dispatch.earliest_time
     | rename dispatch.earliest_time AS timerange title AS savedsearch_name eai:acl.app AS app
     | fields author app savedsearch_name alert.severity timerange]
| lookup alert_severity.csv severity AS alert.severity OUTPUT Severity
| lookup alert_frequency.csv frequency AS timerange OUTPUT Frequency
| eval wpname=mvindex(split(savedsearch_name," "), 0)
| stats values(author) AS Author values(app) AS App values(Severity) AS Severity values(Frequency) AS Frequency sparkline count AS Alarms sum(result_count) AS Events by savedsearch_name
| sort -Events severity
| rename savedsearch_name AS "Name" sparkline AS Sparkline Frequency AS "Report Frequency"
| fieldformat Events=tostring(Events,"commas")

Bye.
Giuseppe

View solution in original post


redacted
Explorer

Thanks! This would also work; I did not think of the outputcsv option. I assume I could just keep appending to the same file to avoid a mess of files.

I did end up creating a bash script to pull the search ID; then, through the REST API, we pull down all the required fields and write them to a JSON file using the Python SDK.

This is fitting requirements and getting us the info in the best format for parsing to another system.
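For anyone taking the same route: a finished search job's results can be fetched from the REST endpoint /services/search/jobs/&lt;sid&gt;/results with output_mode=json, which returns an object whose "results" key is a list of field dictionaries. A hedged, stdlib-only sketch of that flow (the endpoint is real; host, token handling, and file names are illustrative assumptions):

```python
import json
import urllib.request

def results_to_jsonl(results_json):
    """Convert the JSON body returned by
    /services/search/jobs/<sid>/results?output_mode=json
    into JSON-lines text, one result row per line."""
    body = json.loads(results_json)
    return "".join(json.dumps(row) + "\n" for row in body.get("results", []))

def fetch_and_write(base_url, sid, token, out_path):
    # Illustrative fetch; a real script should verify TLS and handle errors.
    url = f"{base_url}/services/search/jobs/{sid}/results?output_mode=json&count=0"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        text = results_to_jsonl(resp.read().decode("utf-8"))
    with open(out_path, "w") as f:
        f.write(text)
```

JSON lines keep each result row independently parseable, which suits forwarding the file to another system for line-by-line ingestion.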
