Alerting

How to pass alert results to custom alert action?

vinod94
Contributor

How do I pass my search output results to a custom alert action script (test.py) that has some static parameters? For example, I have a statistics table with two columns, Hosts and Count. I want to pass the results of these two columns to a static parameter, say Description.

I've come across SPLUNK_ARG_8 (sys.argv[8]) and tried it, but I am getting an error: " Alert script returned error code 1., search='sendalert test_dropdown results_file="/opt/splunk/var/run/splunk/dispatch/scheduler_adminsearch_RMD51340b9f59d2d65d1_at_1564127640_31/results.csv.gz " .

How do I use it in my script? Or is there any other way? Any suggestions? Below is part of my script:

import csv
import gzip
import json
import sys

import requests

def openany(p):
    if p.endswith(".gz"):
        return gzip.open(p)
    else:
        return open(p)

results_file = sys.argv[8]

for row in csv.DictReader(openany(results_file)):
    description = "Alert Triggered for " + row["component"] + " value is " + row["count"]

    # TODO: Implement your alert action logic here
    url = "https://ensrqbrq8xubd.x.pipedream.net"

    payload = json.dumps({"Description": description}).encode("utf8")

    headers = {
        "content-type": "application/json"
    }

    response = requests.request("POST", url, data=payload, headers=headers)
1 Solution

harsmarvania57
Ultra Champion

Hi,

Have a look at the Custom Alert Action documentation https://docs.splunk.com/Documentation/Splunk/7.3.0/AdvancedDev/ModAlertsIntro for how to set up a custom alert action.

Here is a sample script https://docs.splunk.com/Documentation/Splunk/7.3.0/AdvancedDev/ModAlertsBasicExample that Splunk has provided. When you look at the output, you can see results_file in the payload output. Read that file in your script, and then you can use that output data to do whatever else your requirement calls for.


vinod94
Contributor

Hi @harsmarvania57 ,

I've tried reading the file, but it's throwing me an error:

search='sendalert test_dropdown results_file="/opt/splunk/var/run/splunk/dispatch/scheduler_adminsearch_RMD51340b9f59d2d65d1_at_1564127640_31/results.csv.gz " 

harsmarvania57
Ultra Champion

You need to read that file in your Python script, not with a Splunk query.


vinod94
Contributor

I am reading it with the script.

Here's the sample:

 def openany(p):
     if p.endswith(".gz"):
         return gzip.open(p)
     else:
         return open(p)

 results_file = sys.argv[8]

 for row in csv.DictReader(openany(results_file)):
     description = "Alert Triggered for " + row["component"] + " value is " + row["count"]

Not sure if I am doing it the right way!


harsmarvania57
Ultra Champion

Here is the portion of the script which I have in my custom alert action; in alert_actions.conf, I have configured it to retrieve the payload in JSON format (payload_format = json):

import csv
import gzip
import json
import sys

if len(sys.argv) > 1 and sys.argv[1] == "--execute":
    # Splunk writes the JSON payload to stdin when payload_format = json
    payload = json.loads(sys.stdin.read())
    result_file = payload['results_file']
    with gzip.open(result_file, 'rb') as f:
        reader = csv.reader(f)
        header_line = next(reader)
        data = list(reader)
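Putting the pieces together, here is a minimal end-to-end sketch of the --execute handler: it reads the gzipped results file named in the stdin payload and POSTs one Description per row. It uses the stdlib urllib.request in place of the requests call from the question so the sketch has no third-party dependency; the component/count column names are taken from the earlier snippets, and the webhook URL is a stand-in.

```python
import csv
import gzip
import json
import sys
import urllib.request

def build_descriptions(results_file):
    """Read the gzipped results CSV and build one description per row."""
    with gzip.open(results_file, "rt") as f:  # "rt" = text mode on Python 3
        return [
            "Alert Triggered for " + row["component"] + " value is " + row["count"]
            for row in csv.DictReader(f)
        ]

def post_description(url, description):
    """POST a single Description as a JSON body to the webhook endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"Description": description}).encode("utf8"),
        headers={"content-type": "application/json"},
    )
    return urllib.request.urlopen(req)

if __name__ == "__main__" and len(sys.argv) > 1 and sys.argv[1] == "--execute":
    # Splunk runs the script as "script.py --execute" and, with
    # payload_format = json, writes the JSON payload to stdin.
    payload = json.loads(sys.stdin.read())
    for description in build_descriptions(payload["results_file"]):
        post_description("https://example.invalid/webhook", description)  # stand-in URL
```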

kmarx
Explorer

Just a note: I had to use gzip.open() with mode='rt' (text) instead of the default binary mode.
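For context: on Python 3, csv.reader expects str lines, while gzip.open() defaults to binary mode and yields bytes, so opening in text mode avoids the mismatch. A quick self-contained illustration with a throwaway gzipped CSV (the file contents are stand-in data):

```python
import csv
import gzip
import os
import tempfile

# Write a tiny gzipped CSV on the fly so the example is self-contained.
path = os.path.join(tempfile.mkdtemp(), "results.csv.gz")
with gzip.open(path, "wt", newline="") as f:
    csv.writer(f).writerows([["component", "count"], ["disk", "7"]])

# mode="rt" both decompresses and decodes to str, which csv.reader
# requires on Python 3; the default "rb" yields bytes and csv.reader
# raises "iterator should return strings, not bytes".
with gzip.open(path, "rt") as f:
    rows = list(csv.reader(f))

print(rows)  # [['component', 'count'], ['disk', '7']]
```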
