Splunk Enterprise Security

KV Store Lookup to Email with timestamp search

Splunkometry88
Explorer

Hi Team

I am looking to confirm the SPL to poll a KV Store lookup (es_notable_events), detect when the status of a notable event changes, and then send an email. I would also like to check timestamps to work out which event is the newest.

 

The SPL I have working to send the email (most of the time) is below:

| inputlookup es_notable_events where status="1"
| sendemail to="email@address.com" format=raw subject="Splunk Notable Event" sendresults=true
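For the timestamp part, a rough sketch of what I'm thinking is below (assuming the lookup carries a _time field reflecting the notable's event time; the sort and head just grab the newest row):

| inputlookup es_notable_events where status="1"
| sort 0 - _time
| head 1
| sendemail to="email@address.com" format=raw subject="Splunk Notable Event" sendresults=true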

Appreciate any help 

 


thambisetty
SplunkTrust

Would you like to send an email whenever the status is updated, or only when the status is updated to 1?

————————————
If this helps, give a like below.

Splunkometry88
Explorer

Hi There!

 

I would like to send the email when the status is either 1 or 5, i.e. Open/Closed.

This is a clunky way of trying to interface with an external ticketing system.
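For reference, the minimal tweak to my current search for both statuses would presumably be something like this (assuming the inputlookup where clause accepts the OR as written):

| inputlookup es_notable_events where (status="1" OR status="5")
| sendemail to="email@address.com" format=raw subject="Splunk Notable Event" sendresults=true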


thambisetty
SplunkTrust

es_notable_events is a lookup that is updated every 5 minutes by the saved search "ESS - Notable Events". So if you rely on es_notable_events to send an email notification when a notable's status changes, you have to wait 1-5 minutes for that change to appear in the lookup. Instead, you can use the underlying search from that saved search directly, which is what I am doing in the solution below.

First, write the full output of the search below to a lookup table, adding a new field updated_time whose value is now(). For now we assume an email has already been sent for all of these notables.

The time range for the search below depends on how long you want to keep track of notable events. The es_notable_events lookup only holds the last 48 hours of events; if you think every notable's status will change within 48 hours, you can set the time range to 48 hours, but I suspect resolving a notable/incident can take longer than that. Also note that this lookup will keep growing, since it tracks the changes for which we need to send an email (see the housekeeping sketch after the first search below).

`notable` 
| search NOT `suppression` 
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical") 
| expandtoken rule_title 
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control
| eval updated_time=now()
| outputlookup email_tracker_for_status_change.csv
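Side note: because email_tracker_for_status_change.csv will keep growing, a possible housekeeping search is sketched below. The 30-day retention window is just an illustrative assumption, not part of this solution; updated_time here is the epoch value written by now():

| inputlookup email_tracker_for_status_change.csv
| where updated_time >= relative_time(now(), "-30d@d")
| outputlookup email_tracker_for_status_change.csv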

Next, schedule the following search. I am assuming it runs every 10 minutes, but it is up to you how frequently you want to run it.

`notable` 
| search NOT `suppression` 
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical") 
| expandtoken rule_title 
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control

Then append the intermediate lookup (email_tracker_for_status_change.csv) that we created earlier and write the updated results back to that lookup. The dc_status=2 filter at the end keeps only the event_ids that now appear with two distinct status values, i.e. the notables whose status has changed since the tracker was last written:

| eval updated_time=now()
| append [|inputlookup email_tracker_for_status_change.csv]
| eventstats max(updated_time) as latest_updated_time dc(status) as dc_status by event_id
| where updated_time=latest_updated_time
| outputlookup email_tracker_for_status_change.csv
| where dc_status=2
| fields - dc_status,latest_updated_time

 

The final search would be as follows. You can create an alert from it and schedule it, and you can test it before moving it to prod:

`notable` 
| search NOT `suppression` 
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical") 
| expandtoken rule_title 
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control

| eval updated_time=now()
| append [|inputlookup email_tracker_for_status_change.csv]
| eventstats max(updated_time) as latest_updated_time dc(status) as dc_status by event_id
| where updated_time=latest_updated_time
| outputlookup email_tracker_for_status_change.csv
| where dc_status=2
| fields - dc_status,latest_updated_time
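
Tying this back to the original question: if you would rather send the email from within the SPL instead of using the alert's email action, a minimal sketch (reusing the sendemail settings from the original post; the subject text is only an illustration) is to append sendemail after the final two lines:

| where dc_status=2
| fields - dc_status,latest_updated_time
| sendemail to="email@address.com" format=raw subject="Splunk Notable Event" sendresults=true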

 

————————————
If this helps, give a like below.