Activity Feed
- Posted Re: Question on shell script for linux on Splunk Dev. 09-17-2020 04:01 PM
- Karma Re: Question on shell script for linux for thambisetty. 09-17-2020 03:58 PM
- Posted Question on shell script for linux on Splunk Dev. 09-16-2020 03:54 PM
- Karma Re: KV Store Lookup to Email with timestamp search for thambisetty. 09-10-2020 03:07 PM
- Karma Re: Monitor Notable Event KV Store, Send Email via scheduled search for thambisetty. 09-10-2020 03:06 PM
- Karma KV Store Lookup to Email with timestamp search for thambisetty. 09-10-2020 12:09 AM
- Posted Re: KV Store Lookup to Email with timestamp search on Splunk Enterprise Security. 09-10-2020 12:08 AM
- Posted KV Store Lookup to Email with timestamp search on Splunk Enterprise Security. 09-09-2020 11:42 PM
- Posted Monitor Notable Event KV Store, Send Email via scheduled search on Reporting. 09-09-2020 05:01 PM
- Karma Re: Send email on Notable Event close action for Jhunter. 09-06-2020 03:09 PM
- Karma Re: Adding manually downloaded Threat Intel file into Splunk ES for starcher. 09-06-2020 03:08 PM
- Posted Send email on Notable Event close action on Splunk Enterprise Security. 09-02-2020 03:55 PM
- Posted Adding manually downloaded Threat Intel file into Splunk ES on Splunk Enterprise Security. 09-01-2020 04:18 PM
09-18-2020 01:06 AM
If your API doesn't have auth, then you can call the API directly from Splunk Enterprise Security. You can refer to fields by their field names. Look at existing feeds to get an idea.
09-10-2020 01:07 AM
1 Karma
Duplicate question: https://community.splunk.com/t5/Splunk-Enterprise-Security/KV-Store-Lookup-to-Email-with-timestamp-search/m-p/518826/highlight/false#M9248
09-10-2020 01:06 AM
1 Karma
The `es_notable_events` lookup is updated every 5 minutes from the saved search "ESS - Notable Events". So if you use `es_notable_events` to send an email notification on a notable's status change, you have to wait 1-5 minutes for that change to be reflected in the lookup. Instead, you can run the underlying search from that saved search directly, which is what the solution below does.

First, write the full output of the search below to a lookup table, adding a new field `updated_time` set to `now()` (assuming, for now, that we have already sent an email for everything). The time range depends on how long you want to keep track of notable events: `es_notable_events` only holds the last 48 hours of events, so if every notable's status changes within 48 hours you could use a 48-hour range, but I believe resolving a notable/incident can take longer than that. This lookup will keep growing over time as it tracks the changes we need to email about.

```
`notable`
| search NOT `suppression`
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical")
| expandtoken rule_title
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control
| eval updated_time=now()
| outputlookup email_tracker_for_status_change.csv
```

Then schedule the following search, for example every 10 minutes (it's up to you how frequently to run it). It appends the intermediate lookup (`email_tracker_for_status_change.csv`) created above and writes the updated results back to the lookup:

```
`notable`
| search NOT `suppression`
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical")
| expandtoken rule_title
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control
| eval updated_time=now()
| append [| inputlookup email_tracker_for_status_change.csv]
| eventstats max(updated_time) as latest_updated_time dc(status) as dc_status by event_id
| where updated_time=latest_updated_time
| outputlookup email_tracker_for_status_change.csv
| where dc_status=2
| fields - dc_status,latest_updated_time
```

The final search is the same; create an alert from it, schedule it, and test it before moving to prod:

```
`notable`
| search NOT `suppression`
| eval timeDiff_type=case(_time>=relative_time(now(), "-24h@h"),"current", 1=1, "historical")
| expandtoken rule_title
| table _time,event_id,security_domain,urgency,rule_name,rule_title,src,dest,src_user,user,dvc,status,status_group,owner,timeDiff_type,governance,control
| eval updated_time=now()
| append [| inputlookup email_tracker_for_status_change.csv]
| eventstats max(updated_time) as latest_updated_time dc(status) as dc_status by event_id
| where updated_time=latest_updated_time
| outputlookup email_tracker_for_status_change.csv
| where dc_status=2
| fields - dc_status,latest_updated_time
```
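To see why the tail of the pipeline (`eventstats`, the two `where` clauses) isolates only the notables whose status actually changed, here is the same dedup-and-diff logic sketched in plain Python. The event IDs, statuses, and timestamps are made up purely for illustration:

```python
from collections import defaultdict

# Each row mimics one lookup row: (event_id, status, updated_time).
# The first two rows stand in for the earlier lookup contents; the
# last two stand in for fresh results appended by the scheduled search.
rows = [
    ("E1", "new", 100),
    ("E2", "in progress", 100),
    ("E1", "closed", 200),
    ("E2", "in progress", 200),
]

# eventstats max(updated_time) as latest_updated_time dc(status) as dc_status by event_id
latest = defaultdict(int)
statuses = defaultdict(set)
for event_id, status, t in rows:
    latest[event_id] = max(latest[event_id], t)
    statuses[event_id].add(status)

# where updated_time=latest_updated_time -> keep only the newest row per event_id
current = [r for r in rows if r[2] == latest[r[0]]]

# where dc_status=2 -> keep only notables whose status differs between runs
changed = [r for r in current if len(statuses[r[0]]) == 2]
print(changed)  # only E1 changed status, so only E1 should trigger an email
```

This is why the `outputlookup` sits before the `where dc_status=2`: the lookup keeps the full current snapshot, while the alert fires only on the rows that changed.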
09-04-2020 02:43 PM
1 Karma
The only thing I can think of is a new correlation search (or a scheduled search, i.e. an alert with email as the trigger action) that looks at `incident_review.csv` (or the `incident_review` macro, which has better context) and tracks notables whose status changes from 1 to 5. One way, without thinking through the logic too deeply, is to create a new CSV of all notables with an unclosed status (taken from `incident_review.csv`). Run the search every 5-15 minutes (it shouldn't be resource intensive) and use a `lookup` command against `incident_review.csv` to find where one of those unclosed notables has changed to a closed status. Hope this helps.
09-04-2020 05:55 AM
1 Karma
Unfortunately, there is no built-in download like that. You'd need to write some code (e.g. in Python) to download the file, ensure it is in the right CSV format, and THEN tell ES to monitor/ingest that file.
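As a minimal sketch of that approach: fetch a plain-text indicator feed and rewrite it as a one-column CSV that you can then point ES at. The feed URL, output path, and the `ip` column name are all hypothetical assumptions, not anything from this thread; adjust them to your feed and to the intel field you map in ES.

```python
import csv
import urllib.request

def parse_indicators(body: str) -> list:
    """Keep non-empty, non-comment lines from a plain-text indicator feed."""
    return [line.strip() for line in body.splitlines()
            if line.strip() and not line.strip().startswith("#")]

def download_feed_to_csv(url: str, out_path: str) -> int:
    """Fetch a newline-delimited feed and write it as a one-column CSV
    suitable for manual ingestion into Splunk ES as local threat intel."""
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    indicators = parse_indicators(body)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ip"])  # hypothetical header; must match your ES intel field
        writer.writerows([ioc] for ioc in indicators)
    return len(indicators)

# Parsing example (no network needed):
feed = "# sample feed\n1.2.3.4\n\n5.6.7.8\n"
print(parse_indicators(feed))  # ['1.2.3.4', '5.6.7.8']
```

Schedule the script with cron (or a Splunk scripted input), then configure ES to ingest the resulting CSV.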