Alerting

How to generate an alert when a user logs in for the first time

bgagliardi1
Path Finder

Is there a way to generate 1 alert for the first time a user logs into something?

I've been thinking through this all morning and came up with a potential approach: I sort my search by _time so that the first event is at the top, then remove all other "duplicate" events for the same email (username) field. If someone knows how to generate one, and only one, alert per unique event based on the raw data, that could work too.

index=myindex "actor.email"=* "events{}.name"=login_success | bucket _time span=1d | iplocation ipAddress | stats count by _time,"actor.email","events{}.name",ipAddress,City,Region,Country | where count=1 | sort _time | dedup actor.email

I recognize that the bucket _time is useless in this case ;).
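For reference, a trimmed sketch of that dedup-first idea (same field names as above, assumed from my data; note it only catches first logins that fall inside the search window):

```
index=myindex "actor.email"=* "events{}.name"=login_success
| sort 0 _time
| dedup "actor.email"
| iplocation ipAddress
| table _time "actor.email" ipAddress City Region Country
```

Sorting ascending before dedup keeps each user's earliest event rather than their most recent one.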

Thanks,

BG


DalJeanis
Legend

One way to do this is to run an all-time search once to seed a lookup (the bin span assumes you will thereafter run the incremental search every hour)...

index=foo search that gets all user logons
| bin _time as reporttime span=1h
| stats min(_time) as mintime min(reporttime) as reporttime by userfield
| outputlookup append=f mylookup.csv

Then every hour you run something like this...

earliest=-1h@h latest=@h index=foo (your search that gets logons)
| bin _time as reporttime span=1h
| stats min(_time) as mintime min(reporttime) as reporttime by userfield
| inputlookup append=t mylookup.csv
| stats min(mintime) as mintime min(reporttime) as reporttime by userfield
| outputlookup append=f mylookup.csv

| rename COMMENT as "now we find which ones were new. They will have rounded down to be equal to the earliest parameter"
| addinfo
| where reporttime = info_min_time



jconger
Splunk Employee

Check out the Splunk Security Essentials app (it's free). There are a lot of great examples in there, including "First Time Seen Searches". From the documentation:

"First time analysis detects the first time that an action is performed. This helps you identify out of the ordinary behavior that could indicate suspicious or malicious activity. For example, service accounts typically log in to the same set of servers. If a service account logs into a new device one day, or logs in interactively, that new behavior could indicate malicious activity."


xpac
SplunkTrust

Well, I think I would generate a lookup and write all users to it. Then, every X minutes, run another search over the last X minutes to fetch all users from that window, look them up, and create an alert for every user that was not in the lookup before.
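A rough sketch of that approach (the lookup file name, `user` field, and the mapping from the question's fields are assumptions):

```
index=myindex "events{}.name"=login_success
| stats min(_time) as first_seen by user
| lookup known_users.csv user OUTPUT user as already_known
| where isnull(already_known)
| outputlookup append=true known_users.csv
```

Only the users missing from `known_users.csv` survive the `where`, so those rows both trigger the alert and get appended to the lookup for next time.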
