Security

Query to find failed logins by domain admins

swamysanjanaput
Explorer

I was using the below search query to find failed logins by domain admins; however, I was asked not to use a lookup file because the client wants real-time data.

macro EventCode=4625
| stats count values(Workstation_Name) AS Workstation_Name values(Source_Network_Address) AS Source_IP_Address values(host) AS Host by Account_Name
| where count > 5
| search [| inputlookup xxxxxxx.csv | rex field=user "(?<domain>\w+)\W(?<Account_Name>\w+)" | table Account_Name]

So now I have two searches that I need to combine to get the real-time data; however, I am unable to do so and am not getting the correct results.
Query 1: To find failed logins for all users
macro EventCode=4625
| stats count values(Workstation_Name) AS Workstation_Name values(Source_Network_Address) AS Source_IP_Address values(host) AS Host by Account_Name
| where count > 5

Query 2: To fetch all domain admin details (FYI: we are not using the AD add-on in our environment to read AD directly)

index=xxxxxxxx
| rex field=distinguishedName "DC=(?<user_domain>[\w]+)"
| eval user_domain=upper(user_domain)
| eval user=user_domain + "\\" + sAMAccountName
| eval user=upper(user)
| eval lastLogonTimestamp=strptime(lastLogonTimestamp, "%Y-%m-%dT%H:%M:%S.%6N")
| stats latest(lastLogonTimestamp) as lastlogon latest(timeCollected) as timecollected values(memberOf{}) as memberof by user
| fields - lastlogon timecollected
| search memberof IN ("CN=xxxx*", "CN=xxxx,CN=xxxx*")

I now need to combine the query 1 and query 2 searches to get real-time data (i.e. failed logins by domain admins, with their details, in real time). Could anyone help me merge these two queries? Thanks in advance.


dmarling
Builder

How often is the data in query 2 written to those logs? Is it being written at the same time as the data that is being ingested by query 1? If so, the query can be written to combine the two requests into a single search that uses stats to join the data together by Account_Name. If not, we can still do that, but we will have to get a little more creative, and I'll need to know how frequently that data is ingested so we know how far back to look when creating the query.
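
In the meantime, here is a minimal sketch of that single-search approach, assuming your macro can simply be OR'd with the AD index in one base search, that the AD data falls inside the same time range, and that Account_Name in the 4625 events matches the uppercase DOMAIN\sAMAccountName value built in query 2 (index=xxxxxxxx and the CN filters are the placeholders from your post):

(macro EventCode=4625) OR (index=xxxxxxxx)
| rex field=distinguishedName "DC=(?<user_domain>[\w]+)"
| eval admin_user=upper(user_domain + "\\" + sAMAccountName)
| eval Account_Name=coalesce(upper(Account_Name), admin_user)
| stats count(eval(EventCode=4625)) AS count values(Workstation_Name) AS Workstation_Name values(Source_Network_Address) AS Source_IP_Address values(host) AS Host values(memberOf{}) AS memberof by Account_Name
| search memberof IN ("CN=xxxx*", "CN=xxxx,CN=xxxx*")
| where count > 5

The coalesce builds one common Account_Name key for both data sources, count(eval(EventCode=4625)) counts only the failed-login events, and the memberof filter keeps only accounts whose AD record puts them in your domain admin groups. If the AD data is written less often than the authentication events, you would need to widen the time range (or persist query 2's output somewhere) so the group membership is present when the stats runs.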

If this comment/answer was helpful, please upvote it. Thank you.