Splunk Search

How do you make a 'stats count' report where count > 20, grouped by another field?

mandyh
New Member

We need a report that lists the USERIDs that have more than 20 failed logins per DBNAME (a failed login is RETURNCODE!=0). In the report I would like to see not just USERID, RETURNCODE, DBNAME, and the count, but also the USERHOSTs the failures came from.

However, when I include USERHOST in the query, it misses cases where the failures are split across hosts, for example when 15 failed logins come from HOST A and 15 come from HOST B:

ACTION="100" RETURNCODE!=0 | stats count by DBNAME, USERID, RETURNCODE, USERHOST | search count>=20

The above query will only include users that have 20 or more failed logins from the same host.

I can omit USERHOST, and the report then includes the combined total, but then we have to run separate reports to see which USERHOSTs were involved:

ACTION="100" RETURNCODE!=0 | stats count by DBNAME, USERID, RETURNCODE | search count>=20

Any help would be greatly appreciated. TIA!
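For readers hitting the same problem: one alternative is eventstats, which attaches the combined per-user total to each per-host row, so USERHOST stays a row field while the filter still applies to the combined count. A sketch along those lines (the field name total_count is illustrative, not from this thread):

ACTION="100" RETURNCODE!=0 | stats count by DBNAME, USERID, RETURNCODE, USERHOST | eventstats sum(count) as total_count by DBNAME, USERID, RETURNCODE | search total_count>=20

Here stats still counts per host, but eventstats sums those counts across hosts for each DBNAME/USERID/RETURNCODE group, so a user with 15 failures from HOST A and 15 from HOST B passes the >=20 filter on both rows.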


Vijeta
Influencer

Try this

ACTION="100" RETURNCODE!=0 | stats count, values(USERHOST) as USERHOST by DBNAME, USERID, RETURNCODE | search count>=20
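For context: values(USERHOST) collects every distinct host into a multivalue field, so the count stays aggregated across hosts while the hosts themselves remain visible in the results. If you also want to know how many distinct hosts were involved, dc() (distinct count) can be added; a sketch extending the accepted answer (host_count is an illustrative name):

ACTION="100" RETURNCODE!=0 | stats count, values(USERHOST) as USERHOST, dc(USERHOST) as host_count by DBNAME, USERID, RETURNCODE | search count>=20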



mandyh
New Member

Thank you! That query works and reports what we need.
