Splunk Search

Print searches with zero results

jfraley
Explorer

I have the following search:

index=ldap_csv | rename uid AS user, extracted_host AS host
| join user [ search source=/var/log/secure vendor_action="Accepted"
  | stats count AS total_logins, earliest(_time) AS first_login latest(_time) AS last_login BY host user
  | eval first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
  | eval last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p") ]
| table user host first_login last_login total_logins

that returns exactly what I need, except that it drops the results with zero total_logins. I have read and tried a number of solutions, but cannot get the results I need. Any guidance here would be appreciated.
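(For context on why those rows vanish: Splunk's `join` performs an inner join by default, so user/host pairs with no match inside the subsearch are dropped entirely. A minimal sketch of one fix, assuming the field names above: `type=left` keeps the unmatched rows, and `fillnull` turns their missing count into a zero.)

```
index=ldap_csv
| rename uid AS user, extracted_host AS host
| join type=left user [ search source=/var/log/secure vendor_action="Accepted"
  | stats count AS total_logins BY user ]
| fillnull value=0 total_logins
| table user host total_logins
```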

1 Solution

woodcock
Esteemed Legend

Try this:

(index=ldap_csv) OR (index=YouShouldAlwaysSpecifyAnIndex AND sourcetype=AndSourcetypeToo AND source=/var/log/secure AND vendor_action="Accepted")
| eval user=if(index=="ldap_csv", uid, user)
| eval host=if(index=="ldap_csv", extracted_host, host)
| stats count AS total_logins, min(_time) AS first_login max(_time) AS last_login dc(index) AS index_count values(index) AS indexes BY host user
| search indexes="ldap_csv"
| fieldformat first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| fieldformat last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| table user host first_login last_login total_logins index_count
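(Why this works: the OR clause pulls both data sets in a single pass, stats groups them by user/host, and the indexes="ldap_csv" filter keeps only pairs present in LDAP. Because the LDAP seed events are included in the count, an index_count of 1 means the pair had no real logins, and total_logins is inflated by the seed. If you want an exact zero-based count, one extra line could be appended after the stats; this assumes each user/host pair appears exactly once in ldap_csv, which the original answer does not state.)

```
| eval total_logins=if(index_count==1, 0, total_logins - 1)
```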


jfraley
Explorer

Thanks, with just a few modifications I have exactly what I was looking for.

This is what I now have:

(index=ldap_csv) OR (index=syslog_main AND sourcetype="linux_secure" AND source=/var/log/secure AND vendor_action="Accepted")
| eval user=if(index=="ldap_csv", uid, user)
| eval host=if(index=="ldap_csv", extracted_host, host)
| stats count AS total_logins, min(_time) AS first_login max(_time) AS last_login dc(index) AS index_count values(index) AS indexes BY host user
| search indexes="ldap_csv"
| fieldformat first_login = strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| fieldformat last_login = strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| eval total_logins=total_logins-1
| eval first_login=if(first_login==last_login,"",first_login)
| eval last_login=if(first_login=="","",last_login)
| table user host first_login last_login total_logins index_count
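(The last three eval lines compensate for the seed events: each user/host pair contributes one ldap_csv event, which inflates total_logins by one and can make first_login/last_login reflect the seed event's own timestamp. An alternative sketch, assuming the same index and field names, uses conditional aggregates so that only syslog events are counted and no correction is needed:)

```
(index=ldap_csv) OR (index=syslog_main AND sourcetype="linux_secure" AND source=/var/log/secure AND vendor_action="Accepted")
| eval user=if(index=="ldap_csv", uid, user)
| eval host=if(index=="ldap_csv", extracted_host, host)
| stats sum(eval(if(index=="syslog_main",1,0))) AS total_logins,
    min(eval(if(index=="syslog_main",_time,null()))) AS first_login,
    max(eval(if(index=="syslog_main",_time,null()))) AS last_login,
    values(index) AS indexes BY host user
| search indexes="ldap_csv"
| fieldformat first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| fieldformat last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| table user host first_login last_login total_logins
```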


jfraley
Explorer

I was able to get the results I needed using:

index=syslog_main sourcetype="linux_secure" vendor_action="Accepted"
| stats count AS found BY user host
| appendcols [ search index="ldap_csv" | rename uid AS user, extracted_host AS host ]
| table user host found


jfraley
Explorer

I actually pasted the wrong search. This is what I have:

index="syslog_main" sourcetype=linux_secure vendor_action="Accepted"
    [ search index=ldap_csv | fields uid extracted_host | rename uid AS user, extracted_host AS host ]
| stats count AS total_logins, earliest(_time) AS first_login latest(_time) AS last_login BY host user
| eval first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| eval last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| table user host first_login last_login total_logins

I only want the users and hosts from index=ldap_csv in my table; I also need to know when there are no entries in index=syslog_main for those users.


bandit
Motivator

Append a query that returns a list of all users.

Something like this for your first query:

<all-users query here> | table user | eval total_logins=0
| append [ search <query with actual events> ]
| stats count AS total_logins, min(_time) AS first_login max(_time) AS last_login BY host user index
| eval total_logins=total_logins-1
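A concrete version of that sketch, using the index and field names from this thread (assumptions: the syslog events already carry user and host fields, and each user/host pair appears once in ldap_csv). The stats pass merges each seed row with its real events, so the seed inflates the count by one; hence the subtraction at the end.

```
index=ldap_csv
| rename uid AS user, extracted_host AS host
| table user host
| append [ search index=syslog_main sourcetype="linux_secure" vendor_action="Accepted" ]
| stats count AS total_logins, min(_time) AS first_login max(_time) AS last_login BY host user
| eval total_logins=total_logins-1
| table user host first_login last_login total_logins
```

Because the seed rows carry no _time, pairs with no real logins end up with total_logins of 0 and empty first_login/last_login.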


jfraley
Explorer

I tried your suggestion and put the snippet in multiple places to see if I could get the zero-count rows, but I never got them.


bandit
Motivator

I switched the join to append.


woodcock
Esteemed Legend

Never use join. Try this:

(index=ldap_csv) OR (index=YouShouldAlwaysSpecifyAnIndex AND sourcetype=AndSourcetypeToo AND source=/var/log/secure AND vendor_action="Accepted")
| eval user=if(index=="ldap_csv", uid, user)
| eval host=if(index=="ldap_csv", extracted_host, host)
| stats count AS total_logins, min(_time) AS first_login max(_time) AS last_login BY host user index
| search index!="ldap_csv"
| fieldformat first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| fieldformat last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| table user host first_login last_login total_logins

jfraley
Explorer

woodcock, I like your format, but the results are not what I am looking for. I only want the table to contain the users and hosts from index=ldap_csv; your results returned a concatenated table. I also need to know when the user/host does not have any entries in the syslog_main index.


jfraley
Explorer

This is what I have now:

index="syslog_main" sourcetype="linux_secure" vendor_action="Accepted"
    [ search index=ldap_csv | fields uid extracted_host | rename uid AS user, extracted_host AS host ]
| stats count AS total_logins, earliest(_time) AS first_login latest(_time) AS last_login BY host user
| eval first_login=strftime(first_login, "%m/%d/%y - %I:%M:%S:%p")
| eval last_login=strftime(last_login, "%m/%d/%y - %I:%M:%S:%p")
| table user host first_login last_login total_logins


woodcock
Esteemed Legend

See my new answer.
