Splunk Search

Most recent event per host ( | Head 1)?

marquiselee
Path Finder

So I need to pull only the most recent event from each of 60+ hosts, and put them in a table.

I'm thinking something like " ...| head 1 per host " would do the job. Any thoughts?

1 Solution

sdaniels
Splunk Employee

This should help:

Look in the comments of this answer for gkanapathy's reply; it's similar to what you're thinking:
http://splunk-base.splunk.com/answers/22564/finding-last-event

Different approach:
http://splunk-base.splunk.com/answers/52891/most-recent-event-from-each-source
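
In case those links rot, here is a minimal sketch of the pattern they describe. Since Splunk returns events newest-first, `dedup host` keeps only the most recent event per host (the `index=*` base search is just a placeholder for your own):

```
index=*
| dedup host
| table host _time _raw
```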


l0gik
Explorer

I had to do a task very similar to this. I had to find the last mode of each district. I used the stats command as seen below:

index=a sourcetype=b MODE_CHANGE
| rename
    "CONTENT.m:DISTRICT_NAME" AS district,
    "CONTENT.m:INSTANCE_MODE" AS mode,
    "CONTENT.m:ALERT_TYPE" AS type
| stats latest(mode) AS latestMode, latest(type) AS latestType by district
| lookup subDivLookup.csv SubDiv_Name AS district OUTPUT SubDiv_ID
| table district SubDiv_ID latestMode latestType
| sort + district

In my instance there is an ALERT_TYPE value that denotes a MODE_CHANGE has occurred. I use the 'MODE_CHANGE' string after the sourcetype to filter for just those events. From there it is just a stats command to get the latest entry per group.
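
Applied to the original question (most recent event per host), the same `stats latest(...)` pattern looks roughly like this; `index=*` is a placeholder, and `latest()` returns the value from the most recent event in each group:

```
index=*
| stats latest(_time) AS lastTime, latest(_raw) AS lastEvent by host
| convert ctime(lastTime)
```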


elmiguelo123
Engager

An alert like this would be better

index=* [| inputlookup SourceInactivos.csv | fields Equipo | rename Equipo AS source]
| table host source _time
| eval dif=(now()-_time)/3600
| stats first(dif) AS Diferencia by host source
| sort 0 -Diferencia
| where Diferencia>3

With this you can alert on those assets that have not sent logs during the last 3 hours.

reswob4
Builder

I was looking through all the Q&A regarding this question (and there are quite a few) and, combining a couple of suggestions, I tried this search:

<search>  | eval lastseen=strftime(_time, "%b %d %Y %H:%M:%S") | stats first(lastseen) by host

(I'm on 6.5 for reference)

It makes a table as below (sorry for the formatting). Of course you can rename the columns as well:

host        first(lastseen) 
1.1.2.245   Jun 10 2017 21:56:21
1.1.255.1   Jun 16 2017 13:41:43
1.1.2.5     Jun 16 2017 10:35:29
1.2.1.1     Jun 16 2017 10:58:05
1.6.10.1    Jun 15 2017 07:36:17

That seems to be something like what you may want as well...

EDIT:

Adding some other conditions, enables you to turn it into a nicely formatted alert:

| eval lastseen=strftime(_time, "%b %d %Y %H:%M:%S") | eval since=now()-_time | search since<10800 | stats first(lastseen) by host | rename first(lastseen) as "Last Heard From On"

10800 seconds is three hours, BTW. Set that threshold and the alert frequency to whatever you want and you should be good to go.


marquiselee
Path Finder

This works, but it is inefficient/slow. The problem is that half of the hosts perform the vast majority of the cumulative tasks (millions), while other hosts can go days without performing even one task (hence no log event).

I'd prefer it if the search stopped scanning a host's logs once its most recent event has been found.
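
For exactly that "stop once found" behavior, the metadata command is usually the cheapest option, since it reads index metadata rather than scanning events. A sketch, assuming an index named main (swap in your own):

```
| metadata type=hosts index=main
| eval lastSeen=strftime(lastTime, "%b %d %Y %H:%M:%S")
| sort - lastTime
| table host lastSeen
```

A similar fast alternative is `| tstats latest(_time) AS lastTime where index=main by host`, which also avoids touching raw events.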
