
Can anyone provide a good search to monitor after-hours employee logins?

New Member

I need a good search to monitor after-hours employee logins. Can anyone help, please?


Ultra Champion

Do you want to search against raw events (if so, what does your data look like)? Or do you want to search against the (accelerated) Authentication data model, or something similar?

What is your definition of "after hours"? Do you need to take into account different time zones?
Do you need to take into account different locations using different work hours / days (e.g. middle east having a different weekend, certain locations/teams working shifts around the clock)?
Do you need to take into account public holidays (again: region dependent)?

What have you tried so far? Are you getting stuck with something specific?

Legend

Hi essibong1,
I created a lookup containing all the days of each year; then I call it in a macro that I use in many searches:

[out_working_time]
definition = | eval day=strftime(_time,"%d/%m/%Y")\
| lookup SIEMCAL.csv day OUTPUT type\
| search type=2 OR (type=1 (date_hour>14 OR (date_hour<7 AND date_minute<45))) OR (type=0 (date_hour>20 OR (date_hour<8 AND date_minute<45)))
iseval = 0

This way I manage working hours in one place, so I can easily modify them.

In my lookup I have working days (type=0), holidays, Saturdays and Sundays (type=2), and half working days (type=1).
In my example, the working time is 7.45 - 20.00 on working days and 7.45 - 14.00 on half working days.
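
For reference, a minimal sketch of what such a SIEMCAL.csv lookup file could look like (the dates are only examples, using the same %d/%m/%Y format as the macro's eval):

day,type
01/01/2024,2
02/01/2024,0
06/01/2024,2
24/12/2024,1

And a hypothetical call of the macro from a search (the index and EventCode are just examples; 4624 is the Windows successful-logon event code):

index=wineventlog EventCode=4624 `out_working_time`
| stats count BY user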

If you don't want to use the lookup and only need to filter on the time of day, you can use:

| search (date_hour<7 AND date_minute<45) OR (date_hour>20)

Ciao.
Giuseppe


Ultra Champion

date_* fields are a bit tricky to use for this.

  • there are various cases where these fields will not exist (I guess it is even a performance recommendation to disable them)
  • you can't use those fields when you use a datamodel tstats as the base of your query
  • these fields represent the literal strings as extracted from the RAW event. Whether or not they are useful depends heavily on how time zones are handled. If the timestamp in the raw event is in local time, it may actually be better to use these fields than the normalized _time field. But if the event's timestamp is in, for example, UTC, and events are collected from around the world, they will give incorrect results (e.g. a login at 21:00 UTC from a system in the US would be considered outside working hours (date_hour>20) but is actually during working hours (16:00 EST)).
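
If you start from the (accelerated) Authentication data model with tstats, one possible sketch is to bucket by _time and derive the hour with strftime, which renders _time in the time zone configured for the searching user (the field names below follow the CIM Authentication model; the thresholds are just examples):

| tstats count from datamodel=Authentication where Authentication.action="success" by Authentication.user _time span=1h
| eval hour=tonumber(strftime(_time,"%H"))
| where hour>=20 OR hour<7

Note that this still evaluates all events in a single time zone (the searcher's), so the caveat above about globally collected data still applies.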

Also the following is not correct:

| search (date_hour<7 AND date_minute<45) OR (date_hour>20)

This will ignore events that happened at e.g. 6:55 (because date_minute is not <45). If you want to show all events from before 7:45, you need to do something like `((date_hour=7 AND date_minute<45) OR date_hour<7)`.


Legend

Sorry, my mistake: the working time was 6.45 - 20.00.

Anyway, I haven't had any problems with date_hour and date_minute; it's also possible to use strftime(_time,"%H") and strftime(_time,"%M").
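
For a 6.45 - 20.00 working time, a strftime-based version could look like this sketch (the thresholds are taken from the example above; replace the base search with your own):

index=wineventlog EventCode=4624
| eval hour=tonumber(strftime(_time,"%H")), minute=tonumber(strftime(_time,"%M"))
| where hour>=20 OR hour<6 OR (hour=6 AND minute<45)

Unlike the date_* fields, strftime works on the parsed _time value, so it is also usable on top of tstats results.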

Ciao.
Giuseppe


Ultra Champion

Those fields are indeed fine, if they are there and you don't run into timezone issues.

Regarding the working time filter: I think you missed my point. By doing AND date_minute < 45 you'll never get any events back that happen in the last 15 minutes of each hour. With your filter, you get events from 0:00 - 0:44, 1:00 - 1:44, 2:00 - 2:44, 3:00 - 3:44, 4:00 - 4:44, 5:00 - 5:44, 6:00 - 6:44 and then 21:00 - 23:59.


SplunkTrust

Are you indexing login events? Which platform?

---
If this reply helps you, an upvote would be appreciated.