Fellow Splunkers!
I am attempting to update fields within a lookup file, and fortunately there are only 2 fields. I am trying to keep track of the last login time to a system, which equates to user and login in a lookup table.
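For illustration, the lookup contents might look something like this (hypothetical values):
user,login
alice,01/15/2016 08:30:00
bob,01/14/2016 17:05:42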
Here is my search:
index=myIndex sourcetype=mySourcetype eventtype=userLogin | stats latest(_time) as login by user | convert ctime(login) as login | outputlookup server_logins.csv
Sometimes a new user will need to be appended to the user column, but mostly it will just be the login time that needs updating.
We are trying to keep a history of when each user last logged in for a couple of years by appending to this file, and use it to disable inactive accounts.
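Once the lookup is maintained, something like this could list the accounts to disable (a sketch; the 2-year threshold and the strptime format, which matches the ctime conversion above, are my assumptions):
| inputlookup server_logins.csv
| where strptime(login, "%m/%d/%Y %H:%M:%S") < relative_time(now(), "-2y")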
I've read this answer, though I can't seem to get it to work for me, as you can see from my attempt above.
I know that once I get this right, I can just schedule the search to run every day to keep it up to date.
Any help would be greatly appreciated!
In similar situations I do something comparable to what aljohnson posted, but I don't use the append command. Rather, I leverage that ability within the inputlookup command itself:
index=myIndex sourcetype=mySourcetype eventtype=userLogin | stats latest(_time) as login by user | inputlookup append=t server_logins.csv | stats max(login) as login by user | outputlookup server_logins.csv
Note that login stays in epoch form here so that max() compares the values correctly. While not needed for this use case, you can take this a step further and purge old entries from the lookup within the same search. For example, if you wanted to drop users from this list after, say, 30 days, you could do something like:
index=myIndex sourcetype=mySourcetype eventtype=userLogin | stats latest(_time) as login by user | inputlookup append=t server_logins.csv | stats max(login) as login by user | eval current_time = now() | where login > (current_time - (86400 * 30)) | outputlookup server_logins.csv
This is a common problem, and the solution isn't necessarily intuitive. Here is what I would do:
- Use inputlookup to load the existing lookup and append the new search results - this is necessary if we ever want to update our existing data.
- Use stats latest(login) by user to collapse the combined results down to one row per user.
- Use outputlookup to save the results.
| inputlookup server_logins.csv
| append
    [ search index=myIndex sourcetype=mySourcetype eventtype=userLogin
    | stats latest(_time) as login by user
    | convert ctime(login) as login ]
| stats latest(login) as login by user
| outputlookup server_logins.csv
Now you can just schedule the search to run as often as you would like, adding earliest and latest to the subsearch to match.
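For instance, if the search runs once a day, the subsearch could be bounded to a rolling 24-hour window (the exact window below is an assumption; match it to your schedule):
| inputlookup server_logins.csv
| append
    [ search index=myIndex sourcetype=mySourcetype eventtype=userLogin earliest=-24h@h latest=@h
    | stats latest(_time) as login by user
    | convert ctime(login) as login ]
| stats latest(login) as login by user
| outputlookup server_logins.csv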
If you are willing to write some code, I think that the KV store might be an even better alternative, since it supports direct update operations.
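For instance, with a KV store collection exposed through a lookup definition (hypothetically named user_logins_kv here), my understanding is that outputlookup append=true will update any record whose _key matches an existing one, so you can upsert per user - a rough sketch:
index=myIndex sourcetype=mySourcetype eventtype=userLogin
| stats latest(_time) as login by user
| lookup user_logins_kv user OUTPUT _key
| outputlookup append=true user_logins_kv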