Very specific events related to searches have been deleted. How would someone go about determining who deleted them?
This provides a lot of info:
index=_internal sourcetype=splunkd_access
( method=POST OR method=DELETE )
( user!=splunk-system-user user!=- )
( uri_path=/servicesNS/* uri_path!="*/user-prefs/*" uri_path!="/servicesNS/*/*/*/jobs/*/control" uri_path!=/servicesNS/*/mobile_access* )
| replace "*/ui/views*" with "*/ui_views*", "*/props*" with "**", "*/distributed/peers*" with "*/distributed_peers*", "*/server/serverclasses*" with "*/server_class*" in uri_path
| where mvcount( split( uri_path , "/" ) ) > 6
| eval activity = case( method=="POST" AND like( uri_path , "%/acl" ) , "Permissions Update", method=="POST" AND NOT like( uri_path , "%/acl" ) , "Edited" , method=="DELETE" , "Deleted" )
| rex field=uri_path "/servicesNS(/[^\/]+){3}/(?<object_type>[^\/]+)/(?<object_name>[^\/]+)"
| eval object_name = urldecode( object_name )
| table _time, user, object_name, object_type, activity
From your trail of questions, and the fact that the audit trail is returning no hits in the time you think something occurred, I'd have to suggest two directions to go.
Either (1) the person who did this knows a HUGE HONKING LOT about Splunk, in which case you are better off looking for capability, opportunity and motive, or (2) your assumption that something occurred is flawed in some way.
Unless you left your admin password as changeme, and an external agent got in to muck around with your system, the second scenario is far and away more likely.
The idea that a person currently unknown, but basically a regular user on your system, first ran a problematic search, then deleted the search, then hid or eliminated evidence of the deletion, requires far more expertise and more steps than the idea that a bad setting somewhere, or an unknown aspect of Splunk, was the accidental cause of whatever you detected regarding the search.
As such, I'd suggest that you collect the symptoms of the disappearing search, or of the reason you think that a search existed in order to disappear, and ask a new question about how that could happen.
If you are referring to a user deleting data in an index, you can try the following. It assumes you are not referring to an internal index that begins with an underscore.
index=_internal sourcetype=splunkd_ui_access TERM(delete) q="*delete*"
| eval q = urldecode(q)
| where !match(q,"^search\s+index\s*=\s*_")
| table _time user earliest latest q
would there be a way to search for a user temporarily assigned "can_delete"?
I don't know if that is logged unless you have logging turned up to Debug for the appropriate logging component. Maybe someone else can point you to something else.
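That said, role and user edits made through Splunk Web or the REST API do pass through splunkd, so something like the following sketch may surface who edited a role or a user account (which is how "can_delete" usually gets granted) around the time in question. The uri_path patterns are assumptions about where those endpoints live in your deployment, and changes made by editing authorize.conf directly on disk will not appear here:

index=_internal sourcetype=splunkd_access method=POST
( uri_path="*/authorization/roles/*" OR uri_path="*/authentication/users/*" )
| table _time, user, uri_path, status

A hit on an authentication/users path shows a user record was modified (possibly a role assignment), but not which capabilities changed, so you would still need to correlate it with other evidence.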
While this search surfaces anyone entering "delete", there were no hits in the timeframe we were looking at. We will look into other methods of deleting events. Thanks for the response.
Do you mean dashboards / reports were deleted, or data was deleted?
Regardless, you can look in the _audit index and filter by action and user.
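As a minimal starting point for that, something like the sketch below gives an overview of who performed which audited actions; treat the wildcard filters as placeholders to narrow once you know which action values matter for your case:

index=_audit user=* action=*
| stats count by user, action
| sort - count

From there you can pivot to a table of the individual audit events for whichever user/action pair looks suspicious.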