I believe this is tied to the threatmatch modular input, which dispatches a search that uses regex to parse the domain out of the URL, for example in the http_collection. You can see that in action by opening the collection with | inputlookup http_collection and paying attention to each URL and how the domain gets extracted from it. The out-of-the-box Threat Gen search is not actually responsible for matching domain IOCs against the Web.url field; it's the "Threat Matching" tab within Threat Intelligence Management that provides the match configuration logic for "domain" against Web.url in that data model. I am unsure whether this configuration has changed out of the box from version to version, but the search just allows that logic to create notable events based on how the threat match is configured.
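As a quick sanity check, you can compare the stored domain against your own extraction directly on the collection. This is only a sketch: the regex below is illustrative, not necessarily the exact pattern the threatmatch input uses, and the field names may differ in your collection.

```
| inputlookup http_collection
| rex field=url "https?://(?<extracted_domain>[^/:]+)"
| table url, domain, extracted_domain
```

If extracted_domain and the stored domain line up row by row, the modular input's parsing is behaving as expected for your data.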
I noticed this is inconsistent as well, despite dest_ip/src_ip clearly being present in the search and in the logs. I am curious whether it has something to do with src_ip being present in the raw log versus being mapped at search time by the automatic lookups that ship with ES out of the box, which attempt to map it to an ES asset. I was hoping this functionality would work, but I am having to rely more on dest/src, which seem to behave as expected.
I have a use case in which I am attempting to create a historical KV Store, with key field value pairs as:
host = string (attempting to use as unique _key)
Vulnerabilities discovered on 1/1 = string
Vulnerabilities discovered on 1/8 = string
Vulnerabilities discovered on 1/15 = string
For the initial data, I am trying to list out all the vulnerability signatures discovered on each device from a weekly scan, delimited by ";", in an effort to consolidate all the signatures discovered, and then update this for each output of the weekly scan by using the host as the _key value. This example query shows what I am attempting to do initially with the scan output from 1/1:
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| rename Signatures as "Vulnerabilities discovered on 1/1"
| outputlookup HISTORICAL_VULN_KV
As you can imagine, I get a list of vulnerability signatures in a multivalue field for each host in our network.
What I am trying to figure out is how I can craft a scheduled weekly search to:
1. Append new vulnerability signatures to each host, if a host is already listed as a _key unique value within the existing KVStore
2. Create a new record if a new host (_key) has discovered vulnerabilities to start tracking this for future scans
3. Bonus: Dynamically rename Signatures to the scan date. In my experience, rename only accepts a static string value, with no way to incorporate Splunk timestamp logic for dynamic field naming.
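One way to cover all three points in a single weekly scheduled search is to read the existing KV store back in, append this week's results under a dynamically named field, merge by host, and write everything back. This is a sketch, and the base search (index=vuln_scans) is a hypothetical stand-in for your real scan data source:

```
index=vuln_scans earliest=-7d@d latest=@d
| stats values(signature) as new_sigs by host
| eval scan_field="Vulnerabilities discovered on ".strftime(now(), "%m/%d")
| eval {scan_field}=mvjoin(new_sigs, "; ")
| fields - new_sigs scan_field
| append
    [| inputlookup HISTORICAL_VULN_KV ]
| stats first(*) as * by host
| eval _key=host
| outputlookup HISTORICAL_VULN_KV
```

The `eval {scan_field}=...` construct addresses the bonus item: curly braces in eval substitute the value of scan_field as the destination field name, which gives you the dynamic naming that rename cannot. The `stats first(*) as * by host` merge keeps prior weeks' columns for hosts already in the store and naturally creates rows for new hosts, and writing the full merged set back with a plain outputlookup (no append) sidesteps having to reason about how append=true interacts with existing keys.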
I feel like this should be rather easy to accomplish, but I have tried messing around with append=true and subsearches to update new scan data for each host, and I keep running into issues with updating and maintaining results for multiple separate scan dates across each host. I would like to capture everything properly and then use outputlookup to update the original KV store, maintaining a historical record for each weekly scan over time.
Do I need to be more mindful of how I am using outputlookup in conjunction with key_field=host/_key?