Any Advice on Updating a Historical KV Store for Vulnerability Data with Scheduled Search?


I have a use case in which I am attempting to create a historical KV Store, with key/field-value pairs like:

host = string (attempting to use as unique _key)

Vulnerabilities discovered on 1/1 = string

Vulnerabilities discovered on 1/8 = string

Vulnerabilities discovered on 1/15 = string


For the initial data load, I want to list all the vulnerability signatures discovered on each device by the weekly scan, delimited by ";", so that all signatures found per host are consolidated, and then update the collection after each weekly scan using the host as the _key value. This example query shows what I am initially attempting with the 1/1 scan output:


| stats values(signature) as Signatures by host

| eval Signatures=mvjoin(Signatures, "; ")

| rename Signatures as "Vulnerabilities discovered on 1/1"


As you can imagine, I get a semicolon-delimited list of vulnerability signatures for each host in our network.

What I am trying to figure out is how I can craft a scheduled weekly search to:

1. Append new vulnerability signatures to a host's record if that host already exists as a unique _key value in the KV Store

2. Create a new record when vulnerabilities are discovered on a new host (_key), so it is tracked in future scans

3. Bonus: dynamically rename Signatures to the scan date. In my experience, rename only accepts a static string value and cannot incorporate Splunk timestamp logic for dynamic field naming.
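On the bonus point, the closest thing I have found: rename only takes literal field names, but eval supports dynamic field names via curly-brace indirection, so the column name can be built from the scan date at search time. A minimal sketch (the date format string is my assumption):

```
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| eval scan_col="Vulnerabilities discovered on ".strftime(now(), "%m/%d")
| eval {scan_col}=Signatures
| fields - Signatures scan_col
```

Here {scan_col} creates a new field whose name is the value of scan_col, so each week's run produces a column named for that week's date.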


I feel like this should be rather easy to accomplish, but when I have tried messing around with append=true and subsearches to fold in each new scan, I have run into issues updating records while preserving the results from multiple earlier scan dates for each host. I would like to capture everything properly and then use outputlookup to update the original KV Store, maintaining a historical record of each weekly scan over time.
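The pattern I have been experimenting with looks roughly like this: append the existing collection to the new scan results, merge per _key, and write the whole thing back, which would cover both points 1 and 2. A sketch, where the collection name vuln_history and index=vuln_scans are placeholders for my actual names:

```
index=vuln_scans earliest=-7d
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| eval scan_col="Vulnerabilities discovered on ".strftime(now(), "%m/%d")
| eval {scan_col}=Signatures
| fields - Signatures scan_col
| eval _key=host
| append [| inputlookup vuln_history]
| stats values(*) as * by _key
| eval host=_key
| outputlookup vuln_history
```

The idea is that inputlookup pulls all existing records back in before the final outputlookup (run without append), so prior weeks' columns survive the rewrite, and brand-new hosts fall through as fresh records.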

Do I need to be more mindful of how I could be using outputlookup in conjunction with key_field=host/_key?
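For context, this is roughly how I have been invoking it, where key_field=host should make outputlookup use each host value as the record's _key (collection name is again a placeholder):

```
index=vuln_scans earliest=-7d
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| outputlookup append=true key_field=host vuln_history
```

My understanding is that with append=true, incoming rows whose key matches an existing record update that record while new keys are inserted, but I am not certain whether fields absent from the incoming row are preserved, which may be where my earlier scan-date columns are getting lost.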

