
Any Advice on Updating a Historical KV Store for Vulnerability Data with Scheduled Search?

jaspersplunkfu
Engager

I have a use case in which I am attempting to create a historical KV Store, with fields and values like:

host = string (attempting to use as unique _key)

Vulnerabilities discovered on 1/1 = string

Vulnerabilities discovered on 1/8 = string

Vulnerabilities discovered on 1/15 = string

 

For the initial data, I am trying to list all the vulnerability signatures discovered on each device from a weekly scan, delimited by ";", in order to consolidate everything found so far, and then update the store with each weekly scan's output using the host as the _key value. This example query shows what I am attempting to do initially with the scan output from 1/1:

index=vulnerabilitydata
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| rename Signatures as "Vulnerabilities discovered on 1/1"
| outputlookup HISTORICAL_VULN_KV

As you can imagine, this gives me a semicolon-delimited list of vulnerability signatures for each host in our network.

What I am trying to figure out is how I can craft a scheduled weekly search to:

1. Append new vulnerability signatures to each host if that host is already listed as a unique _key value within the existing KV Store (a rough sketch of what I mean follows this list)

2. Create a new record when vulnerabilities are discovered on a new host (_key), so it is tracked in future scans

3. Bonus: Dynamically rename Signatures to the scan date. In my experience, rename only accepts a static string value, with no way to incorporate Splunk timestamp logic for dynamic field naming.
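To make the goal concrete, this is roughly the shape of the weekly scheduled search I have in mind. It is only a sketch: I am assuming that pulling the existing collection back in with inputlookup, merging it with the new scan results via stats values(*) as * by host, and writing everything back with outputlookup is a reasonable way to do this, and that the eval {fieldname} trick will work for the dynamic column name in point 3.

index=vulnerabilitydata earliest=-7d@d latest=now
| stats values(signature) as new_signatures by host
| eval new_signatures=mvjoin(new_signatures, "; ")
| eval scan_date="Vulnerabilities discovered on " . strftime(now(), "%m/%d")
| eval {scan_date}=new_signatures
| fields - new_signatures, scan_date
| append
    [| inputlookup HISTORICAL_VULN_KV]
| stats values(*) as * by host
| outputlookup HISTORICAL_VULN_KV key_field=host

The idea is that hosts already in the collection pick up a new dated column alongside their existing ones (point 1), hosts that only appear in the new scan come through the append as brand-new rows (point 2), and the column name is built from the scan date at run time rather than with a static rename (point 3). I have not confirmed this actually behaves the way I expect.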

 

I feel like this should be rather easy to accomplish, but I have tried messing around with append=true and subsearches to get this to update new scan data for each host, and I keep running into issues with updating and maintaining results for multiple separate scan dates across each host. I would like to capture everything properly and then use outputlookup to update the original KV Store, maintaining a historical record of each weekly scan over time.

Do I need to be more mindful of how I am using outputlookup in conjunction with key_field=host or _key?
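For reference, a simplified version of what I have been trying looks something like this. I am not sure whether append=true with a matching key replaces the whole existing record or just adds the new field to it, which may be why I keep losing the earlier scan-date columns:

index=vulnerabilitydata earliest=-7d@d latest=now
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| rename Signatures as "Vulnerabilities discovered on 1/8"
| outputlookup HISTORICAL_VULN_KV key_field=host append=true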

 
