
Does Splunk meet 800-53 Audit Reduction requirements and a common Date-Time Stamp capability?

New Member

I am having trouble finding documentation that explicitly states Splunk's ability to perform audit reduction. I am also having difficulty finding out if Splunk meets the AU-8 requirement for a common DTG source.


Re: Does Splunk meet 800-53 Audit Reduction requirements and a common Date-Time Stamp capability?

SplunkTrust

First off, a disclaimer: I only barely know what I'm talking about on the first topic. And even less on the second.

You're right: it's unlikely you'll find a place where the docs "explicitly state" either of those, but perhaps the reasons I outline below will help you see why.

For the first control you mention, the control text says:

Audit reduction capability can include, for example, modern data
mining techniques with advanced data filters to identify anomalous behavior
in audit records.

And if you look at the 800-53 guidance for AU-7, it gives somewhat better information, which I quote here too (though I've trimmed a few "see also" references to keep it shorter):

An audit reduction and report generation capability provides support
for near real-time audit review, analysis, and reporting requirements
described in AU-6 and after-the-fact investigations of security incidents.
Audit reduction and reporting tools do not alter original audit records.
The information system provides the capability to automatically process
audit records for events of interest based on selectable event criteria.

These are not one precisely defined thing but more of a general strategy. Luckily, Splunk does this type of thing very well. Unluckily, it's probably hard to find documentation specifically saying "Splunk does that," because there's no built-in, one-stop "audit reduction" button to push. It's not hard to do these things - very easy, in fact - but you do have to work out your own searches and dashboards for your environment. Mostly that's just getting the logs in, clicking around and filtering, and then adding the results to dashboards.
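As a sketch of what one of those "work out your own" searches might look like (the index, sourcetype, and threshold here are made-up examples, not anything built in), an events-of-interest reduction over authentication logs could be:

```
index=os_logs sourcetype=linux_secure ("Failed password" OR "authentication failure")
| stats count AS failures, values(src_ip) AS sources BY host, user
| where failures > 20
| sort - failures
```

This reads the raw audit records without altering them, filters to selectable event criteria, and summarizes down to the anomalies worth chasing - which is essentially what AU-7 is describing. Saved as a report or dashboard panel, it becomes your "report generation capability" for that event class.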

When you refer back to AU-6, which is mentioned in both of those, you'll find that most of those activities, if not all, can be handled well in Splunk. For instance, "correlates audit records across different repositories" is one of Splunk's primary strengths.
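To give a flavor of that cross-repository correlation (again, the index names, sourcetype, and event code here are assumptions for illustration), a single search can pull from multiple repositories at once and correlate on a shared field:

```
(index=wineventlog EventCode=4625) OR (index=firewall sourcetype=cisco:asa)
| stats dc(index) AS repositories, count BY src_ip
| where repositories > 1
```

Here `dc(index)` counts how many distinct repositories each source address shows up in, so the results are only those actors seen across more than one audit record store.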

Anecdotally, another 3-letter agency's rather rigorous requirements around this same topic were well served by Splunk's capabilities. The auditors were very pleased with Splunk's ability to summarize, categorize, sift and filter, and ultimately to bring to prominence those anomalies that we needed to chase after.

On the second thing, I'm not sure exactly what you mean, but there are a couple of avenues to explore real quick.

First, Splunk doesn't "keep time" itself; it takes the time from the underlying OS. If you configure that OS to time-sync properly (e.g., against an authoritative NTP source, as AU-8 requires), it should be no problem.

Second, the files/logs/information you feed into Splunk all carry their own timestamps, which Splunk honors and uses when recording events. There are cases where Splunk will "fix" a timestamp that's too far out of range, but it never overwrites the original log data, so that's always available, and this behavior can be changed or overridden on any particular input. And timestamps can always be reported on, or even alerted on.
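As one sketch of that reporting (the 15-minute window and 300-second threshold are arbitrary choices for illustration): Splunk records both the event's own timestamp (`_time`) and the time it was indexed (`_indextime`), so you can alert when a source's clock drifts too far from the indexer's:

```
index=* earliest=-15m
| eval skew = abs(_indextime - _time)
| stats max(skew) AS max_skew_seconds BY host
| where max_skew_seconds > 300
```

Any host this returns is either time-sync broken or delivering logs late - either way, something you'd want to know about for a common time-source requirement.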

Does this help?
