
How can I categorize alerts for metric tracking?

digital_alchemy
Path Finder

So, we've built several alerts based on the MITRE ATT&CK Framework and have them set to send an email when a search has a hit.

Naturally, the next step is that management wants us to tag those alerts with the MITRE technique each one fired on, so we can build a dashboard and provide metrics such as how many ATT&CK-based alerts we've had per month.

We're kinda stuck as to how to enrich the alerts with this type of categorization/tagging within Splunk.

Any suggestions on a method or add on that may help with this?


strangelaw
Explorer

Actually,

what you might need in the end is a bit more complicated than just a taxonomy based on MITRE. Here's my thinking process:

  • You obviously need the capability to add a taxonomy to the alerts raised by a given event-search use case (Alert Manager is sufficient for this).
  • Then you need information about the actual event source, i.e. what caused the alert, so that you can align the control or detection mechanism with the taxonomy (take some Privilege Escalation technique under MITRE's model as an example).
  • Then you need context, e.g. the server - application - service chain, including possible integrations. What I mean by this is importing or using data from a CMDB that contains that information. Otherwise you just have a pile of information tied directly to the particular application or server where the event was sourced (purely an example of how this should work).
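The taxonomy point in the first bullet is commonly handled with a lookup. As a rough sketch (the lookup name `mitre_alert_map`, its fields, and the sample rows are my own assumptions, not something that ships with Splunk or Alert Manager), you maintain a CSV mapping each saved-search/alert name to its ATT&CK tactic and technique:

```csv
alert_name,mitre_tactic,mitre_technique
"Suspicious sudo to root","Privilege Escalation","T1548"
"New scheduled task created","Persistence","T1053"
```

and then apply it at search time to enrich fired alerts, e.g. from the audit index (the `ss_name` field name may differ in your environment, so verify it first):

```spl
index=_audit action=alert_fired
| lookup mitre_alert_map alert_name AS ss_name OUTPUT mitre_tactic mitre_technique
| stats count BY mitre_tactic mitre_technique
```

A new alert then only needs a row in the lookup to show up in the metrics.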

Based on these factors you should be able to align the detection method with the attack phase and follow it upstream for management reporting.

So yea... in the end you need a couple of other factors to satisfy management's needs. However, this gives great information on how well your detection capabilities are performing and which threat types you have detection in place for. Further thinking may open up possibilities to adjust detection datapoints too.
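For the monthly reporting piece of this, once each fired alert can be mapped to a tactic, a single timechart answers the "how many ATT&CK-based alerts per month" question. A hedged sketch, assuming you maintain a lookup (here called `mitre_alert_map`) keyed on the alert name; both the lookup and the `ss_name` field are assumptions to verify in your environment:

```spl
index=_audit action=alert_fired ss_name=*
| lookup mitre_alert_map alert_name AS ss_name OUTPUT mitre_tactic
| timechart span=1mon count BY mitre_tactic
```

Dropped into a dashboard panel, this gives the per-month, per-tactic trend management is asking for.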


woodcock
Esteemed Legend

Start with the Alert Manager app. You may decide not to use it, but by deconstructing it you will see where and how you can access every possible detail about alerts that have been fired:

https://splunkbase.splunk.com/app/2665/
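As a starting point for that deconstruction: fired alerts are recorded in the `_audit` index, which is where you can see the raw detail per firing. A minimal sketch (field availability such as `severity` and `trigger_time` varies by Splunk version, so treat those field names as assumptions to confirm in your own `_audit` data):

```spl
index=_audit action=alert_fired
| table _time ss_name sid severity trigger_time
| sort - _time
```

From there you can join in whatever taxonomy or CMDB context you need for reporting.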

Anam
Community Manager

Hi @digital_alchemy

My name is Anam Siddique and I am the Community Content Specialist for Splunk Answers. Please accept the answer if the solution provided by @woodcock worked for you, and if it didn't, please update the thread with further comments so someone can help you. We have awesome users who contribute, and it would be great if the community could benefit from their answers; plus, they can get credit/points for their work!

Thanks
