
Why does the indexer sometimes not classify AWS GuardDuty data with the proper sourcetype?

emcbolton
New Member

We are using the HTTP Event Collector (HEC) to ingest AWS GuardDuty CloudWatch Events via Amazon Kinesis Data Firehose. I have worked with a Splunk SE to get this working in two environments: one (on a dev license) is a single-instance deployment, and the other is distributed, with the HEC instance separate from the indexer.

In both environments, the indexer does not always classify the incoming Firehose data with the aws:cloudwatch:guardduty sourcetype, and the unclassified data never shows up in the Splunk dashboards for GuardDuty. What I mean is that the data sometimes arrives with clean Splunk event formatting (plus signs and all), and other times arrives as multiple GuardDuty events concatenated into one long text blob with no event breaks.
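When the blob form shows up, it looks like multiple JSON objects run back to back (}{ with nothing between them). I have been wondering whether something like the props.conf below, on the indexer (or on the HEC instance in the distributed environment), would force Splunk to break them apart. This is only a guess on my part, not anything taken from the Splunk Add-on for AWS, and I believe it would only take effect if Firehose is pointed at the raw HEC endpoint rather than the event endpoint; the stanza name also assumes the HEC token is already assigning the aws:cloudwatch:guardduty sourcetype:

[aws:cloudwatch:guardduty]
# Break between back-to-back JSON objects; the capture group (any whitespace
# between the closing and opening braces) is discarded and the braces are kept.
LINE_BREAKER = \}([\r\n\s]*)\{
SHOULD_LINEMERGE = false
# GuardDuty findings can be large; 0 disables truncation.
TRUNCATE = 0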

Is there a way to fix this, alert on this issue from Splunk, or have Splunk re-read the data?
Does this likely mean I need a Lambda function to break these into separate events before they are sent to Firehose (maybe it's a CloudWatch-to-Firehose issue)?
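For the alerting piece, I was thinking a scheduled search for GuardDuty-looking events that did not get the expected sourcetype might be enough, something along the lines of (the index name is just a placeholder for ours):

index=aws_guardduty "GuardDuty Finding" sourcetype!=aws:cloudwatch:guardduty

And for the Lambda question, here is a rough, untested sketch of what I imagine a Firehose data transformation function could look like. The recordId/data/result fields are the standard Firehose transformation interface; the splitting logic, function names, and the idea of re-emitting each record as newline-delimited JSON are my own guesses:

import base64
import json

def split_concatenated_json(blob):
    """Split a string containing one or more back-to-back JSON objects."""
    decoder = json.JSONDecoder()
    events = []
    idx = 0
    blob = blob.strip()
    while idx < len(blob):
        obj, end = decoder.raw_decode(blob, idx)
        events.append(obj)
        # Skip any whitespace between objects before the next decode.
        while end < len(blob) and blob[end].isspace():
            end += 1
        idx = end
    return events

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        raw = base64.b64decode(record["data"]).decode("utf-8")
        try:
            findings = split_concatenated_json(raw)
            # Re-emit the record as newline-delimited JSON so Splunk can
            # line-break it into individual GuardDuty events.
            payload = "\n".join(json.dumps(f) for f in findings) + "\n"
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(payload.encode("utf-8")).decode("utf-8"),
            })
        except ValueError:
            # Not parseable as JSON; let Firehose route it to the backup bucket.
            output.append({
                "recordId": record["recordId"],
                "result": "ProcessingFailed",
                "data": record["data"],
            })
    return {"records": output}

Does that seem like a reasonable direction, or is there a cleaner fix on the Splunk side?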
