I successfully forward my SES notifications to Splunk via the following process:
1. Create an SNS topic, e.g. "ses-notifications-topic".
2. Create an SQS queue, e.g. "ses-notifications-queue".
3. From the SQS queue's right-click menu, choose "Configure Queue" and set Default Visibility Timeout = 5 min (the Splunk-recommended value).
4. Again from the right-click menu, choose "Add subscription" and select the SNS topic.
5. In the IAM policy for my Splunk service account, grant sqs:List*, sqs:Get*, and sqs:DeleteMessage permissions, with that particular SQS queue as the Resource.
6. In the Splunk Add-on for AWS, create a new input of type "Custom / SQS", point it at the SQS queue, and set sourcetype=aws:ses.
7. Finally, back in the SES console, update the notifications to publish to the new SNS topic.
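For reference, the IAM statement from step 5 looks roughly like the sketch below. The queue ARN, account ID, and region are placeholders, and I've spelled out the wildcards; note that sqs:ReceiveMessage is also needed for the add-on to actually read messages, since it isn't covered by List*/Get*:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ListQueues",
        "sqs:GetQueueUrl",
        "sqs:GetQueueAttributes",
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage"
      ],
      "Resource": "arn:aws:sqs:us-east-1:123456789012:ses-notifications-queue"
    }
  ]
}
```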
The problem I'm having is parsing the events correctly: the message details are not being extracted even though they're in JSON. It's probably a simple setting I'm missing. (I also found a third-party "Splunk SES App", but it doesn't seem to work.)
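One likely cause: SNS wraps the SES notification in its own envelope, and the actual SES details arrive as a JSON-escaped *string* inside the envelope's "Message" field, so a single pass of automatic JSON extraction stops at the envelope. A minimal Python sketch of the shape of the payload (the field values are made up):

```python
import json

# An SNS envelope as delivered to SQS: the SES notification is a
# JSON string inside the "Message" field, not a nested JSON object.
sns_envelope = json.dumps({
    "Type": "Notification",
    "TopicArn": "arn:aws:sns:us-east-1:123456789012:ses-notifications-topic",
    "Message": json.dumps({  # note the second layer of encoding
        "notificationType": "Bounce",
        "bounce": {"bounceType": "Permanent"},
    }),
})

# One json.loads() only yields the envelope; "Message" is still a string.
envelope = json.loads(sns_envelope)
assert isinstance(envelope["Message"], str)

# A second parse is needed to reach the SES fields.
ses_event = json.loads(envelope["Message"])
print(ses_event["notificationType"])  # -> Bounce
```

If that's what's happening here, the inner fields need a second extraction pass in Splunk, e.g. `| spath input=Message` at search time.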
P.S. I have the optional SES notification setting "Include original headers" enabled, but I don't think that's the issue.
The only methods AWS lists for publishing the data are "Amazon CloudWatch or Amazon Kinesis Data Firehose, or by Amazon SNS notification".
You can't get around AWS's rules for publishing the data. Or maybe I don't understand your question?