I am successfully receiving events from AWS CloudWatch and CloudTrail for two different AWS accounts. However, some events that I can see in the AWS Console under CloudWatch are not appearing in Splunk. I have run a very simple search across all of our Splunk events for a string that appears in one of those CloudWatch logs, and it returns no results. All AWS inputs go into the index "pov_aws" (I have also checked all indexes just to be sure).
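For reference, the broad search I ran looks like the following (the quoted string is just a placeholder; I substituted actual text from the CloudWatch event):

```
index=* "some string from the CloudWatch event" earliest=-7d
```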
I have enabled all regions and services on the Splunk App for AWS Configuration page under CloudWatch Inputs. Similar events are occurring in both AWS accounts but are not reaching Splunk from either. Permissions in AWS are set as per the prerequisites page in the Splunk documentation.
We're using Splunk Cloud, so we don't have access to aws_cloudwatch_logs_tasks.conf, only what's accessible via the App screens, and those don't go down to the log group / stream level.
I'll check with our AWS guy tomorrow what groups/streams we have.
There is no obvious error in splunk_ta_aws_cloudwatch_logs_main.log.
I now have access to CloudWatch and can see the log message I'm interested in here:
CloudWatch > Log Groups > Streams for /aws/lambda/TestFunction
I assume I need to raise a support call with Splunk Customer Support to check the Splunk Cloud config and see whether anything is being filtered?
If your CloudWatch Logs data volume won't be too big, it's fine to use the AWS Add-on's CloudWatch Logs modular input to collect it. Otherwise, we recommend using a Kinesis stream and the Kinesis modular input: http://docs.splunk.com/Documentation/AddOns/released/AWS/CloudWatchLogs
You can use the AWS TA web UI to configure the CloudWatch Logs input at http://.../en-US/app/Splunk_TA_aws/inputs. The logGroup field can be a comma-separated list of log groups, such as /aws/lambda/s3table-chalice,/aws/lambda/ListBucketInfo,/aws/lambda/NoOp,/aws/lambda/GetBucketInfo,/aws/lambda/UpdateBucketInfo
If you are using a distributed Splunk Cloud environment with a heavy forwarder collecting the AWS data, then yes, please raise a ticket. Otherwise, on a single-instance Splunk Cloud deployment, you can configure the CloudWatch Logs input yourself via the web UI. Hope it helps.
Here are a few things to check:
Your CloudWatch log events are grouped by log group and stream. Are you getting events from every log group and stream? Or is it possible the logs you are missing are in a log group you haven't yet added to the app in Splunk?
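One quick way to spot gaps is a simple set difference between the log groups that exist in AWS and the ones configured in the app. This is just a sketch with made-up group names; in practice you'd paste in the lists from the AWS console and from your input configuration:

```python
# Hypothetical example: compare the log groups that exist in AWS
# against the ones configured in the Splunk App for AWS input.
aws_log_groups = {
    "/aws/lambda/TestFunction",
    "/aws/lambda/OtherFunction",
    "/aws/cloudtrail/main",
}
configured_log_groups = {
    "/aws/cloudtrail/main",
    "/aws/lambda/OtherFunction",
}

# Any group present in AWS but absent from the input configuration
# will never reach Splunk, regardless of permissions.
missing = sorted(aws_log_groups - configured_log_groups)
print(missing)  # ['/aws/lambda/TestFunction']
```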
Check your aws_cloudwatch_logs_tasks.conf and make sure you aren't filtering out the log stream that contains your missing events.
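For anyone with file-system access (not Splunk Cloud), a stanza in aws_cloudwatch_logs_tasks.conf looks roughly like the following. The stanza name, account, and group values here are made up, and field names can vary between add-on versions, so verify against the .conf.spec shipped with your version:

```
[my_cloudwatch_logs_input]
account = my_aws_account
region = us-east-1
groups = /aws/lambda/TestFunction
stream_matcher = .*
index = pov_aws
interval = 600
```

In particular, a stream_matcher regex that doesn't match the stream carrying your events will silently drop them.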
Also, check splunk_ta_aws_cloudwatch_logs_main.log (indexed in _internal) for any errors related to retrieving your CloudWatch log events.
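A search along these lines should surface them (the exact source path may differ slightly by platform and add-on version):

```
index=_internal source=*splunk_ta_aws_cloudwatch_logs_main.log* (ERROR OR WARN)
```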