Archive

splunk-logging lambda & Cloudwatch logs

New Member

I have deployed a Lambda function from the "splunk logging" blueprint for collecting VPC Flow logs and Cloudwatch events.

It's working well; however, in addition to the VPC Flow Logs, I'm receiving thousands of CloudWatch events that are unreadable because awslogs.data arrives compressed and Base64-encoded.

Does anyone have an updated Lambda function for handling these?

0 Karma
1 Solution

Splunk Employee

You will need to use the blueprints for CloudWatch Logs, not the generic Splunk logging function. There are two, one in Python and the other in Node.js; both take the data from CloudWatch Logs, decompress it, and send it to Splunk via HEC. Here is a blog that can help you. I've also put together some docs/workshops that can help as well.

Once you replace the function, the data should start to flow in normally.
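As a rough illustration of what those blueprints do (this is a minimal sketch, not the official blueprint code; HEC_URL and HEC_TOKEN are placeholders you must supply), the handler base64-decodes and gunzips awslogs.data, then posts each log event to the HTTP Event Collector:

```python
# Sketch of the CloudWatch Logs -> Splunk HEC flow: decode the
# compressed awslogs.data payload, then forward each log event.
import base64
import gzip
import json
import urllib.request

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder

def decode_awslogs(event):
    """Base64-decode and gunzip awslogs.data into a dict."""
    compressed = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(compressed))

def handler(event, context):
    payload = decode_awslogs(event)
    for log_event in payload.get("logEvents", []):
        body = json.dumps({"event": log_event["message"]}).encode("utf-8")
        req = urllib.request.Request(
            HEC_URL,
            data=body,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        )
        urllib.request.urlopen(req)  # error handling omitted for brevity
```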


If you are trying to test the splunk-cloudwatch-logs-processor Lambda function, the default test event will fail with "Cannot read property 'data' of undefined". We hit the same problem when setting this up for the first time. The function expects the JSON that CloudWatch Logs delivers, not the default test event: that JSON has a data field which the function decodes from Base64. Replacing the test event with the JSON below should work as a test.

{
  "awslogs": {
    "data": "QVdTIGxvZyBjb250ZW50"
  }
}
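One caveat: CloudWatch Logs subscription deliveries gzip the payload before Base64-encoding it, so if the blueprint also gunzips data, a plain Base64 string may still fail at the decompression step. A hedged helper for building a fully valid test event (the logGroup/logStream values below are made-up samples):

```python
# Build a CloudWatch Logs-style test event whose "data" field is
# gzip-compressed JSON, Base64-encoded -- the format subscription
# deliveries use.
import base64
import gzip
import json

def make_test_event(messages):
    payload = {
        "messageType": "DATA_MESSAGE",
        "logGroup": "/aws/lambda/example",            # sample value
        "logStream": "2019/04/03/[$LATEST]abc123",    # sample value
        "logEvents": [
            {"id": str(i), "timestamp": 0, "message": m}
            for i, m in enumerate(messages)
        ],
    }
    blob = gzip.compress(json.dumps(payload).encode("utf-8"))
    return {"awslogs": {"data": base64.b64encode(blob).decode("ascii")}}

print(json.dumps(make_test_event(["AWS log content"]), indent=2))
```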


Observer

Hello Guys,

I'm using the Lambda Python blueprint for processing CloudWatch Logs (not VPC Flow Logs).

My Kinesis data stream has 2 shards.

The problem I'm facing is that the Lambda isn't processing all of the logs; instead, I'm getting errors such as ProvisionedThroughputExceededException.

We are receiving logs from 15+ AWS accounts into the Kinesis stream.

I need a solution that resolves this issue once and for all.
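For what it's worth, each Kinesis shard accepts at most 1 MiB/s and 1,000 records/s of writes, and exceeding either limit raises ProvisionedThroughputExceededException, so 2 shards may simply be too few for 15+ accounts. A back-of-the-envelope sizing sketch (the traffic numbers are assumptions; measure your own with the stream's IncomingBytes/IncomingRecords CloudWatch metrics):

```python
# Estimate how many shards a stream needs from its aggregate write
# throughput, given the per-shard limits of 1 MiB/s and 1,000 records/s.
import math

def shards_needed(bytes_per_sec, records_per_sec):
    by_bytes = math.ceil(bytes_per_sec / (1024 * 1024))
    by_records = math.ceil(records_per_sec / 1000)
    return max(by_bytes, by_records, 1)

# e.g. 15 accounts each sending ~0.5 MiB/s and ~400 records/s (assumed):
print(shards_needed(15 * 0.5 * 1024 * 1024, 15 * 400))  # → 8
```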

0 Karma

Splunk Employee
0 Karma

New Member

This blueprint shows an error for me. I verified with a curl command that the token and URL are correct.
The error is below:

START RequestId: 831062c6-f1cf-427f-b115-fd3bd7c07g93 Version: $LATEST

2019-04-03T16:41:01.704Z 831062c6-f1cf-427f-b115-fd3bd7c07t73 Received event: { "version": "0", "id": "b3cb20eb-f86e-2952-3ad1-a86b1e9e0ft9", "detail-type": "Scheduled Event", "source": "aws.events", "account": "xxxxxxxxxx", "time": "2019-04-03T16:37:27Z", "region": "us-east-1", "resources": [ "arn:aws:events:us-east-1:xxxxxxxx:rule/testsplunk" ], "detail": {} }

2019-04-03T16:41:01.705Z 831062c6-f1cf-427f-b115-fd3bd7c07t73 TypeError: Cannot read property 'data' of undefined at exports.handler (/var/task/index.js:34:45)
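The log above shows the function was invoked by a scheduled CloudWatch Events rule ("detail-type": "Scheduled Event"), whose payload carries no awslogs key, so event.awslogs.data is undefined at index.js:34. One option is to guard against such triggers; a sketch of that check (shown here in Python, since the Node.js blueprint would need the equivalent check in JavaScript):

```python
# Skip invocations whose event is not a CloudWatch Logs delivery.
def handler(event, context):
    if "awslogs" not in event:
        # Scheduled Events, S3 notifications, etc. carry no log data.
        print("Ignoring event without awslogs payload:",
              event.get("detail-type", "unknown"))
        return "skipped"
    # ...decode event["awslogs"]["data"] and forward to HEC as usual...
    return "processed"
```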

0 Karma

Explorer

Hello, I am getting the same error. I just added the blueprint for CloudWatch, and when running a test I get:

TypeError: Cannot read property 'data' of undefined
at exports.handler (/var/task/index.js:34:45)

Any help on that?

0 Karma