
Does anyone have an updated Lambda function for handling the splunk-logging Lambda & CloudWatch logs?

pobrien
New Member

I have deployed a Lambda function from the "splunk logging" blueprint for collecting VPC Flow Logs and CloudWatch events.

It's working well; however, in addition to the VPC Flow Logs, I'm receiving thousands of CloudWatch events that are unreadable because the awslogs.data field arrives compressed and Base64-encoded.

Does anyone have an updated Lambda function for handling these?

1 Solution

amiracle
Splunk Employee

You will need to use the blueprints for CloudWatch Logs and not the generic Splunk logging function. There are two, one in Python and the other in Node.js; both take the data from CloudWatch Logs, decompress it, and send it into Splunk using HEC. Here is a blog that can help you. I've also put together some docs / workshops that can help as well.

Once you replace the function, the data should start to flow in normally.
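For reference, a minimal sketch of the decode step those blueprints perform (not the full blueprint; it assumes the standard CloudWatch Logs subscription payload, which arrives as Base64-encoded, gzip-compressed JSON):

const zlib = require('zlib');

exports.handler = (event, context, callback) => {
    // CloudWatch Logs subscriptions deliver event.awslogs.data as Base64(gzip(JSON))
    const compressed = Buffer.from(event.awslogs.data, 'base64');
    const decoded = JSON.parse(zlib.gunzipSync(compressed).toString('utf8'));

    // decoded.logEvents is an array of { id, timestamp, message } records
    decoded.logEvents.forEach((item) => {
        console.log(item.timestamp, item.message); // the blueprints forward these to HEC
    });
    callback(null, decoded.logEvents.length);
};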


jschroederevers
Explorer

If you are trying to test the splunk-cloudwatch-logs-processor Lambda function, the default test event will fail with "Cannot read property 'data' of undefined". We had the same problem when setting this up for the first time. We discovered that the Lambda function expects the JSON from CloudWatch, not the default test event. The CloudWatch JSON has a data field that the function decodes from Base64. Replacing the test event with the JSON below should work as a test.

{
  "awslogs": {
    "data": "QVdTIGxvZyBjb250ZW50"
  }
}
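One caveat: the stock splunk-cloudwatch-logs-processor also gunzips the payload after Base64-decoding it, so a plain Base64 string can still fail at the decompression step. A quick sketch for generating a test "data" value in the real format (all field values here are made-up placeholders):

const zlib = require('zlib');

// Minimal CloudWatch Logs subscription payload; values are placeholders
const sample = JSON.stringify({
    messageType: 'DATA_MESSAGE',
    logGroup: 'test-group',
    logStream: 'test-stream',
    logEvents: [{ id: '1', timestamp: Date.now(), message: 'hello from test' }],
});

// Prints a Base64(gzip(JSON)) string to paste into the test event's "data" field
console.log(zlib.gzipSync(Buffer.from(sample)).toString('base64'));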

 


sbombardier
Observer

Hi amiracle, you previously said in your post: "You will need to use the blueprints for Cloudwatch logs and not the generic Splunk logging function. There are two, one in python and the other in node.js;"
However, I cannot find the one in Python anywhere; only Node.js is offered for Lambda functions. Would it be possible to get the latest source code of this blueprint (Send CloudWatch logs to a Splunk host) in Python?

The reason for this is mainly that I need to integrate Systems Manager parameters and secrets asynchronously within the blueprint, which is an issue I can't figure out in Node.js. Hopefully I will be able to in Python.

Regards 
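(In case it helps while the Python version is tracked down, here is a minimal sketch of fetching a Systems Manager parameter asynchronously in Node.js, assuming the AWS SDK v2 available in the Lambda runtime; the parameter name '/splunk/hec-token' is hypothetical:)

const AWS = require('aws-sdk');
const ssm = new AWS.SSM();

let cachedToken; // survives across warm invocations

exports.handler = async (event) => {
    if (!cachedToken) {
        // '/splunk/hec-token' is a hypothetical SecureString parameter name
        const result = await ssm.getParameter({
            Name: '/splunk/hec-token',
            WithDecryption: true,
        }).promise();
        cachedToken = result.Parameter.Value;
    }
    // ... configure the HEC logger with cachedToken, then process the event ...
};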



joshva0894
Observer

Hello Guys,

I'm using the Lambda Python blueprint for processing CloudWatch logs (not VPC Flow Logs).

My Kinesis Data Stream has 2 shards.

The problem I'm facing is that the Lambda isn't processing all of the logs; instead, I'm getting errors like ProvisionedThroughputExceededException.

We are receiving logs from 15+ AWS accounts into the Kinesis stream.

I need a dependable way to overcome this issue.
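(ProvisionedThroughputExceededException generally means the stream's per-shard limits are being exceeded; each shard allows roughly 1 MB/s of writes and 2 MB/s of reads, so two shards can easily be undersized for 15+ accounts. A sketch of raising the shard count with the AWS SDK v2; the stream name 'my-stream' and target count are placeholders for your setup:)

const AWS = require('aws-sdk');
const kinesis = new AWS.Kinesis();

// Doubling from 2 to 4 shards; 'my-stream' is a placeholder stream name
kinesis.updateShardCount({
    StreamName: 'my-stream',
    TargetShardCount: 4,
    ScalingType: 'UNIFORM_SCALING', // the only scaling type the API supports
}).promise()
    .then((data) => console.log('Resharding started:', data))
    .catch((err) => console.error(err));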



radhas58
New Member

This blueprint shows an error for me. I verified with a curl command that the token and URL are correct, and they're fine.
The error is as below:

START RequestId: 831062c6-f1cf-427f-b115-fd3bd7c07g93 Version: $LATEST

2019-04-03T16:41:01.704Z 831062c6-f1cf-427f-b115-fd3bd7c07t73 Received event: { "version": "0", "id": "b3cb20eb-f86e-2952-3ad1-a86b1e9e0ft9", "detail-type": "Scheduled Event", "source": "aws.events", "account": "xxxxxxxxxx", "time": "2019-04-03T16:37:27Z", "region": "us-east-1", "resources": [ "arn:aws:events:us-east-1:xxxxxxxx:rule/testsplunk" ], "detail": {} }

2019-04-03T16:41:01.705Z 831062c6-f1cf-427f-b115-fd3bd7c07t73 TypeError: Cannot read property 'data' of undefined at exports.handler (/var/task/index.js:34:45)


pashfw
Explorer

Obviously the payload doesn't contain the "data" field the script is expecting.

Did anybody find a better script since then?


pashfw
Explorer

index.js:31

// CloudWatch Logs data is base64 encoded so decode here
const payload = Buffer.from(event.awslogs.data, 'base64');

It doesn't actually look encoded.


pashfw
Explorer

So the blueprint's code, written in Node.js, makes some assumptions and won't work without modification:

First, in the given event there is no Base64 encoding.
Second, there is no gzip compression.
Also, there are no multiple events (the CloudWatch trigger fires once per event).

Long story short, try the "/services/collector/event" HEC endpoint and the simplified code below in index.js (reading and understanding it first is strongly recommended).

const loggerConfig = {
    url: process.env.SPLUNK_HEC_URL,
    token: process.env.SPLUNK_HEC_TOKEN,
};

const SplunkLogger = require('./lib/mysplunklogger');

const logger = new SplunkLogger(loggerConfig);

exports.handler = (event, context, callback) => {
    console.log('Received event:', JSON.stringify(event, null, 2));

    // Deep-copy the incoming CloudWatch Events payload
    const parsed = JSON.parse(JSON.stringify(event));
    const count = 1; // the CloudWatch trigger delivers a single event per invocation

    /* Log the event to Splunk with any combination of explicit timestamp,
       index, source, sourcetype, and host. Complete list of input settings:
       http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTinput#services.2Fcollector */
    logger.logEvent({
        time: new Date(parsed.time).getTime() / 1000,
        host: parsed.source, // e.g. 'aws.events'; change to 'serverless' or a real host if preferred
        source: `lambda:${context.functionName}`,
        sourcetype: 'aws:cloudwatchlogs:yoursourcetype',
        // index: 'main',
        event: parsed.detail,
    });

    // Send all the events in a single batch to Splunk
    logger.flushAsync((error, response) => {
        if (error) {
            callback(error);
        } else {
            console.log(`Response from Splunk:\n${response}`);
            console.log(`Successfully processed ${count} log event(s).`);
            callback(null, count); // return the number of log events
        }
    });
};
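For reference, this simplified handler expects the raw CloudWatch Events envelope, the same shape shown in radhas58's log above (values copied from that log, with the account masked as in the original):

{
  "version": "0",
  "id": "b3cb20eb-f86e-2952-3ad1-a86b1e9e0ft9",
  "detail-type": "Scheduled Event",
  "source": "aws.events",
  "account": "xxxxxxxxxx",
  "time": "2019-04-03T16:37:27Z",
  "region": "us-east-1",
  "resources": ["arn:aws:events:us-east-1:xxxxxxxx:rule/testsplunk"],
  "detail": {}
}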

bernardoortega
Path Finder

Hello, I am getting the same error. I just added the blueprint for CloudWatch, and when running a test I get:

TypeError: Cannot read property 'data' of undefined
at exports.handler (/var/task/index.js:34:45)

Any help with that?
