Send data to a dashboard with AWS Lambda

  • Open a new AWS tab. It is time to configure a Lambda function that will consume data from the Kinesis Data Analytics stream and turn it into custom metrics published to a CloudWatch dashboard.

  • In the AWS Management Console, select Lambda or use this quick link.

  • Click Create function

  • Choose Author from scratch, which should be selected by default.

  • Give your function a name, for example peculiar-cloudwatch-metrics, and set the Runtime to Python 3.8.

  • Click Create function

  • Copy and paste the following code into the body of your Lambda function (the Python file can also be found in the GitHub repository):

import json
import boto3
import base64
import datetime

# CloudWatch client used to publish the custom metrics
cloudwatch = boto3.client('cloudwatch')

def lambda_handler(event, context):

    output = []

    # Kinesis Data Analytics delivers each output row as a base64-encoded record
    for record in event['records']:

        bytesArray = base64.b64decode(record['data'])
        my_json = bytesArray.decode('utf8').replace("'", '"')
        data = json.loads(my_json)

        # Fields produced by the filtered in-application stream. Only the
        # numeric fields are published below, since CloudWatch metric values
        # must be numbers.
        event_id = data["event_id"]
        event_type = data["event_type"]
        event_name = data["event_name"]
        event_timestamp = data["event_timestamp"]
        event_version = data["event_version"]
        item_id = data["item_id"]
        item_name = data["item_name"]
        item_amount = data["item_amount"]
        real_value = data["real_value"]
        virtual_value = data["virtual_value"]
        currency_type = data["currency_type"]
        country_id = data["country_id"]
        platform = data["platform"]

        # Use one timestamp per record so all of its metrics line up
        timestamp = datetime.datetime.now()

        # Publish the numeric fields as high-resolution (1-second) custom metrics
        response = cloudwatch.put_metric_data(
            MetricData=[
                {
                    'MetricName': 'event_id',
                    'Timestamp': timestamp,
                    'Value': event_id,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'event_type',
                    'Timestamp': timestamp,
                    'Value': event_type,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'event_timestamp',
                    'Timestamp': timestamp,
                    'Value': event_timestamp,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'event_version',
                    'Timestamp': timestamp,
                    'Value': event_version,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'item_id',
                    'Timestamp': timestamp,
                    'Value': item_id,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'item_amount',
                    'Timestamp': timestamp,
                    'Value': item_amount,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'real_value',
                    'Timestamp': timestamp,
                    'Value': real_value,
                    'StorageResolution': 1
                },
                {
                    'MetricName': 'virtual_value',
                    'Timestamp': timestamp,
                    'Value': virtual_value,
                    'StorageResolution': 1
                }
            ],
            Namespace='peculiar-wizards-data'
        )

        # Tell Kinesis Data Analytics the record was delivered successfully,
        # otherwise it will retry the whole batch
        output.append({'recordId': record['recordId'], 'result': 'Ok'})

    return {'records': output}

This code takes the filtered data sent from the Kinesis Data Analytics stream and publishes it to CloudWatch as custom metrics in the peculiar-wizards-data namespace, using the CloudWatch client from the Boto3 SDK for Python.
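If you want to sanity-check the handler before connecting it to the stream, you can invoke it with a test event shaped like the payload Kinesis Data Analytics delivers: a records list of base64-encoded rows. The sketch below builds such an event; every field value in it is a made-up placeholder, not real output from the Kinesis Data Generator.

import base64
import json

# Fabricated row with the same fields the handler expects; values are placeholders
sample_row = {
    "event_id": 1, "event_type": 2, "event_name": "purchase",
    "event_timestamp": 1614816000, "event_version": 1,
    "item_id": 10, "item_name": "wand", "item_amount": 3,
    "real_value": 4.99, "virtual_value": 250,
    "currency_type": "USD", "country_id": 840, "platform": "iOS"
}

# Mimic the event shape Kinesis Data Analytics sends to a Lambda destination
test_event = {
    "records": [
        {
            "recordId": "test-record-1",
            "data": base64.b64encode(json.dumps(sample_row).encode("utf8")).decode("utf8")
        }
    ]
}

# Paste the printed JSON into a Lambda console test event
print(json.dumps(test_event))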

  • The last step to configure Lambda is making sure it has the appropriate permissions to publish metrics to CloudWatch. At the top, choose Permissions and click the IAM Execution Role name. If you named your Lambda function peculiar-cloudwatch-metrics, the role name should be similar to peculiar-cloudwatch-metrics-role-XXXXXXXX. This will take you to the IAM management console where you can edit role permissions.

  • In the IAM management console, choose Attach policies, select the CloudWatchFullAccess policy, and then choose Attach policy.
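If you prefer to script this step, the same managed policy can be attached with Boto3. This is a minimal sketch; the role name is a placeholder for whatever IAM generated for your function.

import boto3

iam = boto3.client('iam')

# Placeholder: use the execution role IAM created for your Lambda function
role_name = 'peculiar-cloudwatch-metrics-role-XXXXXXXX'

# Attach the AWS managed CloudWatchFullAccess policy to the execution role
iam.attach_role_policy(
    RoleName=role_name,
    PolicyArn='arn:aws:iam::aws:policy/CloudWatchFullAccess'
)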

  • Go back to your Lambda function and on the Configuration tab, Edit basic settings to increase the timeout to 3 minutes.
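The timeout can also be raised from a script rather than the console. A minimal sketch, assuming the function name used earlier in this section:

import boto3

lambda_client = boto3.client('lambda')

# Raise the function timeout to 3 minutes (180 seconds)
lambda_client.update_function_configuration(
    FunctionName='peculiar-cloudwatch-metrics',
    Timeout=180
)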

  • Finally, Deploy the Lambda function.

  • Now go back to the open tab with your Kinesis Data Analytics stream to connect a destination. Choose Connect to a destination and select AWS Lambda function as the destination.

  • Select the Lambda function you just created.

  • For In-application stream, select Choose an existing in-application stream and choose data_stream, the stream created by the continuous filter SQL query.

  • Leave the rest of the configuration as the default and click Save and continue. It might take a couple of minutes to save and connect your Lambda function as the destination for the stream. When it finishes, the Kinesis Data Analytics application should show the Lambda function as its destination; a scripted equivalent of this wiring is sketched below.
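The console handles this wiring for you; for reference, the roughly equivalent call on the Kinesis Data Analytics (SQL) API is sketched here. The application name, version ID, and ARNs are placeholders you would need to replace with your own values.

import boto3

kda = boto3.client('kinesisanalytics')

# Placeholders: substitute your application name, current version ID, and ARNs
kda.add_application_output(
    ApplicationName='peculiar-wizards-analytics',
    CurrentApplicationVersionId=1,
    Output={
        'Name': 'data_stream',  # the in-application stream from the filter query
        'LambdaOutput': {
            'ResourceARN': 'arn:aws:lambda:us-east-1:123456789012:function:peculiar-cloudwatch-metrics',
            'RoleARN': 'arn:aws:iam::123456789012:role/kinesis-analytics-output-role'
        },
        'DestinationSchema': {'RecordFormatType': 'JSON'}
    }
)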

Check to make sure data is still being generated by the Kinesis Data Generator and is still being filtered by the Kinesis Data Analytics application.
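As a quick check that the metrics are actually arriving, you can list what CloudWatch has received in the custom namespace. A minimal sketch, assuming the peculiar-wizards-data namespace used in the Lambda code; note that newly published custom metrics can take a few minutes to show up here.

import boto3

cloudwatch = boto3.client('cloudwatch')

# List the custom metrics published by the Lambda function
response = cloudwatch.list_metrics(Namespace='peculiar-wizards-data')

for metric in response['Metrics']:
    print(metric['MetricName'])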