Debug item-level modifications in DynamoDB with EventBridge Pipes and CloudWatch

2022-12-19 | #AWS #Serverless

Introduction

Have you ever wondered what happens in your DynamoDB table on the item level? Ever wanted to keep a trail in CloudWatch of everything that happens for debugging or audit purposes?

You can use DynamoDB streams to capture item-level modifications made to a table. Previously, getting stream records into CloudWatch meant writing a Lambda function with custom glue code. With the newly released EventBridge Pipes, this has become even easier.

Read on to learn how to set up an EventBridge pipe between a DynamoDB stream and a CloudWatch log group.

Tutorial

This tutorial shows how to set everything up through the AWS console. After most steps you will also find a rough boto3 sketch, in case you prefer to script the same thing.

1. Create a DynamoDB table

Head over to the DynamoDB console and click on Create table. The table configuration (name and keys) does not matter for the sake of this tutorial.

  1. Specify a Table name.
  2. Specify a name for the Partition key.
  3. Specify a name for the Sort key.
  4. Click on Create table.
Create dialog for a DynamoDB table
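
If you would rather script this step, a minimal boto3 sketch looks like the one below. The table name (pipe-demo) and key names (pk, sk) are placeholders; use whatever you chose in the console.

import boto3

dynamodb = boto3.client("dynamodb")

# Hypothetical table and key names; adjust to match your own.
dynamodb.create_table(
    TableName="pipe-demo",
    AttributeDefinitions=[
        {"AttributeName": "pk", "AttributeType": "S"},
        {"AttributeName": "sk", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "pk", "KeyType": "HASH"},   # partition key
        {"AttributeName": "sk", "KeyType": "RANGE"},  # sort key
    ],
    BillingMode="PAY_PER_REQUEST",
)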

2. Enable DynamoDB stream

On the left-hand side of the DynamoDB console, click on Tables > Update settings.

  1. Select your new table.
  2. Switch to the Exports and streams tab.
  3. Under DynamoDB stream details, click the Enable button.
Enable DynamoDB stream
  4. Select New and old images.
  5. Finally, click on Enable stream.
DynamoDB stream configuration
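
The same setting can be applied from code. A minimal boto3 sketch, reusing the hypothetical pipe-demo table name from the previous step:

import boto3

dynamodb = boto3.client("dynamodb")

# Enable the stream with both new and old images, matching the console choice.
response = dynamodb.update_table(
    TableName="pipe-demo",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)

# The stream ARN is needed if you also want to create the pipe from code.
print(response["TableDescription"]["LatestStreamArn"])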

3. Create an EventBridge pipe

Now that you have a table, go to the EventBridge console to create a pipe.

  1. Go to the Pipes console.
  2. Click on Create pipe.
EventBridge pipes console
  3. Give your pipe a name.
  4. Click on Source to open the source configuration.
  5. Select DynamoDB in the Source dropdown.
  6. Select the stream of your table in the DynamoDB stream dropdown.
  7. Click on Target to open the target configuration.
Set EventBridge pipe name and source
  8. Select CloudWatch log in the Target service dropdown.
  9. Pick a name for your Log Group.
  10. Finally, click on Create pipe.
Set EventBridge pipe target
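
One thing worth noting: the console creates an IAM role for the pipe behind the scenes. If you create the pipe from code instead, you must supply a role that can read from the stream and write to the log group, and the log group must already exist. A rough boto3 sketch with placeholder ARNs and names:

import boto3

pipes = boto3.client("pipes")

# All ARNs below are placeholders; substitute your stream, log group, and role.
pipes.create_pipe(
    Name="ddb-to-cloudwatch",
    RoleArn="arn:aws:iam::123456789012:role/my-pipe-role",
    Source="arn:aws:dynamodb:eu-west-1:123456789012:table/pipe-demo/stream/2022-12-19T00:00:00.000",
    SourceParameters={
        "DynamoDBStreamParameters": {
            "StartingPosition": "LATEST",
        },
    },
    Target="arn:aws:logs:eu-west-1:123456789012:log-group:/aws/pipes/ddb-to-cloudwatch",
)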

4. Make some item-level changes to the table

Add, update, and remove a few items in your DynamoDB table. You can do this from the DynamoDB console.
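
If you prefer to generate this traffic from code, a short boto3 sketch (again using the hypothetical pipe-demo table with pk/sk keys) could be:

import boto3

dynamodb = boto3.client("dynamodb")

# Insert an item; this shows up in the stream as an INSERT event.
dynamodb.put_item(
    TableName="pipe-demo",
    Item={"pk": {"S": "pk1"}, "sk": {"S": "sk1"}, "status": {"S": "new"}},
)

# Modify it; this produces a MODIFY event.
dynamodb.update_item(
    TableName="pipe-demo",
    Key={"pk": {"S": "pk1"}, "sk": {"S": "sk1"}},
    UpdateExpression="SET #s = :s",
    ExpressionAttributeNames={"#s": "status"},
    ExpressionAttributeValues={":s": {"S": "updated"}},
)

# Delete it; this produces a REMOVE event.
dynamodb.delete_item(
    TableName="pipe-demo",
    Key={"pk": {"S": "pk1"}, "sk": {"S": "sk1"}},
)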

5. Head over to CloudWatch

Head over to CloudWatch to see your pipe's work in action.

  1. On the left-hand panel, click on Log groups.
  2. Find your Log group in the list and click on it.
CloudWatch Log groups
  3. In the log group, you should find a stream with the name of your pipe. Click on it.
CloudWatch Log streams

You should now find log entries for all item-level modifications to your table. Below you can see an event’s structure when you insert an item into your table.

CloudWatch logs for all DynamoDB stream events
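
As a rough illustration, an INSERT record delivered by the pipe has roughly the shape below. The field names follow the DynamoDB Streams record format, but the values here are made up, not copied from a real stream:

# Approximate shape of a logged INSERT record (illustrative values only).
insert_event = {
    "eventID": "…",
    "eventName": "INSERT",
    "eventSource": "aws:dynamodb",
    "awsRegion": "eu-west-1",
    "dynamodb": {
        "ApproximateCreationDateTime": 1671408000,
        "Keys": {"pk": {"S": "pk1"}, "sk": {"S": "sk1"}},
        "NewImage": {"pk": {"S": "pk1"}, "sk": {"S": "sk1"}, "status": {"S": "new"}},
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
    "eventSourceARN": "arn:aws:dynamodb:…:table/pipe-demo/stream/…",
}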

The logs are structured, which makes it easy to query and filter log records with Logs Insights in the CloudWatch console.

You can use the following query to find all deleted items:

fields @timestamp, @message
| sort @timestamp desc
| filter eventName = "REMOVE"

Or, if you are interested in events for a specific key:

fields @timestamp, @message
| sort @timestamp desc
| filter dynamodb.Keys.pk.S = "pk1"
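
The same queries can also be run programmatically through the CloudWatch Logs API. A minimal boto3 sketch (the log group name is a placeholder) might look like:

import time
import boto3

logs = boto3.client("logs")

# Start a Logs Insights query over the last hour for REMOVE events.
query = logs.start_query(
    logGroupName="/aws/pipes/ddb-to-cloudwatch",
    startTime=int(time.time()) - 3600,
    endTime=int(time.time()),
    queryString=(
        "fields @timestamp, @message "
        "| sort @timestamp desc "
        '| filter eventName = "REMOVE"'
    ),
)

# Poll until the query finishes, then print the matching records.
while True:
    result = logs.get_query_results(queryId=query["queryId"])
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)

print(result["results"])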

Summary

This post was a short but, hopefully, useful one. Using pipes to capture all item-level modifications made to a DynamoDB table in CloudWatch can help you debug your solutions. It can also be used for auditing when you require a log of all item-level changes made to a table. Best of all, it requires zero modifications to the application that writes to the table.


About the author

I'm Elias Brange, a Cloud Consultant and AWS Community Builder in the Serverless category. I'm on a mission to drive Serverless adoption and help others on their Serverless AWS journey.


Are you looking for more content like this? Follow me on LinkedIn & Twitter!