Event-Driven Architectures with AWS DynamoDB Streams

November 2nd, 2018 | Greg Straw | Development, Engineering

We’ve been working with AWS DynamoDB Streams on our customer projects, and we’ve found it to be a really powerful solution for creating event-driven pipelines. Streams is one of the more recent features added to DynamoDB, and it can help address some of the common challenges when migrating from a relational schema to DynamoDB.

What are DynamoDB Streams?

If you don’t know the basics of DynamoDB, there are many good articles, blog posts, and of course the AWS documentation to get you started. I’d like to focus on some of the use cases and benefits of one of its more recent features: DynamoDB Streams.

DynamoDB Streams enable you to trigger downstream actions based on the activity occurring in a DynamoDB table. Every time an item is added, modified, or removed, the change is captured as a stream record that downstream consumers can react to. This feature is very powerful when integrated with other AWS components.
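The most common integration is wiring the stream to an AWS Lambda function. Here’s a minimal sketch of what such a handler might look like; the handler name and the sample item key are invented for illustration, but the event shape follows the format Lambda delivers for DynamoDB stream triggers:

```python
# Hypothetical Lambda handler for a DynamoDB stream trigger.
# Each record's eventName is INSERT, MODIFY, or REMOVE, and the
# "dynamodb" field carries the keys (and images, depending on the
# stream view type) in DynamoDB attribute-value JSON.

def handler(event, context):
    """Process a batch of DynamoDB stream records and return what was done."""
    processed = []
    for record in event["Records"]:
        event_name = record["eventName"]
        keys = record["dynamodb"]["Keys"]
        # Route each change type to the appropriate downstream action.
        if event_name == "INSERT":
            processed.append(("insert", keys))
        elif event_name == "MODIFY":
            processed.append(("modify", keys))
        elif event_name == "REMOVE":
            processed.append(("remove", keys))
    return processed


# Sample event with an invented table key, mimicking what Lambda passes in.
sample_event = {
    "Records": [
        {
            "eventName": "INSERT",
            "dynamodb": {"Keys": {"pk": {"S": "user#123"}}},
        }
    ]
}
```

In a real deployment Lambda batches records and invokes the handler for you; the sketch above just shows the per-record routing logic.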

When streams are enabled for a given table, AWS stands up a new service endpoint for making API requests specific to stream events. You can choose one of four view types that determine what each stream record contains:

  1. KEYS_ONLY – only the key attributes of the changed item
  2. NEW_IMAGE – the item as it appears after the change
  3. OLD_IMAGE – the item as it appeared before the change
  4. NEW_AND_OLD_IMAGES – both the new and the old images of the item
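NEW_AND_OLD_IMAGES is handy when a consumer needs to know not just that an item changed, but *what* changed. A rough sketch of diffing the two images follows; the record shape mirrors DynamoDB’s attribute-value JSON, and the item attributes (`pk`, `status`) are invented for illustration:

```python
# Sketch of inspecting a MODIFY record captured with NEW_AND_OLD_IMAGES.

def changed_attributes(record):
    """Return attribute names whose values differ between old and new images."""
    old = record["dynamodb"].get("OldImage", {})
    new = record["dynamodb"].get("NewImage", {})
    # Union of names covers attributes that were added or removed, too.
    names = set(old) | set(new)
    return sorted(n for n in names if old.get(n) != new.get(n))


# Invented example record: an order whose status moved from PENDING to SHIPPED.
record = {
    "eventName": "MODIFY",
    "dynamodb": {
        "OldImage": {"pk": {"S": "order#1"}, "status": {"S": "PENDING"}},
        "NewImage": {"pk": {"S": "order#1"}, "status": {"S": "SHIPPED"}},
    },
}
```

With KEYS_ONLY you would instead have to re-read the item to learn its current state, so the view type you pick is a trade-off between stream payload size and follow-up reads.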
