Categories: Serverless

Customize a Cognito Access Token with Rust

Identity and Access Management is a critical part of any application, and having a solution that supports customization can be just as important. Take, for instance, the ability to customize a Cognito access token to extend its functionality.

So many times developers and architects try to roll their own solution, and while they do their best to meet the OAuth and OIDC specifications, they tend to fall short. Not to mention they end up with more maintenance and scaling issues than they planned for. By leveraging a Serverless identity platform like Cognito, developers and architects gain a service that takes care of the heavy lifting of identity and access for a user base of one to essentially as many as needed.

However, until very recently there was a functionality gap that, frankly, encouraged some insecure usage: developers were using ID tokens as access tokens because only ID tokens could be customized within a Cognito sign-in workflow. That is no longer the case, as access tokens can now be customized as well. In this article, I want to take a look at how to customize a Cognito access token with Rust.

“AWS’ Cognito allows you to implement frictionless customer identity and access management that scales.” (AWS)
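To make that concrete, here's a minimal sketch of what a Pre Token Generation trigger could look like in Rust with lambda_runtime. The claim names (tenantId, plan), the custom scope, and the exact V2_0 event shape are assumptions for illustration, not the final code from this walkthrough.

```rust
use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde_json::{json, Value};

// Pre Token Generation (V2_0) trigger handler. The event shape here is an
// assumption based on the documented trigger payload; adjust it to match
// your user pool's configuration.
async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let mut payload = event.payload;

    // Add custom claims and a scope to the *access* token, not just the ID token.
    payload["response"]["claimsAndScopeOverrideDetails"] = json!({
        "accessTokenGeneration": {
            "claimsToAddOrOverride": {
                "tenantId": "tenant-1234", // hypothetical claim
                "plan": "premium"          // hypothetical claim
            },
            "scopesToAdd": ["custom/read"] // hypothetical scope
        }
    });

    // Cognito expects the full event back with the response section filled in.
    Ok(payload)
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}
```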
Categories: Serverless

Web API with Rust and Fargate

I’ve been spending more and more time with Rust which is perfectly in line with my end-of-year planning. So far, my collection of articles is growing and so are my Rust skills. I can’t profess to be more than a novice at this point, but when has that stopped me from crafting something meaningful around a piece of technology? I hope you find this entry helpful and insightful as I dive into building a Web API with Rust and Fargate.
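As a taste of where this is heading, here's a minimal sketch of a containerizable Rust web API. I'm assuming axum here, along with a /health route and port 3000; the actual service built in the article may differ.

```rust
use axum::{routing::get, Json, Router};
use serde::Serialize;

#[derive(Serialize)]
struct Health {
    status: &'static str,
}

// A simple health-check handler; a Fargate task behind an ALB can use this
// path for container health checks.
async fn health() -> Json<Health> {
    Json(Health { status: "ok" })
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/health", get(health));

    // Bind to 0.0.0.0 so the container is reachable from the task's ENI.
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000")
        .await
        .expect("failed to bind");
    axum::serve(listener, app).await.expect("server error");
}
```

Because the process listens on 0.0.0.0 inside the container, the same binary runs unchanged locally, in Docker, or as a Fargate task behind a load balancer.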

Categories: Serverless

API Gateway, Lambda, DynamoDB and Rust

It’s been a few weeks since I last wrote an article on using Rust with AWS. Between then and now, AWS officially released its Rust SDK for interacting with many of its services. If there was a barrier in my mind before this about using something in production that wasn’t generally available, that barrier is now gone. I also made a public commitment to building more examples in Rust in 2024, and while I’m a few weeks early, I just can’t contain my enthusiasm for learning this language that feels like nothing I’ve worked with before. Let’s take a look at building an API with API Gateway, Lambda, DynamoDB and Rust.
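To set the stage, here's a rough sketch of a Lambda handler behind API Gateway that writes an item to DynamoDB with the now-GA Rust SDK. The table name, attribute names, and response shape are placeholders for illustration, not the article's final code.

```rust
use aws_sdk_dynamodb::{types::AttributeValue, Client};
use lambda_http::{run, service_fn, Body, Error, Request, Response};
use uuid::Uuid;

// Write a single item to a table, then return its id in the HTTP response.
async fn handler(client: &Client, _event: Request) -> Result<Response<Body>, Error> {
    let id = Uuid::new_v4().to_string();

    client
        .put_item()
        .table_name("SampleTable") // hypothetical table name
        .item("id", AttributeValue::S(id.clone()))
        .item("createdBy", AttributeValue::S("rust-lambda".into()))
        .send()
        .await?;

    Ok(Response::builder()
        .status(200)
        .header("content-type", "application/json")
        .body(Body::from(format!("{{\"id\":\"{id}\"}}")))?)
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Build the DynamoDB client once, outside the handler, so it is reused
    // across invocations of the same execution environment.
    let config = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
    let client = Client::new(&config);

    run(service_fn(|event| handler(&client, event))).await
}
```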

Categories: Data, Programming, Serverless

Partitioned S3 Bucket from DynamoDB

I’ve been working recently with some data that doesn’t naturally fit into my AWS HealthLake datastore. I have some additional information captured in a DynamoDB table that would be useful to blend with HealthLake but isn’t a FHIR resource on its own. I pondered this for a while and came up with the idea of piping DynamoDB Stream changes into S3 so that I could then pick them up with AWS Glue. In this article, I want to show you an approach to building a partitioned S3 bucket from DynamoDB. Refining that further with Glue jobs, tables and crawlers will come later.
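A rough sketch of the idea: a Lambda consumes the DynamoDB Stream batch and drops each record into S3 under a Hive-style year=/month=/day= prefix that Glue can crawl later. The bucket name and key layout below are assumptions, not the finished solution.

```rust
use aws_sdk_s3::{primitives::ByteStream, Client};
use chrono::Utc;
use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde_json::Value;

// Write each DynamoDB Stream record from the Lambda event to S3 under a
// date-partitioned prefix so Glue can pick the objects up later.
async fn handler(client: &Client, event: LambdaEvent<Value>) -> Result<(), Error> {
    // e.g. raw/year=2023/month=12/day=05
    let prefix = Utc::now().format("raw/year=%Y/month=%m/day=%d").to_string();

    if let Some(records) = event.payload["Records"].as_array() {
        for (i, record) in records.iter().enumerate() {
            let key = format!("{prefix}/{}-{}.json", event.context.request_id, i);
            let body = serde_json::to_vec(record)?;

            client
                .put_object()
                .bucket("my-partitioned-bucket") // hypothetical bucket
                .key(key)
                .body(ByteStream::from(body))
                .send()
                .await?;
        }
    }

    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    let config = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
    let client = Client::new(&config);
    run(service_fn(|event| handler(&client, event))).await
}
```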

Categories: Programming, Infrastructure, Serverless

Consuming an SQS Event with Lambda and Rust

I’ve been trying to learn Rust for the better part of this year. My curiosity was piqued a few years back when I learned that the AWS-led Firecracker was developed in the language, and I’ve wanted to learn it ever since. Fast-forward, and I’m jumping in with both feet. That’s usually how I work. I must admit that right now I’m the most noob of noobs, but that’s not going to keep me from sharing what I’m up to and what I’m learning. For me, this blog is as much about sharing as it is about learning, and about communicating to those reading that it’s OK to be where you are in your journey. There are no straight lines, only periods of growth and plateaus. In this article, I’ll walk you through consuming an SQS event with Lambda and Rust.
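For a picture of where this lands, here's a minimal SQS-triggered handler using lambda_runtime and aws_lambda_events. It only logs each message body; the real processing logic would go in its place.

```rust
use aws_lambda_events::event::sqs::SqsEvent;
use lambda_runtime::{run, service_fn, Error, LambdaEvent};

// Handle a batch of SQS messages delivered by the Lambda event source mapping.
async fn handler(event: LambdaEvent<SqsEvent>) -> Result<(), Error> {
    for record in event.payload.records {
        let body = record.body.unwrap_or_default();
        // Placeholder for real work: parse the body, write to a datastore, etc.
        println!("message_id={:?} body={}", record.message_id, body);
    }
    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}
```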

Categories: Serverless, Data

DynamoDB Incremental Export with Step Functions

When working on building solutions, the answer to some problems is often “it depends.” For instance, if I need to deal with data as it changes and I’m using DynamoDB, streams are the perfect feature to take advantage of. However, some data doesn’t need to be dealt with in real time; once a day or every 30 minutes might be good enough. This was a problem up until recently, when AWS released incremental exports for DynamoDB. In this article, I want to explore building an incremental export with DynamoDB and Step Functions.
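At the heart of the workflow is the ExportTableToPointInTime call with an incremental export specification. The sketch below shows roughly what that looks like with the Rust SDK; the ARNs, bucket, window and even some of the type names are assumptions on my part, so double-check them against the aws-sdk-dynamodb docs rather than treating this as the article's final code.

```rust
use aws_sdk_dynamodb::primitives::DateTime;
use aws_sdk_dynamodb::types::{
    ExportFormat, ExportType, ExportViewType, IncrementalExportSpecification,
};
use aws_sdk_dynamodb::Client;

// Kick off an incremental export for a given time window. In the full
// workflow, a Step Functions task owns the window bookkeeping; here the
// epochs are simply passed in.
async fn start_incremental_export(
    client: &Client,
    from_epoch: i64,
    to_epoch: i64,
) -> Result<String, aws_sdk_dynamodb::Error> {
    let spec = IncrementalExportSpecification::builder()
        .export_from_time(DateTime::from_secs(from_epoch))
        .export_to_time(DateTime::from_secs(to_epoch))
        .export_view_type(ExportViewType::NewAndOldImages)
        .build();

    let output = client
        .export_table_to_point_in_time()
        .table_arn("arn:aws:dynamodb:us-east-1:123456789012:table/SampleTable") // placeholder
        .s3_bucket("my-export-bucket") // placeholder
        .s3_prefix("incremental")
        .export_format(ExportFormat::DynamodbJson)
        .export_type(ExportType::IncrementalExport)
        .incremental_export_specification(spec)
        .send()
        .await?;

    // The export ARN is what a later polling step uses to check status.
    Ok(output
        .export_description()
        .and_then(|d| d.export_arn())
        .unwrap_or_default()
        .to_string())
}
```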

Categories: Serverless, Programming

WebSocket with AWS API Gateway

I was working recently with some backend code, and I needed to communicate the success or failure of an operation back to my UI. I instantly knew that I needed to put together a WebSocket to handle this interaction between the backend and the front end. With all of the Serverless and non-Serverless options out there, though, which way do I go? How about plain old WebSockets with AWS API Gateway and Serverless?
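The backend half of that interaction boils down to one call: pushing a payload to a stored connection id through the API Gateway Management API. Here's a sketch in Rust; the callback endpoint URL and connection id are placeholders you'd normally capture on the $connect route and stash somewhere like DynamoDB.

```rust
use aws_sdk_apigatewaymanagement::{config::Builder, primitives::Blob, Client};

// Push a message to a connected WebSocket client through the API Gateway
// Management API. The endpoint URL looks like
// https://{api-id}.execute-api.{region}.amazonaws.com/{stage}.
async fn notify_client(
    endpoint_url: &str,
    connection_id: &str,
    message: &str,
) -> Result<(), aws_sdk_apigatewaymanagement::Error> {
    let shared = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;

    // The management client must be pointed at the WebSocket API's callback URL.
    let conf = Builder::from(&shared).endpoint_url(endpoint_url).build();
    let client = Client::from_conf(conf);

    client
        .post_to_connection()
        .connection_id(connection_id)
        .data(Blob::new(message.as_bytes().to_vec()))
        .send()
        .await?;

    Ok(())
}
```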

Categories: Programming, Serverless

DynamoDB Streams EventBridge Pipes Multiple Items

I’ve written a few articles lately on EventBridge Pipes and specifically around using them with DynamoDB Streams. I’ve written about Enrichment. And I’ve written about just straight Streaming. I believe that using EventBridge Pipes plays a nice part in a Serverless, Event-Driven approach. So in this article, I want to explore Streaming DynamoDB to EventBridge Pipes with multiple items in one table.

Several of the comments I received about streaming DynamoDB to EventBridge Pipes were along the lines of, “What if I have multiple item collections in the same table?” I intend to show a pattern for handling exactly that problem in this article. At the bottom, you’ll find a working code sample that you can deploy and build on top of. I’ve used this exact setup in production, so rest assured that it’s a solid base to start from.
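The core of the pattern is telling the item collections apart before events fan out. Below is a sketch of a Pipes enrichment Lambda that derives an entityType from the partition key prefix (e.g. ORDER#123 becomes ORDER) so downstream EventBridge rules can match on it. The single-table key convention here is an assumption for illustration, not the sample's exact code.

```rust
use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde_json::{json, Value};

// EventBridge Pipes invokes the enrichment Lambda with a batch (JSON array)
// of DynamoDB Stream records. Each record is wrapped with an "entityType"
// derived from the partition key prefix so rules can route on it.
async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let records = event.payload.as_array().cloned().unwrap_or_default();

    let enriched: Vec<Value> = records
        .into_iter()
        .map(|record| {
            let pk = record["dynamodb"]["Keys"]["PK"]["S"]
                .as_str()
                .unwrap_or_default();
            let entity_type = pk.split('#').next().unwrap_or("UNKNOWN").to_string();
            let event_name = record["eventName"].clone();

            json!({
                "entityType": entity_type,
                "eventName": event_name,
                "record": record
            })
        })
        .collect();

    // Pipes expects the enrichment step to return the (transformed) batch.
    Ok(Value::Array(enriched))
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}
```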

Categories: Data, Serverless

DynamoDB Streams EventBridge Pipes Enrichment

I’ve been wanting to spend more time lately talking about AWS HealthLake, and more specifically about Fast Healthcare Interoperability Resources (FHIR), which is the foundation for interoperability in healthcare information systems. I believe very strongly that Serverless is for more than just client- and user-driven workflows. I wrote extensively about that here, but I wanted to take a deeper dive into building out streams of dataflows. I’ve been using this pattern for quite some time in production, so let’s have a look at EventBridge Pipes enriching DynamoDB Streams.
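As a small taste of what enrichment means in practice, here's a sketch of flattening the DynamoDB-JSON NewImage from a stream record into plain JSON before it moves on to EventBridge. It's deliberately minimal and handles only a few attribute types; the real enrichment in this pattern would also pull in related data, such as the HealthLake resource a record refers to.

```rust
use serde_json::{json, Map, Value};

// Flatten a DynamoDB-JSON NewImage (e.g. {"name": {"S": "value"}}) into a
// plain JSON object. Only S, N and BOOL attribute types are handled here.
fn flatten_new_image(new_image: &Map<String, Value>) -> Value {
    let mut out = Map::new();
    for (name, attr) in new_image {
        if let Some(s) = attr["S"].as_str() {
            out.insert(name.clone(), json!(s));
        } else if let Some(n) = attr["N"].as_str() {
            // DynamoDB serializes numbers as strings; keep them as-is here.
            out.insert(name.clone(), json!(n));
        } else if let Some(b) = attr["BOOL"].as_bool() {
            out.insert(name.clone(), json!(b));
        }
    }
    Value::Object(out)
}
```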

Categories: Data, Programming, Serverless

AWS Step Functions Callback Pattern

Some operations in a system run asynchronously. Many times, those same operations are also responsible for coordinating external workflows in order to report an overall status on the execution of the main workflow. A natural fit for this problem on AWS is to use Step Functions and make use of the Callback pattern. In this article, I’m going to walk through an example of the Callback pattern while using AWS’ HealthLake and its export capabilities as the backbone for the async job. Welcome to the AWS Step Functions Callback Pattern.
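The essence of the pattern is the task token: Step Functions pauses at a .waitForTaskToken task, hands the token to the async job, and resumes only when something calls back with it. Here's a rough sketch of that completion step in Rust using the aws-sdk-sfn crate; the output payload and error strings are placeholders, not the workflow's actual contract.

```rust
use aws_sdk_sfn::Client;

// When the external job (here, the HealthLake export) finishes, whatever is
// watching it reports back to Step Functions using the task token that was
// handed out when the callback task started.
async fn complete_callback(
    client: &Client,
    task_token: &str,
    succeeded: bool,
) -> Result<(), aws_sdk_sfn::Error> {
    if succeeded {
        client
            .send_task_success()
            .task_token(task_token)
            .output("{\"status\":\"EXPORT_COMPLETE\"}") // placeholder payload
            .send()
            .await?;
    } else {
        client
            .send_task_failure()
            .task_token(task_token)
            .error("ExportFailed")
            .cause("HealthLake export did not complete")
            .send()
            .await?;
    }
    Ok(())
}
```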

Callback Workflow Solution Architecture

Let’s start with the overarching architecture diagram. The general premise of the solution is that AWS’ HealthLake allows the export of all resources “since the last time”. By using Step Functions, Lambdas, SQS, DynamoDB, S3, Distributed Maps and EventBridge, I’m going to build the ultimate Serverless Callback workflow. I feel like, outside of Kinesis and SNS, I’ve touched them all in this one.

AWS Step Functions Callback Pattern Architecture

There’s quite a bit going on here, so I’m going to break it down into the following segments:

  1. Triggering the State Machine
  2. Record Keeping and Run Status
  3. Running the Export and initiating the Callback
  4. Polling the Export and Restarting the State Machine
  5. Working the results
  6. Wrapping Up
  7. Dealing with Failure

Hang tight, there’s going to be a bunch of code and lots of detail. If you want to jump straight to the code, it’s down at the bottom.