Categories: Data

Querying AWS Healthlake from Go

When working with healthcare data, one of the things that's often mentioned or discussed is "Is your data interoperable?" As a developer and an architect, that's a really loaded word to me, because if I can expose my data over files, APIs, or some consistent channel like TCP, then by definition my system is interoperable. Per my Mac dictionary, "interoperable" is an adjective defined as: (of computer systems or software) able to exchange and make use of information.

However, where things get a little more nuanced is when the definition includes some common healthcare-specific formats, and more specifically HL7's FHIR. When you have this problem, there are certain tools that you need to use. There are several open source solutions you could select, but when you are an AWS customer, you start with AWS first. And they just so happen to have a set of capabilities wrapped around a product called Healthlake.
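
To give a taste of what querying it from Go can look like, here's a minimal sketch of hitting a Healthlake FHIR endpoint with a SigV4-signed request using the aws-sdk-go-v2 signer. The data store endpoint URL and resource path are placeholders, and error handling is trimmed; this isn't the post's full code.

```go
package main

import (
	"context"
	"fmt"
	"io"
	"net/http"
	"time"

	v4 "github.com/aws/aws-sdk-go-v2/aws/signer/v4"
	"github.com/aws/aws-sdk-go-v2/config"
)

// SHA-256 of an empty body, required by the SigV4 signer for a GET request.
const emptyPayloadHash = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

func main() {
	ctx := context.Background()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		panic(err)
	}

	// Placeholder data store endpoint -- yours comes from the Healthlake console/CLI.
	url := "https://healthlake.us-east-1.amazonaws.com/datastore/<datastore-id>/r4/Patient"

	req, _ := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)

	creds, err := cfg.Credentials.Retrieve(ctx)
	if err != nil {
		panic(err)
	}

	// Healthlake's FHIR REST API expects SigV4-signed requests (service name "healthlake").
	signer := v4.NewSigner()
	if err := signer.SignHTTP(ctx, creds, req, emptyPayloadHash, "healthlake", cfg.Region, time.Now()); err != nil {
		panic(err)
	}

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```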

Categories: Infrastructure

Handling “Poison Pill” Messages with AWS Kinesis and Lambdas

Queues and streams are fundamentally different in how they handle readers consuming their information.

With an SQS Queue you can have many consumers, but generally one consumer will win the read on a given message; on success the message is purged from the queue, and on failure it is returned to the queue. Technically it doesn't get deleted when it's read, its visibility property is changed, which is why the VisibilityTimeout on the queue matters. If your code takes longer to process a message than that property allows, you are going to get messages that constantly get put back on the queue for retry.
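
To make that concrete, here's a minimal consumer sketch (mine, not from the post) using the aws-sdk-go-v2 SQS client; the queue URL and processMessage are placeholders. If processMessage takes longer than the visibility timeout, the message becomes visible again and another receive picks it up.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/sqs"
	"github.com/aws/aws-sdk-go-v2/service/sqs/types"
)

// processMessage is a stand-in for your real handler.
func processMessage(msg types.Message) error {
	fmt.Println("processing:", aws.ToString(msg.Body))
	return nil
}

func main() {
	ctx := context.Background()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}

	client := sqs.NewFromConfig(cfg)
	queueURL := "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue" // placeholder

	out, err := client.ReceiveMessage(ctx, &sqs.ReceiveMessageInput{
		QueueUrl:            aws.String(queueURL),
		MaxNumberOfMessages: 10,
		WaitTimeSeconds:     20,
		VisibilityTimeout:   30, // seconds this receive "hides" each message from other consumers
	})
	if err != nil {
		log.Fatal(err)
	}

	for _, msg := range out.Messages {
		if err := processMessage(msg); err != nil {
			// Do nothing: once the visibility timeout lapses the message
			// reappears on the queue and gets retried.
			log.Println("processing failed, message will reappear:", err)
			continue
		}
		// Success: explicitly delete, otherwise the message comes back too.
		if _, err := client.DeleteMessage(ctx, &sqs.DeleteMessageInput{
			QueueUrl:      aws.String(queueURL),
			ReceiptHandle: msg.ReceiptHandle,
		}); err != nil {
			log.Println("delete failed:", err)
		}
	}
}
```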

Categories: Serverless

Creating an Async Integration with AWS Step Functions from API Gateway via CDK

I often have the scenario where a client makes a request to an endpoint and just wants to make sure the payload was delivered, but isn't necessarily concerned about the outcome. A pretty simple Async operation that happens over a quick Sync channel.

In the past, I've done my best with a Lambda function, keeping it so simple that it was practically incapable of failure. As I progressed further into that solution, I started using AWS integrations to drop the payload off in an SQS Queue and then having a Lambda read that queue and decide what to do.
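
For reference, the SQS-plus-Lambda half of that older pattern looks roughly like this in Go with aws-lambda-go; routePayload is a hypothetical stand-in for the "decide what to do" step, not code from the post.

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// routePayload is a hypothetical stand-in for "decide what to do".
func routePayload(body string) error {
	log.Println("routing payload:", body)
	return nil
}

// handler receives a batch of messages pulled from the SQS queue that the
// API Gateway service integration dropped payloads into.
func handler(ctx context.Context, event events.SQSEvent) error {
	for _, record := range event.Records {
		if err := routePayload(record.Body); err != nil {
			// Returning an error makes the batch eligible for retry.
			return err
		}
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```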

Categories: Observability

Tracing HTTP Requests with Go and Datadog

A small follow-up on the last post regarding tracing. I'm a huge fan of Event Driven systems, or EDA (Event Driven Architecture), but sometimes you do need to make that synchronous HTTP request in order to fetch more data. Perhaps you are building a "saga", or perhaps the event only published what happened and to whom it happened, but not the specifics of the actual event. For that you need to reach back out and fetch more info.

When that happens, you'll need an HTTP client to make that request. And when doing so, it often turns into a bit of a black hole, especially if you have multiple calls to make and need to distinguish them. Enter, again, the Datadog libraries. With a simple wrapping of the client, when you make requests WithContext you get a much nicer display of what the span is. In the case below, I usually like to set the VERB that was requested in addition to the URL. Feel free to use/show whatever makes sense to you.
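
The original code sample isn't included in this excerpt, so here's a minimal sketch of that wrapping with dd-trace-go's net/http contrib package; the resource-namer callback that puts the verb and URL on the span is my assumption of the approach described above, and the service name and URL are placeholders.

```go
package main

import (
	"context"
	"fmt"
	"net/http"

	httptrace "gopkg.in/DataDog/dd-trace-go.v1/contrib/net/http"
	"gopkg.in/DataDog/dd-trace-go.v1/ddtrace/tracer"
)

func main() {
	tracer.Start(tracer.WithService("fetch-more-info")) // placeholder service name
	defer tracer.Stop()

	// Wrap the client so every request produces a span. The resource namer
	// puts "VERB URL" on the span instead of a generic name.
	client := httptrace.WrapClient(http.DefaultClient,
		httptrace.RTWithResourceNamer(func(req *http.Request) string {
			return fmt.Sprintf("%s %s", req.Method, req.URL.String())
		}),
	)

	// Requests made WithContext join whatever span is already on the context.
	req, _ := http.NewRequestWithContext(context.Background(),
		http.MethodGet, "https://example.com/orders/123", nil) // placeholder URL

	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.StatusCode)
}
```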

Categories: Infrastructure

AWS CDK Pipeline

Deploying code (assets) into AWS has never been easier than it is right now. A few months back our engineering team made the decision to go all in on AWS CDK, and with that came the need/desire for full pipeline automation. We'd been using a smattering of Python/Node, CloudFormation, and CodeCommit plus CodePipeline code for all of our services, and honestly it works fine once it's set up, but getting it set up per service became a pain. And making modifications for the idiosyncrasies of some of the services was just plain awful. So off we went, and during that exploration phase we found the opinionated little construct called AWS CDK Pipelines. Below is our walkthrough of what it all meant for us.
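
The walkthrough itself isn't in this excerpt, but as a rough idea of what the construct looks like, here's a minimal CDK Pipelines sketch using the Go bindings; the repository, branch, and CodeStar connection ARN are placeholders, and a real pipeline would go on to add application stages.

```go
package main

import (
	"github.com/aws/aws-cdk-go/awscdk/v2"
	"github.com/aws/aws-cdk-go/awscdk/v2/pipelines"
	"github.com/aws/jsii-runtime-go"
)

func main() {
	app := awscdk.NewApp(nil)
	stack := awscdk.NewStack(app, jsii.String("PipelineStack"), &awscdk.StackProps{})

	// A self-mutating pipeline: the synth step builds the CDK app, and the
	// pipeline updates itself whenever its own definition changes.
	pipelines.NewCodePipeline(stack, jsii.String("Pipeline"), &pipelines.CodePipelineProps{
		Synth: pipelines.NewShellStep(jsii.String("Synth"), &pipelines.ShellStepProps{
			// Placeholder repo/branch/connection -- swap in your own source.
			Input: pipelines.CodePipelineSource_Connection(
				jsii.String("my-org/my-service"),
				jsii.String("main"),
				&pipelines.ConnectionSourceOptions{
					ConnectionArn: jsii.String("arn:aws:codestar-connections:us-east-1:123456789012:connection/placeholder"),
				},
			),
			Commands: jsii.Strings("npm install -g aws-cdk", "cdk synth"),
		}),
	})

	// Application stages (dev, prod, ...) would be added with pipeline.AddStage.
	app.Synth(nil)
}
```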

Categories: Infrastructure

Intro to CDK

AWS CDK (Cloud Development Kit) is a new way to develop cloud infrastructure on AWS by bringing your favorite programming language to bear on abstractions over CloudFormation. This won't be a super in-depth post on the tech and how to apply it (I'll follow up with more articles later), but I'd like to outline some of the benefits and reasons you might consider coding up your next feature's infrastructure with it.
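
To give a flavor of what "abstractions over CloudFormation" means, here's a minimal CDK app in Go (my example, not from the post) that synthesizes a stack containing a single versioned S3 bucket; the stack and construct names are arbitrary.

```go
package main

import (
	"github.com/aws/aws-cdk-go/awscdk/v2"
	"github.com/aws/aws-cdk-go/awscdk/v2/awss3"
	"github.com/aws/jsii-runtime-go"
)

func main() {
	app := awscdk.NewApp(nil)

	// One stack maps to one CloudFormation stack; constructs inside it become resources.
	stack := awscdk.NewStack(app, jsii.String("IntroStack"), &awscdk.StackProps{})

	// A few lines of Go instead of hand-written CloudFormation YAML/JSON.
	awss3.NewBucket(stack, jsii.String("AssetsBucket"), &awss3.BucketProps{
		Versioned: jsii.Bool(true),
	})

	// Emits the CloudFormation template to cdk.out (used by `cdk synth` / `cdk deploy`).
	app.Synth(nil)
}
```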