Microservices with AWS Serverless

Developing microservices using AWS Lambda, Amazon API Gateway and Amazon DynamoDB

What are Microservices?

Microservices have been a popular way of building applications for quite some time. In contrast to monolithic applications, whose components are tightly coupled and highly interdependent, microservices are loosely coupled and independent of each other, scale easily and reduce time-to-market. A microservice should have exactly one responsibility, can be used by other services through its API, and manages its own database. It should only deal with its own domain model, never with another service’s.

Microservices Architecture

Figure 1 shows a simple microservice architecture with three independent components (order, user and payment service). Incoming requests first go through an API Gateway, which routes them to the respective microservice based on the request’s endpoint. The service then retrieves or persists data in its own database according to the client’s request.

A strong reason why the microservice architecture is gaining popularity within software development teams is that each service can be deployed independently of the other microservices an application consists of. This means faster deployments, smaller and less complex codebases, easier bug fixes, and a faster time-to-market. Another benefit is that teams can choose their preferred programming language and tech stack for each service, as each microservice usually comes with its own code repository and deployment pipeline. A Java team and a Golang team can therefore build the same application while working independently on different microservice components. All in all, development teams can work more autonomously: they can write, test and deploy code regardless of other teams.


It is important to mention, though, that these benefits come with drawbacks. For example, operations involving multiple microservices can be complex and difficult to troubleshoot. It is therefore crucial to implement proper logging and monitoring in a microservice application so that even complex operations can be traced and understood. Keeping that in mind, the microservice architecture offers a great way to make application development more efficient.

Introduction to AWS Serverless

Amazon Maintains Cloud Lead as Microsoft Edges Closer

Figure 2 shows Amazon’s position as the leading worldwide cloud infrastructure provider, with a 31% market share in the first quarter of 2024.

Amazon Web Services, or AWS for short, is currently the leading cloud infrastructure provider (see figure 2), offering a large variety of services that give software developers and enterprises the infrastructure needed to build all kinds of applications. Among these, AWS offers different deployment options, including EC2, ECS, and AWS Serverless. With the latter, developers do not manage any underlying infrastructure but rather use what AWS provides, which makes these services easy to use and maintain. A set of AWS services can be leveraged to run an application in the cloud without having to take care of things like autoscaling or replacing instances, leading to less operational overhead for developers. This set includes (but is not limited to):
 

  • AWS Lambda
  • Amazon API Gateway
  • Amazon DynamoDB
  • Amazon Simple Queue Service
  • Amazon EventBridge


These services play together well and integrate seamlessly, enabling security, data persistence, and inter-service communication through queues, notifications and events. Furthermore, one can use Amazon CloudWatch for application monitoring and alerting, which is essential when developing a distributed application.


Next to the benefits of more efficient development, less overhead and faster time-to-market, leveraging AWS Serverless also results in low costs thanks to its pay-for-value billing model: you only pay for the resources actually used and never for over-provisioning, as AWS optimizes your resources to match the low and peak demands of your application.

Figure: Golang struct that defines the properties of a payment and how they are named in a JSON object
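
Since the original image is not reproduced here, below is a minimal sketch of what such a struct could look like; the field names, types and JSON tags are illustrative assumptions rather than the original definition.

package main

// Payment is a sketch of a struct describing a payment and how its
// properties are named in a JSON object (via struct tags). All field
// names and types are assumptions for illustration.
type Payment struct {
    ID        string  `json:"id"`        // unique identifier of the payment
    OrderID   string  `json:"orderId"`   // the order this payment belongs to
    Amount    float64 `json:"amount"`    // amount to be charged
    Currency  string  `json:"currency"`  // ISO 4217 code, e.g. "EUR"
    CreatedAt string  `json:"createdAt"` // creation timestamp (RFC 3339)
}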

AWS Lambda, Amazon API Gateway & Amazon DynamoDB

Leveraging only three of AWS Serverless’ services, one can smoothly create a production-ready application. These services are AWS Lambda, Amazon API Gateway and Amazon DynamoDB, and they are explained in the following.


AWS Lambda
AWS Lambda functions offer a fast and easy way to build, deploy and maintain small service units. In contrast to EC2 instances or Kubernetes containers, a Lambda function does not run continuously; rather, it gets triggered by some sort of event or request. This also means that the pricing model is tied to how often a Lambda function is triggered and how long it runs after being triggered (with the free tier, 1,000,000 requests and 400,000 GB-seconds of compute time per month are free; for more information about AWS Lambda’s pricing model, see: https://aws.amazon.com/lambda/pricing/), which makes AWS Lambda a cost-efficient option for deploying microservices. While a Lambda function runs on demand, scaling it is managed by AWS. One thing to keep in mind, though, is that Lambda functions come with certain limits, including the following:
 

  • up to 1,000 concurrent Lambda executions per AWS region (anything beyond that results in throttling; however, a limit increase can be requested)
  • a single Lambda function invocation cannot run longer than 15 minutes (note that this fits the scope of a microservice nicely)
  • the compressed zip file uploaded to the Lambda function must not exceed 50 MB
     

All that is needed to deploy a service using a Lambda function, next to an AWS account and the necessary permissions, is to create the function with a valid runtime (including Golang, Python, Java and many more; for all current runtime options see: https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html) and upload the code the function should execute. The code needs to be a compressed zip file and can be uploaded directly or picked from an Amazon S3 location. Within that code, a handler function is needed, which serves as the entry point of the application. These steps can be done via the AWS Management Console, the AWS CLI, or frameworks like the AWS Serverless Application Model (SAM) or the Serverless Framework.
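
As an illustration, the following is a minimal sketch of such a handler in Golang using the github.com/aws/aws-lambda-go package, assuming the function is invoked by Amazon API Gateway with a JSON payment in the request body; the payment fields are hypothetical.

package main

import (
    "context"
    "encoding/json"
    "net/http"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
)

// payment mirrors the JSON body the client sends (illustrative fields only).
type payment struct {
    ID     string  `json:"id"`
    Amount float64 `json:"amount"`
}

// handler is the entry point Lambda invokes for each incoming API Gateway request.
// It parses the body and echoes it back; a real service would validate and persist it.
func handler(ctx context.Context, req events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
    var p payment
    if err := json.Unmarshal([]byte(req.Body), &p); err != nil {
        return events.APIGatewayProxyResponse{StatusCode: http.StatusBadRequest, Body: "invalid payload"}, nil
    }
    body, _ := json.Marshal(p)
    return events.APIGatewayProxyResponse{StatusCode: http.StatusOK, Body: string(body)}, nil
}

func main() {
    // lambda.Start registers the handler with the Lambda runtime.
    lambda.Start(handler)
}

The compiled binary, zipped as described above, is what gets uploaded to the function.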


Using Amazon API Gateway and Amazon DynamoDB along with AWS Lambda is a fitting choice, as all these services are serverless and integrate seamlessly (see figure 3). With Amazon API Gateway you can add authentication and authorization, caching, API versioning and rate limiting to your services. It offers a great way to enhance security, manage different environments and control traffic to your services. There is no base charge for API Gateway; you only pay for the requests that go through it. An API Gateway exposes the REST API provided by a Lambda function and invokes the function on incoming requests. Authentication for your service can be added smoothly (see the sketch after this list) by enabling it through either
 

  • IAM roles
  • Amazon Cognito
  • Lambda Authorizer (custom authorization through a dedicated lambda function)
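
As a rough sketch of the third option: a Lambda Authorizer is itself a small Lambda function that receives the caller’s token and returns an IAM policy allowing or denying the request. The token check below is a placeholder assumption, not a real validation scheme.

package main

import (
    "context"
    "errors"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
)

// handleAuth is a token-based Lambda Authorizer: API Gateway passes the
// Authorization token, and the function answers with an IAM policy that
// allows or denies invoking the API.
func handleAuth(ctx context.Context, req events.APIGatewayCustomAuthorizerRequest) (events.APIGatewayCustomAuthorizerResponse, error) {
    // Placeholder check: replace with real token validation (e.g. verifying a JWT).
    if req.AuthorizationToken != "allow-me" {
        return events.APIGatewayCustomAuthorizerResponse{}, errors.New("Unauthorized")
    }
    return events.APIGatewayCustomAuthorizerResponse{
        PrincipalID: "example-user",
        PolicyDocument: events.APIGatewayCustomAuthorizerPolicy{
            Version: "2012-10-17",
            Statement: []events.IAMPolicyStatement{{
                Action:   []string{"execute-api:Invoke"},
                Effect:   "Allow",
                Resource: []string{req.MethodArn},
            }},
        },
    }, nil
}

func main() {
    lambda.Start(handleAuth)
}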

 
Another benefit is that you can deploy different versions of a service to different stages, where each stage can have its own configuration parameters and can be rolled back while keeping a history of deployments (useful for managing e.g. dev, test and prod environments). In addition, Amazon API Gateway lets you create usage plans for your customers that define who can access an API as well as how fast and how often it can be accessed per customer, adding a further layer of security to the service.


Either one central API Gateway can be used in front of all microservices of an application, or a dedicated API Gateway per microservice can be set up, enabling more fine-grained configuration on a per-service basis.

Figure 3 depicts a basic AWS Serverless architecture setup where a client interacts with a service through Amazon API Gateway.

Amazon DynamoDB is a serverless, non-relational database that scales horizontally and offers high performance (low latency on data retrieval) and high availability (across multiple availability zones) with almost unlimited throughput and storage. It works with distributed tables that can hold items with arbitrary attributes, as long as a primary key is provided, consisting of either a partition key (HASH) or a partition and a sort key (HASH+RANGE). Each item in a table should hold all the necessary data so that no additional queries are needed. Except for the primary key, which must be defined at table creation time, all other attributes can be added over time. Furthermore, DynamoDB allows fast data access through parallel table scans and filtered queries, and supports global secondary indexes (an alternative primary key, which can be added or edited after table creation) and local secondary indexes (an alternative sort key, which must be defined at table creation time), making queries more efficient and flexible.
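
For illustration, a minimal sketch of reading a single item from such a table with the AWS SDK for Go v2 could look as follows; the table name "payments" and the key attribute "id" are assumptions.

package main

import (
    "context"
    "log"

    "github.com/aws/aws-sdk-go-v2/aws"
    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/service/dynamodb"
    "github.com/aws/aws-sdk-go-v2/service/dynamodb/types"
)

func main() {
    // Load credentials and region from the default sources (environment, config files, IAM role).
    cfg, err := config.LoadDefaultConfig(context.TODO())
    if err != nil {
        log.Fatal(err)
    }
    client := dynamodb.NewFromConfig(cfg)

    // Fetch a single payment by its partition key.
    out, err := client.GetItem(context.TODO(), &dynamodb.GetItemInput{
        TableName: aws.String("payments"),
        Key: map[string]types.AttributeValue{
            "id": &types.AttributeValueMemberS{Value: "pay-123"},
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("item: %v", out.Item)
}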
 

Taken together, the benefits of Amazon DynamoDB, such as a flexible schema, no servers to manage, zero downtime, and automatic scaling that adjusts quickly to current capacity demands, make this AWS service a well-fitting choice as the data store of a microservice.
 

Furthermore, the above-mentioned services integrate seamlessly with Amazon CloudWatch – a monitoring and alerting service – which enhances the observability, error tracing and robustness of a service and eases the developers’ everyday work.

Conclusion

Overall, AWS Serverless offers customers a way to build, develop and deploy microservices efficiently while focusing on the important parts of a service, leading to more productive development teams and thus a faster time-to-market. Of course, whether AWS Serverless is the right choice should be decided on a per-service basis and can be a complex decision involving many factors. This can be overwhelming at first, but once the initial obstacles are out of the way, development on AWS Serverless will be smooth.