

Handling Canary Releases with the Edge Stack API Gateway

Kayode Adeniyi
December 1, 2023 | 11 min read

In this walkthrough, we’ll explore the transformative potential of canary releases in software development, demonstrating step by step how canary releases let you deploy new features and updates with confidence while minimizing risk. We’ll also see how the Edge Stack API Gateway makes canary deployments easy to implement and manage. By the end, you’ll have the knowledge you need to leverage canary releases confidently in your projects.

Prerequisites

To get the most out of this tutorial, you’ll need:

  • A basic understanding of JavaScript and Node.js
  • Familiarity with how Docker and Docker Compose work
  • A basic understanding of Edge Stack API Gateway
  • Docker and Postman installed on your computer

What are Canary Releases?

Have you ever wondered how companies introduce new production versions or updates without affecting the user experience? Well, that's where Canary releases come into play.


Canary releases are a deployment strategy that software development teams use to test new versions of the application or features on a small subset of users before rolling them out to everyone. The idea behind this is to minimize the risk of introducing new features or updates that may negatively impact the application's user experience.


Here's how Canary releases or deployments work: instead of deploying a new feature or update to all users simultaneously, the software development team deploys it to a small subset of users, typically 1-5%. This small subset of users is called the Canary group. The Canary group is then closely monitored to see if the new feature or update works as expected. If everything goes well, the new feature or update is gradually rolled out to more users until it's available to everyone. The rollout can be stopped or rolled back to the previous version if issues arise.


Canary releases are essential in deploying new features or updates without negatively affecting user experience. Companies like Google, Facebook, and Netflix incorporate canary deployments in their software development cycles to gather feedback for regression testing. Canary releases have proven effective in minimizing the risk of introducing new features, making them a crucial step in version deployments.


How do you perform canary releases using the Edge Stack API Gateway?

The Edge Stack API Gateway offers precise control over canary releases by employing a weighted round-robin method to distribute traffic across services. This is complemented by detailed metrics collected for all services, making it easy to compare the performance of the canary and production services.


The weight attribute:

The weight attribute determines the proportion of traffic to be directed to a specific resource via a particular mapping. The attribute's value, expressed as an integer percentage ranging from 0 to 100, is balanced by Edge Stack to ensure that the total weight of all mappings associated with a resource equals 100%. When only one mapping is present, it receives 100% of the traffic, regardless of its weight assignment.


Assigning a weight to a mapping is relevant only when multiple mappings are designated for the same resource, and it is generally not advisable to assign a weight to the "default" mapping, which is typically responsible for handling most traffic. Allowing Edge Stack to direct all unassigned traffic to the default mapping simplifies updating weights.
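For illustration, here is a minimal sketch of that pattern using hypothetical service names (not the tutorial’s repo). The default mapping carries no weight and absorbs whatever traffic is left over, while the canary mapping claims an explicit share:


---
apiVersion: getambassador.io/v2
kind: Mapping
name: my_service_mapping
prefix: /my-service/
service: my-service-stable:3000
---
apiVersion: getambassador.io/v2
kind: Mapping
name: my_service_canary_mapping
prefix: /my-service/
service: my-service-canary:3000
weight: 10


With this configuration, roughly 10% of requests to /my-service/ go to the canary, and Edge Stack sends the remaining 90% to the default mapping; raising the canary’s weight later only requires editing one number.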


Now that you understand how the weight attribute works in Edge Stack, let’s talk about the sample application we’ll use for this tutorial.


Let’s get hands-on with Edge Stack API Gateway!

In this section, we will clone a Node.js backend application and walk through its architecture. We will then run the application locally and verify that it works by testing it with Postman. The application was built ahead of this tutorial so you can focus on grasping the concept of canary deployment and its implementation with Edge Stack, rather than coding the whole application yourself.


Step 1: Understand How the Sample Node.js App Works


This simple architecture depicts how Edge Stack balances requests to an API endpoint across two Node.js applications. Both applications expose the same endpoint but return different responses for demonstration purposes.


Furthermore, the architecture shows that 60% of the traffic is routed to the canary application, while 40% is routed to the simple application. This is achieved by configuring the routes in the Edge Stack mapping with the weight attribute discussed in the previous section.

Step 2: Clone the Application

It is time to clone the Node.js project whose architecture we previously discussed. To clone the project, run the following command in your terminal:

git clone git@github.com:Adeniyikayodee/AmbassadorCanary.git


Application directory and code:

canary-deployment
|-----backend
|     |----- simple_application
|     |----- canary_application
|     |----- ambassador
|     |      |----- application_mapping.yaml
|-----release
|-----docker-compose.yaml


For context:

  • simple_application: This backend application exposes a GET endpoint that serves a message in the response (a minimal sketch follows this list).
  • canary_application: This backend application is identical to simple_application, except that its GET endpoint returns a different message so it can be identified as the canary release.
  • ambassador: This directory has a file with routes configured to hit the appropriate endpoints of both the backend applications.
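For illustration only (the actual code in the repo may differ), a minimal sketch of the simple_application endpoint, assuming Express, could look like this; canary_application would expose the same route on port 3001 with a message identifying it as the canary:


// simple_application/index.js — hypothetical sketch, assuming Express is used
const express = require("express");
const app = express();

// Same route as canary_application; only the response message differs.
app.get("/v1/content/fetch", (req, res) => {
  res.json({ message: "Hello from the simple application!" });
});

app.listen(3000, () => console.log("simple_application listening on port 3000"));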

application_mapping.yaml:

---
apiVersion: getambassador.io/v2
kind: Mapping
name: get_content
prefix: /v1/content/
rewrite: ''
service: canary_application:3001
weight: 60
---
apiVersion: getambassador.io/v2
kind: Mapping
name: get_canary_content
prefix: /v1/content/
rewrite: ''
service: simple_application:3000
weight: 40

Mapping Attribute Explanation

The configuration file above defines routing rules for HTTP traffic within the Kubernetes cluster.

  • The Mapping object is used to tell Edge Stack how to route traffic.
  • The apiVersion attribute specifies the API version for Edge Stack that the configuration is designed for.
  • `kind` denotes that this resource is a `Mapping`, which is used to manage how requests are forwarded to services.
  • The `name` attribute gives a unique identifier to each route, making it possible to distinguish between them.
  • `prefix` specifies the initial part of the URL path that Edge Stack will match against incoming requests.
  • The `rewrite` field, left empty in this example, would normally be used to alter the path of the request before forwarding it to the specified service.
  • The `service` attribute indicates the name of the service to which traffic should be routed, along with the port number on which the service is exposed.
  • Finally, `weight` is used for canary deployments where you might want to introduce a new version of the application and gradually shift traffic to it. It defines the proportion of traffic that should be forwarded to each service.

For this specific YAML:

1. The first `Mapping` routes 60% of traffic hitting `/v1/content/` to the `canary_application` service on port `3001`.

2. The second `Mapping` routes the remaining 40% of traffic to the `simple_application` service on port `3000`.


Run the following command to apply these routing rules to your cluster:


kubectl apply -f application_mapping.yaml


This will configure the traffic routing as specified, which is useful for canary deployments or A/B testing.
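If you are applying the file to a Kubernetes cluster with the Edge Stack CRDs installed, you can sanity-check that both Mappings were created with a standard kubectl listing:


kubectl get mappings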

Step 3: Test the Application

Now that we’re familiar with the application’s architecture and Edge Stack’s configuration, it’s time to test it out. To start the application, run the following command from the root of the project in your terminal:

sudo docker-compose up -d --build


To list the running Docker containers, run the following command in your terminal:

sudo docker ps

Open Postman and hit the endpoint given below several times to evaluate the canary routing:


http://localhost:8080/v1/content/fetch


In the first image, the request is routed to the simple_application backend, which has 40% of the traffic allocated to it.


The second image shows the traffic being routed to the canary_application, which has 60% of the traffic allocated to it.


Both images demonstrate Edge Stack's load balancing mechanism, which distributes incoming requests to different service instances based on pre-configured weights assigned to each route. This functionality is integral to Edge Stack API Gateway, facilitating the execution of canary releases.
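If you would rather quantify the split from the command line than eyeball it in Postman, here is a small sketch (assuming Node 18+ for its built-in fetch, and the gateway listening on localhost:8080 as above) that fires a batch of requests and tallies the distinct response bodies:


// canary-check.js — hypothetical helper script, not part of the repo
const ENDPOINT = "http://localhost:8080/v1/content/fetch";
const REQUESTS = 50;

async function main() {
  const counts = {};
  for (let i = 0; i < REQUESTS; i++) {
    const body = await (await fetch(ENDPOINT)).text();
    counts[body] = (counts[body] || 0) + 1;
  }
  // With weights of 60/40, the two response bodies should appear in roughly that ratio.
  console.log(counts);
}

main().catch(console.error);


Run it with node canary-check.js; the exact counts vary from run to run, but the tally should trend toward the 60/40 split.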


Using weighted routing allows for incremental testing of new features in the production environment, following their validation in development or local test environments.


Consequently, it is possible to introduce changes to production with minimal risk and higher confidence in the new version's stability.
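To continue the rollout from here, you would typically edit the weights in application_mapping.yaml and re-apply the file. As a sketch (the same two mappings, with the split shifted once the canary has proven stable):


---
apiVersion: getambassador.io/v2
kind: Mapping
name: get_content
prefix: /v1/content/
rewrite: ''
service: canary_application:3001
weight: 90
---
apiVersion: getambassador.io/v2
kind: Mapping
name: get_canary_content
prefix: /v1/content/
rewrite: ''
service: simple_application:3000
weight: 10


Once you are fully confident in the new version, you can route 100% of the traffic to it and retire the old mapping.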

That’s a Wrap on Canary Releases & Edge Stack

This article aimed to shed light on what canary releases are, how they work with the Edge Stack API Gateway, and how Edge Stack handles routing, illustrated with an architecture diagram. We also explained how to configure Edge Stack to load balance traffic across servers, which helped show how canary deployments work. Hopefully, you’re a canary releases pro now!


Here you can learn more about Edge Stack and its architecture, or explore the Edge Stack API Gateway documentation.


Good luck Canary-ing! 😇