
7 posts tagged with "Cloud-native"


· 10 min read

This fourth iteration of my cloud-native project, https://github.com/dfberry/cloud-native-todo, added the steps of creating the cloud resources (provisioning) and pushing code to those resources (deployment).

Diagram showing local and cloud areas with actions of provision and deployment between them.

For this cloud-native project, I knew there would be a Docker image of the project in a registry, but I wasn't sure of the fastest way to create the image from the repository, push it to the registry, or pull it into the hosting environment. Authenticating to the registry, and figuring out which tool to authenticate from, is usually what takes a minute or two. Anything that improved that auth flow would be welcome.

Sticking with tools I know to go as fast as possible, I used Azure Developer CLI for the infrastructure.

Install Azure Developer CLI as a dev container feature in Visual Studio Code

Installing Azure Developer CLI into a dev container is easy with a feature. Find the feature and add it to the ./.devcontainer/devcontainer.json.

// Features to add to the dev container. More info: https://containers.dev/features.
"features": {
  "ghcr.io/azure/azure-dev/azd:latest": {}
},

Use the Visual Studio Code command palette to select Dev Containers: Rebuild and reopen in container. Check the version of the Azure Developer CLI installed with the following command:

azd version

The response:

azd version 1.5.0 (commit 012ae734904e0c376ce5074605a6d0d3f05789ee)

Create the infrastructure code with Azure Developer CLI

I've done most of this work before in other projects, so I didn't really expect to learn anything new. However, GitHub Universe 2023 and Microsoft Ignite 2023 both took place between iteration 003 and my start on this iteration, 004. While I still used Copilot Chat as my pair-programming buddy, I also leaned into any new feature I heard of from these two industry conferences. The Azure Developer CLI's azd init feature had an update (version 1.5.0) and I wanted to see what it would do. It asked me a couple of questions, then created the required files and folders. It took what used to be hours of Bicep development and compressed it into 30 seconds. Amazing!!!

Screenshot of Visual Studio Code using azd init to create infrastructure of project.

Did it correctly configure the infrastructure for this project? Yes. When I add a second app to this project, further down the road, I'll rerun azd init in a new branch.

The azd init process created a ./next-steps.md which was a huge help in validation.

Screenshot of Visual Studio Code displaying next steps file

Get cloud resource environment variables from Azure Developer CLI

The next steps covered environment variables, because your project may need access to cloud resource secrets, connection strings, resource names, database names, and other settings created during provisioning in order to complete deployment tests. Azure Developer CLI gives you access to this list of environment variables with azd env get-values, so you can create your own .env file for your project.
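If you'd rather consume that output programmatically than redirect it straight into a .env file, the dotenv-style `KEY="value"` format is simple enough to parse. Here's a sketch of that idea; the helper name and the sample values are my own, not from the project:

```typescript
// Sketch: azd env get-values prints dotenv-style KEY="value" lines.
// This helper parses that output into a plain object.
function parseAzdEnvValues(output: string): Record<string, string> {
  const values: Record<string, string> = {};
  for (const line of output.split('\n')) {
    // Match KEY="value" (quotes optional), capture key and unquoted value.
    const match = line.match(/^([A-Za-z_][A-Za-z0-9_]*)="?(.*?)"?$/);
    if (match) {
      values[match[1]] = match[2];
    }
  }
  return values;
}

// Sample output lines, made up for illustration.
const sample = [
  'AZURE_ENV_NAME="todo-dev"',
  'SERVICE_API_ENDPOINT_URL="https://example.azurecontainerapps.io"',
].join('\n');

const env = parseAzdEnvValues(sample);
console.log(env['SERVICE_API_ENDPOINT_URL']);
```

This is essentially what dotenv does when it loads the generated .env file, so in practice writing the file and loading it with dotenv (as the test code later does) is the simpler route.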

I created a Bash script to get those values so I could test the endpoint.

#!/bin/bash
# Usage: <script> <path-for-env-file>
# Example: ./scripts/postdeploy.sh "./api-todo-test"
echo "postdeploy.sh"

set -x

echo "Getting param 1"
ENV_PATH="${1:-.}/.env"
echo "ENV_PATH: $ENV_PATH"

echo "Remove old .env file"
rm -f "$ENV_PATH"

echo "Getting values from azd"
azd env get-values > "$ENV_PATH"

# Check if the .env file exists
if [ ! -f "$ENV_PATH" ]; then
  echo "*** .env file not found at $ENV_PATH"
  exit 1
fi

# Run the npm test command
echo "Run test at $1"
cd "$1" && npm test

echo "Test completed"
exit 0

This script is called in the ./azure.yaml file in the post deployment hook:

postdeploy:
  shell: sh
  run: |
    echo "***** Root postdeploy"
    ./scripts/postdeploy.sh "./api-todo-test"

Develop containers for cloud-native apps

When I tried to use Azure Developer CLI to provision the project with azd up, the provision failed because the CLI couldn't find the tools in the environment to build and push the image to the Azure Container Registry.

Screenshot of Visual Studio Code terminal displaying result of azd up as docker tools are missing

While Docker isn't specifically required to run Azure Developer CLI, it's logical to assume if I intend to create images, I need the tools to do that. Copilot advised me to create a new Dockerfile for the dev container. This would have added another level of complexity and maintenance. Instead, I chose to use a dev container feature for docker-in-docker which leaves that complexity to the owner of the feature.

Fix for dev container won't start

I love Docker and I love dev containers but occasionally containers just don't start and the error messages are so low-level that they generally aren't helpful. The whole point of containers is that they consistently work but I develop on a Mac M1 and containers sometimes don't work well with M1.

When I added the docker-in-docker feature to the Visual Studio Code dev container and rebuilt the container, the container wouldn't start. I changed the configs, looked at the order of features, searched StackOverflow and GitHub, and chatted with Copilot. Nothing helped. Using Visual Studio Code to rebuild the dev container without the cache didn't fix it either. Which is when I knew it was my environment.

The fix was to stop the dev container, delete all containers, images, and volumes associated with the dev container and start over completely. I didn't have any other projects in dev containers so I removed everything.

# Delete all containers
docker rm -f $(docker ps -a -q)

# Delete all images
docker rmi -f $(docker images -a -q)

# Delete all volumes
docker volume rm $(docker volume ls -q)

Deploy Express.js container image to Azure

I restarted the dev container and this time it started. At that point, I tried azd up (provision & deploy) again, and it succeeded. It's impressive how the services just work together without me having to figure out how to pass integration information around.

Screenshot of Visual Studio Code with successful deployment

Then I tried the API endpoint, which is shown at the end of the azd up output. I didn't get my project from the endpoint. The "Hello World" for containers responded at the endpoint, which meant provisioning worked but deployment failed.

Find container image deployment error in Azure portal deployment log

The Azure resource group, the logical unit for all the resources in the infrastructure, has a deployment log. The Container App showed a failed status. The code is still a very simple Express.js app so the issue had to also be simple. I checked the deployment logs in the Azure portal and found the app's start script pointed to the wrong file.

Azure portal Container App deployment revision error

Following the error into the logs shows the issue: the start file is incorrect.

Azure portal Container App logs

A quick fix to the Dockerfile.

# Wrong cmd
CMD [ "pm2-runtime", "start", "server.js" ]

# Correct cmd
CMD [ "pm2-runtime", "start", "dist/start.js" ]

Then azd up and the correct endpoint worked.

Add a version header to source code

While testing the deployment, I wanted to add versioning to the app so I knew changes to the project were displayed at the endpoint. The root request returns the version found in the ./api-todo/package.json, and the APIs return an x-api-version header with the value.

// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore: Ignoring TS6059 as we want to import version from package.json
import { version } from '../../package.json';
import { Request, Response, NextFunction } from 'express';

export function setVersionHeader(_: Request, res: Response, next: NextFunction) {
  res.setHeader('x-api-version', version);
  next();
}

The curl request returns the header when using --verbose.

Visual Studio code terminal with curl request showing x-api-version header displaying version 1.0.1

Add Playwright test to validate API

When I added Playwright to the dev container and rebuilt the dev container, the container started, but Playwright and its dependencies took up too much room. I increased the size of my container and limited my testing to Chrome. I also added the installation to the ./.devcontainer/post-create-command.sh script. By adding the installation here, I can see when the container opens whether it has enough room for a big dependency like Playwright and its browsers.

#!/bin/bash
# ./.devcontainer/post-create-command.sh
sudo apt-get clean
sudo apt update
npm i -g npm@latest
npm install
chmod -R +x ./scripts
npx playwright install --with-deps
echo "Node version" && node -v
echo "NPM version" && npm -v
echo "Git version" && git --version
echo "Docker version" && docker --version

The Playwright test for the API checks the new header and the returned array of todos.

import { test, expect } from '@playwright/test';
import dotenv from 'dotenv';
dotenv.config();

const API_URL = process.env.API_TODO_URL || 'http://localhost:3000';
console.log('API_URL', API_URL);

import { version } from '../../api-todo/package.json';

test.use({
  ignoreHTTPSErrors: true, // in case your certificate isn't properly signed
  baseURL: API_URL,
  extraHTTPHeaders: {
    'Accept': 'application/vnd.github.v3+json',
    // Add authorization token to all requests.
    'Authorization': `token ${process.env.API_TOKEN}`,
  }
});

test('should get all todos', async ({ request }) => {
  const response = await request.get(`/todo`);
  expect(response.ok()).toBeTruthy();

  // Validate the x-api-version header
  const headers = response.headers();
  expect(headers).toHaveProperty('x-api-version');
  expect(headers['x-api-version']).toEqual(version);

  // Validate the response body
  const todos = await response.json();
  expect(Array.isArray(todos)).toBeTruthy();
  expect(todos.length).toEqual(3);
});

Run the test from the workspace with npm run test --workspace=api-todo-test and see that the test succeeds.
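For context, the --workspace flag relies on npm workspaces being declared in the repository's root package.json; a minimal layout (assumed here, not copied from the project) looks like:

```json
{
  "name": "cloud-native-todo",
  "private": true,
  "workspaces": [
    "api-todo",
    "api-todo-test"
  ]
}
```

With that in place, npm resolves --workspace=api-todo-test to the matching folder and runs its test script there.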

Screenshot of Visual Studio Code terminal with Playwright test results

Most fun - time savings

The best part about this project is the tooling. I can spend less time and enjoy that time more.

Cartoonish image of a clock

Currently Copilot shines with technologies that have a lot of Internet coverage including docs and troubleshooting. For this particular iteration, the only place Copilot didn't help was the annoying Docker issue when the dev container wouldn't start after adding the docker-in-docker dev container feature.

Wish list item #1 - azd test

While Azure Developer CLI provided provisioning and deployment, it didn't add testing. This seems like a natural next step for the project. It knows what the stack is because it created the infrastructure to support it. And it knows the endpoints because it displays them at the end of the deployment. Adding API tests seems within the tool's ability someday.

Wish list item #2 - docker-in-docker

Since the infrastructure required containers and the environment had the .devcontainer folder, adding docker-in-docker as a dev container feature is probably something Azure Developer CLI can fix in the future...perhaps a YAML snippet for the dev container feature in the ./next-steps.md:

"features": {
"ghcr.io/azure/azure-dev/azd:latest": {},
"ghcr.io/devcontainers/features/docker-in-docker:1":{}
},

Tips

There were a few things I found useful that I'll keep using in future development; I've gathered them in the Tips list.

Results for 004 - create resources and deploy code

Once again Copilot saved a lot of time, but it took a backseat to the amazing work Azure Developer CLI did across the entire DevOps flow. And notice there wasn't any Container Registry auth flow to deal with when pushing images. That was all wrapped up in the Azure Developer CLI auth. Another time saver.

· 4 min read

Our cloud-native adventure begins with the API layer - the magical bridge between the front-end UI and the back-end services. For our Todo project, we're keeping the API simple and efficient. Express.js is our chosen framework, a tried-and-true Node.js project. With the power of Copilot Chat, we'll be speeding through the process in no time!

How do you typically approach building a new API for a cloud-native project?

Fire Up the Dev Container

In the previous chapter of our journey, 002-developer-environment-setup, we set up a robust dev environment. Now, it's time to bring it to life! Open it in GitHub Codespaces or locally on your computer with Visual Studio Code (Docker installation required).

Whether you're the lead developer or part of a team, whether you're working on a familiar project or exploring new territories, you've got options. Visual Studio Code and the dev container for local work, or Codespaces for a cloud-based approach.

Stay in the dev container

If you are like me, your local computer may not be a workhorse, so Docker may not be running when you start working on your project. You can start using Copilot Chat locally, then realize you need the dev container for something. The Copilot Chat history stays with the environment; it doesn't move (at this time). If you are 20 questions into your conversation, with a few side trips here and there, switching environments and losing the chat to reference is frustrating.

If you are using dev containers and Copilot chat, start and stay in the container for the entire conversation.

Building an API with Copilot Chat

In just half an hour, Copilot Chat helped me create a fully functional API, complete with types, linting, tests, and a build-test workflow. Here's a sneak peek into the prompts I used:

  1. Building a todo microservice with Node.js and TypeScript.
  2. Adding tests for the todo API.
  3. Refactoring server.ts for both server and test.
  4. Modifying server.ts for CRUD operations.
  5. Providing initial sample data.
  6. Creating an OpenAPI yaml for the API.
  7. Adding an OpenAPI UI route.
  8. Setting up ESLint with Prettier.
  9. Deciding .gitignore contents.
  10. Moving openapi.yaml to the dist folder using tsc.
  11. Creating a GitHub action for linting, building, and testing.
  12. Identifying missing microservice elements.
  13. Adding type safety.

Refining Types and Refactoring

There was some back-and-forth over types and refactoring. Copilot shone in evaluating incoming API request data. After a few prompts, the validation looked like this:


import Joi, { ValidationErrorItem } from 'joi';

export interface Todo {
  id: number;
  title: string;
}

export interface PartialTodo {
  id?: unknown;
  title?: unknown;
}

const todoSchema = Joi.object({
  id: Joi.number().greater(0).required(),
  title: Joi.string().min(1).max(1000).required(),
}).unknown(false);

const todoPartialSchema = Joi.object({
  id: Joi.number().greater(0),
  title: Joi.string().min(1).max(1000).required(),
}).unknown(false);

export interface TodoValidation {
  valid: boolean;
  error: Error | null | string | ValidationErrorItem[];
  todo: Todo | PartialTodo | null;
}

export const validateTodo = (
  todo: PartialTodo,
  isNewTodo: boolean = false
): TodoValidation => {
  const schema = isNewTodo ? todoPartialSchema : todoSchema;
  const { error } = schema.validate(todo);
  if (error) {
    return {
      valid: false,
      error: error.details,
      todo: null,
    };
  }
  return { valid: true, error: null, todo };
};
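To see the rules in action without pulling in Joi, here's a dependency-free sketch that mirrors the full-todo schema (a stand-in of my own, not the project's code):

```typescript
// Mirrors todoSchema: id > 0 required, title 1-1000 chars required,
// and no keys beyond id/title (the .unknown(false) behavior).
function isValidTodo(input: Record<string, unknown>): boolean {
  const keys = Object.keys(input);
  if (keys.some((k) => k !== 'id' && k !== 'title')) return false; // unknown key
  const { id, title } = input;
  if (typeof id !== 'number' || id <= 0) return false;
  if (typeof title !== 'string' || title.length < 1 || title.length > 1000) return false;
  return true;
}

console.log(isValidTodo({ id: 1, title: 'Buy milk' }));        // true
console.log(isValidTodo({ id: 0, title: 'Buy milk' }));        // false: id must be > 0
console.log(isValidTodo({ id: 1, title: '', extra: 'x' }));    // false: empty title, unknown key
```

The real version is better, of course: Joi also reports which rule failed via error.details, which is why validateTodo returns the details array rather than a bare boolean.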

Copilot also excelled in creating logging handlers for requests and responses, and in adding those handlers to the route.

Copilot's Strengths

Copilot gave me a flying start. While some answers lacked details, running the app or tests quickly revealed any errors, which were easy to fix.

Copilot's Weaknesses

Despite the conversation and the wealth of examples, I encountered more issues than expected. For more obscure subjects, I'd recommend breaking down the steps more atomically.

Why Not Use Existing Code?

Sure, there are plenty of examples on GitHub. But navigating licenses and attributions can be tricky. I preferred to avoid any potential missteps.

Time Investment

From start to finish, the project took about 2 hours. There were a few hiccups along the way, but each step was small, making issues manageable. Considering everything, 2 hours is a solid benchmark for a proof-of-concept project.

· 6 min read

While this project is focused on cloud-native, I set up my environment the same way for every project, with GitHub repositories and dev containers. This allows me to move from my Windows laptop, to my Mac M1, and on to GitHub Codespaces when demo-ing.

YouTube video

How do you ensure a consistent developer setup on your team? Survey

Visual Studio Code + dev containers

If you haven't heard of dev containers, let me welcome you to the club. I ❤️ dev containers.

"A development container (or dev container for short) allows you to use a container as a full-featured development environment."

Visual Studio Code has taken this idea, realized how hard Docker can be for newbies, and sorted it out so it is generally painless for beginners. By the time you want to expand beyond the beginner environment, you are ready to ask the right questions.

Meet my good friend Docker

Imagine you want to work in environments other than your current computer, and you don't want to deal with the time and complexity of installations and configurations. A general configuration will do. That's kind of the mentality that eventually (down a long dark road of VirtualBox and Vagrant) led to Docker.

There are many (many!!!) versions of Docker images on Docker Hub for you. This generally just means knowing an image's name. Some images are from manufacturers and some are from community contributions, with improvements and opinions about what the image should have.

Visual Studio Code also has some images, meant specifically for development environments. The base image supports a lot of programming languages and tools so don't feel like you need to research images just yet. The basic image will do.

  1. Make sure Docker is running on your computer.
  2. Make sure you have the Visual Studio Code's Dev Containers extension added.
  3. Pick a Dev containers quickstart in the Visual Studio Docs.
  4. My favorite is the Open an existing folder in a container.
  5. The steps ask you questions and you answer. It's straightforward and to the point. No cumbersome CLI with many switches to understand.
  6. It picks an image for you and asks you if you need anything else. So early in the project, you don't. You can always come back later and check a box if you need something.

Where's the AI?

This is pretty routine for me - I've been in Dockerland for many years so I haven't necessarily asked Copilot for help. However, you can always ask Copilot for advice or next steps. Make sure you have any relevant code in the main Visual Studio editor so Copilot can find the best answer.

Prompt: I need to set up a Todo microservice. What Visual Studio Code dev container would you suggest?

GitHub Copilot: For a JavaScript project using npm and React, I would suggest using a Node.js based dev container. Visual Studio Code has a predefined Node.js & TypeScript dev container that could be a good starting point.

Here's a basic setup:

  1. Install the "Remote - Containers" extension in Visual Studio Code.
  2. Open your project folder in VS Code.
  3. Press F1 to open the command palette and run the "Remote-Containers: Add Development Container Configuration Files..." command.
  4. Choose "Node.js & TypeScript" from the list.

My perspective: This isn't the entire response. Copilot immediately went into dense Docker file syntax. Visual Studio Code does a great job of keeping you out of that.

My dev container setup for Node.js and TypeScript

Here is how I set up my dev container:

  1. I brought the dfberry/cloud-native-todo repository down to my local computer. You should fork, then clone it if you want to do the same.
  2. Open the local repository in Visual Studio Code.
  3. Docker is running and the Visual Studio Code's Dev Containers extension is installed.
  4. Reopen the folder in a New Dev Container.
  5. That pops up a list of container configuration templates.
  6. I selected the one which uses my language of choice: Node.js & TypeScript and an appropriate version (18). If you choose a different template, your questions may be different.
  7. Visual Studio Code reopened with a new terminal prompt, which indicated I was in a dev container environment.
  8. I checked the Node.js version just to make sure: node --version and got back v18.18.0. My dev container works and all I did was answer questions. For a newbie to Docker and containerization, this is a huge time saver. You will eventually know Docker, but you don't need to know it on day 1.

Where is the new dev container?

Visual Studio Code added a file to the repository found in the .devcontainer folder, named devcontainer.json. Open it up.

You'll notice some settings are commented out; you'll eventually use them, but not just yet.

For now, the most important parts are the image which is equivalent to your operating system and its additional installations, and the customizations property which is probably empty.

Since I know I want to use Copilot as my pair programmer, I need to add that to the environment. In the Visual Studio extensions, search for GitHub Copilot Chat and select Install in Dev Container. This adds the appropriate setting to the customizations property.

One final setting I use a bit is the postCreateCommand. I usually add some npm packages for developer experience here but for now I just want to make sure the correct versions of the tools are in the environment.

At this point, my devcontainer.json looks like:

// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/typescript-node
{
  "name": "Node.js & TypeScript",
  // Or use a Dockerfile or Docker Compose file. More info: https://containers.dev/guide/dockerfile
  "image": "mcr.microsoft.com/devcontainers/typescript-node:1-18-bookworm",

  // Features to add to the dev container. More info: https://containers.dev/features.
  "features": {
  },

  // Use 'forwardPorts' to make a list of ports inside the container available locally.
  // "forwardPorts": [],

  // Use 'postCreateCommand' to run commands after the container is created.
  "postCreateCommand": "node -v && npm -v && git -v",

  // Configure tool-specific properties.
  "customizations": {
    "vscode": {
      "extensions": [
        "GitHub.copilot-chat"
      ]
    }
  },

  // Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
  "remoteUser": "root"
}

Check the dev container into source control

Now that you have a working dev container, check it into source control and ask everyone on the team to try it out. This simple container should work on all hosts (host is the term for the machine running the container). More complicated dev containers may not work the first time and may take some tinkering.

Where's the Cloud-Native?

You may be wondering why there isn't any cloud-native code yet. Don't worry, I'll get to it. Setting up a dev environment that works for every member of your team paves the way to a smooth development experience. Take the time to make sure the dev container is configured correctly and works for everyone.

Sample Code Repository

Questions?

This is how I approach a consistent developer experience across a team. How do you do it? Let me know. Survey