Creating a Productive Local Development Environment with Kubernetes
Tools, practices, and configuration for creating an effective local development loop when building and deploying apps to Kubernetes.
Ask any developer what their top priority is when working with a new application or technology stack and they will point to creating an effective local development environment. In practice this means installing all of the tools they need to be productive. The goal is always to establish a fast development feedback loop that is as production-like as possible.
The goal remains the same when working with cloud native technologies, but adopting containers and Kubernetes means there are a few more tools to install and configurations to tweak.
This guide applies both to a developer simply experimenting with Kubernetes and to a new engineer onboarding onto a team that deploys to Kubernetes. The quicker a developer can get their local development environment configured, the quicker they can ship code to production. The gold standard is to ship code on the first day.
From Local Dev to Remote Deployment: Avoiding Cloud Complexity
Before cloud native architecture became the dominant approach to designing, deploying, and releasing software, the local development story was much simpler.
Typically a developer would install the language runtime on their machine, download the application source code, and build and run the (often monolithic) application locally via their favourite IDE.
As applications and the underlying frameworks increased in complexity, the start time of an app in development increased. This often resulted in a slow coding feedback loop. This led to many web frameworks, IDEs, or custom tools enabling “hot reloading”. This capability allows code changes to be quickly visible (and testable) via the locally running application, without the need for a redeployment or restart.
The rise in popularity of containers and Kubernetes has introduced more layers into a typical tech stack. There are clear advantages to this, such as isolation and fault tolerance, but it has also meant that the local development setup has increased in complexity.
| | Pre-Cloud Native | Cloud Native |
|---|---|---|
| Number of Services | 1 (or a small number) | Many |
| Local Infra Required | Potentially a VM (controlled via Vagrant etc.) | Docker, Kubernetes, VM |
| Rebuild and Deploy via | IDE | Compile, Docker build, kubectl apply etc. |
| Hot Reload | Included in app framework | Not available out of the box |
| Integration Testing | External services via mocks, sandboxes, etc. | Internal and external services via mocks |
| Connecting to Remote Test Environment | SSH | kubectl --context, kubectl -n |
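The extra "Docker build" step in the rebuild-and-deploy loop typically centres on a Dockerfile. As a minimal sketch, assuming a Go service (the language, binary name, and paths here are illustrative, not from a specific project), a multi-stage build might look like:

```dockerfile
# Build stage: compile inside a container so the local toolchain doesn't matter
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /bin/my-service ./cmd/my-service

# Runtime stage: a small image keeps rebuilds and pushes fast
FROM gcr.io/distroless/static
COPY --from=build /bin/my-service /my-service
ENTRYPOINT ["/my-service"]
```

Each manual iteration of this loop (compile, `docker build`, `kubectl apply`) is exactly the friction the tooling discussed below aims to remove.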
Supercharging Your Local Kubernetes Development Environment
Being able to effectively configure a local development environment for services deployed in Kubernetes is not dependent on a single tool or technique. A combination of approaches is required:
- Container Build Tools
- Hot Reload
- K8s-Aware Command Line
- Kubernetes Dashboard
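As a sketch of how these approaches can combine, a tool such as Skaffold watches source files, rebuilds the container image, and re-applies the manifests on every change, approximating the hot reload loop developers are used to. The image name, manifest path, and schema version below are illustrative placeholders:

```yaml
# skaffold.yaml (illustrative sketch; adjust the apiVersion and paths for your setup)
apiVersion: skaffold/v2beta29
kind: Config
build:
  artifacts:
    - image: my-service        # rebuilt via the local Docker daemon on each change
deploy:
  kubectl:
    manifests:
      - k8s/deployment.yaml    # re-applied automatically after each rebuild
```

Running `skaffold dev` then keeps the cluster in sync with local edits, collapsing the compile, Docker build, and `kubectl apply` steps into a single watched loop.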
[Podcast] Livin’ on the Edge #13: Dave Sudia on Kubernetes Local Development, Building a PaaS, and Platform Personas
[Podcast] Livin’ on the Edge #14: Katie Gamanji on Kubernetes Tooling DX, GitOps, and the Cluster API
[Podcast] Livin’ on the Edge #10: Sam Newman on Microservice Ownership, Local Development, and Release Trains
Frequently Asked Questions
- Do I need to use a new IDE when developing applications for Kubernetes?
- What’s the difference between developing applications for Docker and developing applications for Kubernetes?
- What’s the best practice for testing Kubernetes-based applications locally?
- How can I develop and test my application when I can’t run all of my applications in a local Kubernetes cluster (minikube, k3s, kind etc)?