Please, Everyone, Put Your Entire Development Environment in GitHub

As a seasoned full-stack developer, I've spent countless hours of my career setting up and tweaking development environments. It's a task that's as inevitable as it is frustrating – a veritable rite of passage for anyone who's ever tried to run a "hello world" app. From wrangling with conflicting versions of Python libraries to debugging obscure OS-level issues, the struggle to achieve a consistent, reproducible environment has been a constant thorn in the side of developers everywhere.

But it doesn't have to be this way. With the advent of containerization tools like Docker and smart IDE integrations, we now have the power to banish "works on my machine" syndrome for good. By committing our entire development environment to source control, we can create a seamless setup experience for every contributor, on any machine. In this article, I'll show you how to leverage the VS Code Dev Containers extension to do just that, and make configuration hell a thing of the past.

The Cost of Configuration

Before we dive into the solution, let's take a moment to quantify the problem. Just how much time do developers waste on environment setup issues? The answer might surprise you.

In a 2018 survey by the Harris Poll and Electric Cloud, a staggering 81% of developers reported spending up to a quarter of their time troubleshooting environment-related issues. At the top end, that's the equivalent of one full work day every week lost to configuration overhead. Extrapolate that across an entire engineering org, and the productivity costs are eye-watering.

Chart showing 81% of developers spend up to 25% of their time on environment issues
Source: https://www.electric-cloud.com/company/news/press-releases/new-report-reveals-software-delivery-challenges-shackle-enterprise-innovation/

And it's not just the initial setup that eats up valuable time. Every time a new developer joins a project, they have to go through the same manual process of installing dependencies, configuring tools, and praying that nothing breaks along the way. This creates a significant barrier to entry and slows down onboarding considerably.

The Rise of Docker

One of the key technologies enabling a solution to this problem is Docker. Since its initial release in 2013, Docker has quickly become the industry standard for containerization, and for good reason. By packaging an application and all its dependencies into a single, portable unit, Docker allows for consistent execution across any machine or cloud platform that can run its engine.

The adoption of Docker has been nothing short of explosive. According to Datadog, the percentage of their customers running Docker rose from 20% in 2015 to over 50% in 2020. And in the 2020 Stack Overflow developer survey, Docker ranked as the #2 most loved platform (behind only Linux) and the #1 most wanted.

Graph showing rising Docker adoption
Source: https://www.datadoghq.com/docker-adoption/

But Docker's benefits extend far beyond just production deployments. By running your development tools inside a container, you can encapsulate your entire environment – from the OS-level dependencies up to your IDE extensions – into a single, version-controlled artifact. No more "it works on my machine" excuses. No more hours wasted on setup. Just a reproducible, isolated environment that can be spun up anywhere, by anyone.

Enter Dev Containers

While running development tools inside containers is a powerful concept, the ergonomics haven't always been great. Editing files through a clunky web-based IDE or a terminal-based editor like Vim is a tough sell for developers used to the creature comforts of a modern coding environment.

This is where VS Code Dev Containers come in. By leveraging the remote development capabilities of VS Code, the Dev Containers extension allows you to use a Docker container as a full-fledged development environment, complete with all your favorite themes, extensions, and keyboard shortcuts. You get the isolation benefits of a container with the user experience of a local setup.

Here's how it works under the hood:

  1. When you open a project with a Dev Container configuration (specified by a devcontainer.json file), VS Code prompts you to reopen the project inside the container.

  2. VS Code builds the container image, installs an instance of VS Code Server inside it, and connects the container environment back to your local VS Code UI.

  3. All the files in your project are mounted into the container, and any terminal sessions or debuggers are spawned in the context of the containerized environment.

  4. You get the full VS Code editing experience, but with the added benefits of an isolated, reproducible environment defined by your project's Dockerfile and devcontainer.json.

It's a truly magical experience that has to be seen to be believed. With a properly configured Dev Container, a new contributor can go from zero to a fully-functional development environment in the time it takes to clone the repo and build the container image.
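The steps above are, conceptually, close to what you could do by hand with the Docker CLI. The following is purely an illustrative sketch, not the commands the extension actually runs, and the image and container names are made up:

```
# 1. Build the image described by the project's Dev Container Dockerfile
docker build -t my-dev-env -f .devcontainer/Dockerfile .devcontainer

# 2-3. Start it with the project bind-mounted; VS Code additionally
#      installs its server inside and connects the UI to it
docker run -d --name my-dev-env -v "$PWD":/workspace my-dev-env sleep infinity

# 4. Terminals and debuggers then execute in the container's context
docker exec -it my-dev-env bash
```

The value of the extension is that it automates all of this and wires the result directly into the editor UI.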

Anatomy of a Dev Container

So what goes into a Dev Container configuration? At the most basic level, all you need is a Dockerfile that describes your development environment, and a devcontainer.json that tells VS Code how to build and connect to the container.

Here's a minimal example:

.devcontainer/Dockerfile

FROM node:14

RUN apt-get update && apt-get install -y vim

ENV MY_ENV_VAR="hello from the container!"

WORKDIR /workspace

.devcontainer/devcontainer.json

{
  "name": "My Node.js Project",
  "dockerFile": "Dockerfile",
  "extensions": [
    "dbaeumer.vscode-eslint",
    "esbenp.prettier-vscode"
  ],
  "forwardPorts": [3000],
  "postCreateCommand": "npm install"
}

In this example, we're using an official Node.js 14 image as our base, installing Vim for good measure, setting an environment variable, and specifying the working directory. The devcontainer.json references our Dockerfile, lists some extensions to install inside the container, forwards port 3000 for web server access, and runs npm install after the container is created.

But this is just the tip of the iceberg. You can use the full power of Docker to customize your environment to your heart's content. Install any Linux packages or tools you need, set up databases and other services, and even use Docker Compose to spin up multi-container environments. The devcontainer.json also supports a wide range of settings to tweak VS Code's behavior inside the container.
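For projects that need supporting services, a Compose-based setup might look like this. This is a minimal sketch: the service names, images, and credentials are illustrative placeholders, but the dockerComposeFile, service, and workspaceFolder properties are how devcontainer.json references a Compose file instead of a single Dockerfile.

.devcontainer/docker-compose.yml

```yaml
version: "3.8"
services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
    volumes:
      # Bind-mount the project root so edits on the host appear in the container
      - ..:/workspace:cached
    # Keep the container alive so VS Code can attach to it
    command: sleep infinity
  db:
    image: postgres:12
    environment:
      POSTGRES_PASSWORD: postgres
```

.devcontainer/devcontainer.json

```json
{
  "name": "My Project (Compose)",
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "shutdownAction": "stopCompose"
}
```

With this in place, opening the project starts both containers: VS Code attaches to the app service while the db service runs alongside it.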

Here's a more advanced example that illustrates some of these capabilities:

.devcontainer/Dockerfile

FROM mcr.microsoft.com/vscode/devcontainers/python:3.8

RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
    && apt-get -y install --no-install-recommends postgresql-client \
    && rm -rf /var/lib/apt/lists/*

# Bake the dependencies into the image; the project source itself doesn't
# need to be copied in, because VS Code bind-mounts the workspace folder.
COPY requirements.txt /tmp/
RUN pip install -r /tmp/requirements.txt

.devcontainer/devcontainer.json

{
  "name": "My Python Project",
  "dockerFile": "Dockerfile",
  "context": "..",
  "forwardPorts": [5000],
  "extensions": [
    "ms-python.python"
  ],
  "settings": {
    "terminal.integrated.shell.linux": "/bin/bash",
    "python.pythonPath": "/usr/local/bin/python",
    "python.formatting.provider": "black",
    "python.linting.enabled": true,
    "python.linting.pylintEnabled": true,
    "editor.formatOnSave": true
  },
  "postCreateCommand": "pip install -r requirements.txt",
  "remoteUser": "vscode"
}

In this example, we're using one of the pre-built Python Dev Container images provided by Microsoft as a starting point. We then install the PostgreSQL CLI tools, copy our requirements file into the container, and install our Python dependencies.

In the devcontainer.json, we're specifying some custom VS Code settings to use inside the container, such as the Python formatter and linter. We're also running the pip install step as a postCreateCommand to ensure our dependencies are always in sync, and we're specifying that the container should run as a non-root vscode user for added security.

The beauty of this setup is that it's entirely portable. Anyone who opens this project in VS Code with the Dev Containers extension installed will get the exact same environment, down to the Python version, PostgreSQL tools, and VS Code settings. No more setup discrepancies, no more "works on my machine" bugs.

Best Practices for Dev Containers

While the flexibility of Dev Containers is powerful, there are some best practices to keep in mind to ensure a smooth experience for your team:

  1. Keep your images small. Use minimal base images and only install the dependencies you actually need for development. Larger images mean longer build times and slower container starts.

  2. Use multi-stage builds. If your project requires building artifacts like compiled binaries, use a multi-stage Dockerfile to keep your final development image small and focused.

  3. Don't ignore your .gitignore. The .devcontainer folder itself belongs in source control (that's the whole point), but make sure sensitive files like SSH keys, tokens, and local .env files are listed in .gitignore so they never get committed.

  4. Use docker-compose for complex setups. If your development environment requires multiple services (like a database and a cache), use a docker-compose.yml file to orchestrate them and reference it in your devcontainer.json.

  5. Leverage pre-built images. Microsoft maintains a collection of pre-configured Dev Container images for popular stacks like Node.js, Python, and .NET. Use these as a starting point to avoid reinventing the wheel.

  6. Document your setup. Even with a Dev Container, it's still a good idea to include a README that explains any project-specific setup steps or configuration that's not captured in the Dockerfile/devcontainer.json.

  7. Keep your team in sync. Encourage everyone to use the Dev Container setup and to commit any environment changes back to the shared configuration. Avoid one-off local tweaks that can lead to discrepancies.
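To make practice #2 concrete, here is a minimal multi-stage sketch, assuming a hypothetical Node.js project (the file names and stage names are illustrative): the first stage runs the heavy dependency install with the full toolchain available, and the final development image inherits only the results.

```dockerfile
# Stage 1: install dependencies (including any native-module compilation)
FROM node:14 AS deps
WORKDIR /src
COPY package.json package-lock.json ./
RUN npm ci

# Stage 2: the actual dev image, which carries the installed modules
# but none of the intermediate build cache from stage 1
FROM node:14-slim
WORKDIR /workspace
COPY --from=deps /src/node_modules ./node_modules
```

The resulting image stays small, which directly pays off in build times and container start-up speed (practice #1).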

The Future is Containerized

As powerful as local Dev Containers are, the real magic happens when you combine them with cloud-based development environments. Platforms like GitHub Codespaces and Gitpod allow you to spin up full-fledged, containerized dev environments right from your browser, complete with a web-based version of VS Code.

This means you can have a completely ephemeral, zero-install development experience that's accessible from anywhere. Imagine being able to do serious coding from a Chromebook or an iPad, without any local setup whatsoever. Or being able to review a pull request and test the changes without having to clone the code locally. That's the power of cloud-based Dev Containers.

While local Dev Containers are still useful for situations where you need full control over your environment or you're working offline, the future of development is undoubtedly heading in a more cloud-oriented, containerized direction. By embracing Dev Containers today, you're not just solving immediate productivity problems – you're also future-proofing your workflow for the era of cloud-based development.

A Call to Containerize

To my fellow developers, I issue this challenge: Make your next project fully containerized from day one. Commit a Dockerfile and devcontainer.json to source control, and encourage your team members to use it for their local development.
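If you want to take up that challenge on an existing project, a starter configuration takes about a minute to scaffold. The base image, project name, and post-create command below are placeholders; swap in whatever your stack needs:

```shell
# Create a minimal .devcontainer setup in the current project.
mkdir -p .devcontainer

# A placeholder Dockerfile: replace node:14 with your stack's base image.
cat > .devcontainer/Dockerfile <<'EOF'
FROM node:14
WORKDIR /workspace
EOF

# A placeholder devcontainer.json pointing at that Dockerfile.
cat > .devcontainer/devcontainer.json <<'EOF'
{
  "name": "My Project",
  "dockerFile": "Dockerfile",
  "postCreateCommand": "npm install"
}
EOF

echo "Dev Container scaffold created."
```

Commit both files, and anyone who opens the project with the Dev Containers extension installed will be prompted to reopen it inside the container.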

To project maintainers and open source authors, I implore you: Include a Dev Container configuration in your repos so contributors can be up and running with minimal friction. Make "how to set up the dev environment" a thing of the past.

The benefits are immense. Faster onboarding, less time wasted on setup, more consistent environments, and easier collaboration. Plus, you're setting yourself up for a seamless transition to cloud-based development in the future.

Containerizing your development environment is not just a best practice – it's quickly becoming a necessity in the modern landscape of software development. Don't get left behind in the age of "works on my machine" excuses. Embrace the power of Dev Containers and take control of your development experience.

Trust me, your future self (and your teammates) will thank you. Happy containerizing!
