A Fast and Easy Docker Tutorial for Beginners (Video Series)


Introduction

Docker has taken the software development world by storm in recent years. But what exactly is Docker, and why should you care about it as a developer?

In a nutshell, Docker is a tool that lets you package your application along with all of its dependencies into a standardized unit called a container. Containers isolate software from its environment to ensure that it works consistently on any infrastructure.

This solves a common problem in software development known as the "it works on my machine" phenomenon. By bundling everything your app needs to run into a container, you can easily share it with others and deploy it to different environments without worrying about compatibility issues.

Containers are also lightweight and start up much faster than traditional virtual machines. This makes it easy to build, test, and deploy applications more quickly and efficiently.

In this tutorial series, we'll walk through the process of getting started with Docker in a beginner-friendly way. By the end, you'll have a solid grasp of key Docker concepts and be able to containerize a simple application yourself. Let's dive in!

Key Docker Concepts

Before we start working with Docker directly, let's take a moment to define some of its core building blocks:

Images: An image is a read-only template that contains the instructions for creating a Docker container. It's like a snapshot of a container at a point in time. Images are built from instructions written in a file called a Dockerfile.

Containers: A container is a runnable instance of a Docker image. You can think of a container as a lightweight, stand-alone executable package that includes everything needed to run your application (code, libraries, configuration files, etc).

Dockerfile: A Dockerfile is a text document that contains the commands used to assemble an image. It specifies what goes on in the environment inside your container, such as the OS, libraries, and code to install and run.
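To make the image/container relationship concrete, here is a tiny illustration using the public nginx image from Docker Hub (the container names web1 and web2 are just examples, and we'll cover these commands properly in the sections below):

# download the nginx image (a read-only template)
docker pull nginx

# start two independent containers from that single image
docker run -d --name web1 nginx
docker run -d --name web2 nginx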

Now that we've covered the key terminology, let's get our hands dirty with some actual code!

Getting Started with Docker

The first step is to install Docker on your local machine. The process differs slightly depending on your operating system.

Installing Docker on macOS

  1. Download the Docker for Mac installer from the official Docker website.
  2. Double-click the .dmg file to open it and drag the Docker icon to your Applications folder.
  3. Double-click Docker.app in the Applications folder to start Docker.
  4. When prompted, authorize the installation with your system password.
  5. Once Docker is up and running, you'll see the whale icon in your menu bar.

Installing Docker on Windows

  1. Download the Docker for Windows installer from the official Docker website.
  2. Double-click the .exe file to run the installer.
  3. Follow the installation wizard to accept the license, authorize the installer, and proceed with the install.
  4. Once installation is complete, Docker will start automatically. The whale icon in the system tray (notification area) indicates that Docker is running.

Installing Docker on Linux

Installing Docker on Linux varies depending on the distribution you're using. You can find detailed instructions for your specific distro in the Docker documentation. But in general, the process involves the steps below (a command-line sketch follows the list):

  1. Setting up the Docker repository
  2. Installing the Docker Engine
  3. Starting the Docker daemon
  4. Verifying that Docker is installed correctly by running a test image
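As a rough sketch, on many Debian- and Ubuntu-based systems those steps can be carried out with Docker's official convenience script (always check the Docker docs for the method recommended for your distribution):

# download and run Docker's official install script
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# start the Docker daemon if it isn't already running (systemd-based distros)
sudo systemctl start docker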

After installing Docker, let's confirm it's working by running the hello-world image:

docker run hello-world

You should see a message that starts with "Hello from Docker!" This indicates that your installation is working correctly.

Creating a Simple Docker Image

Now for the fun part: let's containerize an actual application! We'll create a simple Node.js app and package it into a Docker image.

First, create a new directory for your app and navigate into it:

mkdir my-app
cd my-app

Next, create a package.json file that describes your app and its dependencies:

{
  "name": "my-app",
  "version": "1.0.0",
  "description": "My first Docker app",
  "author": "Your Name <[email protected]>",
  "dependencies": {
    "express": "^4.16.1"
  }
}

Then create a file named app.js with the following content:

const express = require('express');

const app = express();

app.get('/', (req, res) => {
  res.send('Hello World!');
});

app.listen(8080, () => {
  console.log('Listening on port 8080');
});

This is a minimal Express web server that responds with "Hello World!" when you access the root route.
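If you happen to have Node.js installed locally, you can optionally run the app outside of Docker first to confirm it works; this step isn't required, since the container will install its own dependencies:

npm install
node app.js

Press Ctrl+C to stop the server before moving on.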

Now we need to define our Docker image using a Dockerfile. Create a file named Dockerfile in your app directory with the following:

FROM node:14

WORKDIR /app

COPY package.json .
RUN npm install

COPY . .

EXPOSE 8080
CMD [ "node", "app.js" ]

Let's break this down line by line:

  • FROM specifies the base image to start from, in this case the official Node 14 image.
  • WORKDIR sets the working directory for subsequent commands.
  • COPY copies files from your local machine into the image (see the .dockerignore note just after this list).
  • RUN executes commands inside the image, in this case installing your app's dependencies.
  • EXPOSE documents the port that the container listens on at runtime.
  • CMD specifies the default command to run when starting a container from this image.
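One small companion to this Dockerfile is worth adding. Because COPY . . copies everything in the directory into the image, a node_modules folder installed on your machine would be copied too and could overwrite the one npm install just created inside the image. A minimal .dockerignore sketch that prevents this:

# .dockerignore - keep these out of the build context
node_modules
npm-debug.log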

With your Dockerfile ready, you can now build your Docker image:

docker build -t my-app .

This command builds an image tagged as "my-app" using the Dockerfile in the current directory. The resulting image will have Node.js and your application code.
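You can confirm that the build succeeded by listing your local images, filtered by repository name:

docker images my-app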

Finally, you can run a container from your newly minted image:

docker run -p 8080:8080 my-app

The -p flag maps port 8080 in the container to port 8080 on your machine. Visit http://localhost:8080 in your browser and you should see the "Hello World!" message served up by your containerized Node.js app!
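As an optional variation, you can run the same container in the background and give it a name, which makes it easier to manage with the commands covered in the next section (the name my-app-container is just an illustrative choice):

# run detached (-d) and give the container a friendly name
docker run -d -p 8080:8080 --name my-app-container my-app

# stop it by name when you are done
docker stop my-app-container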

Useful Docker Commands

Here are some handy Docker commands to manage your images and containers:

List all images:

docker images

List all running containers:

docker ps

List all containers (including stopped ones):

docker ps -a 

Stop a running container:

docker stop <container id>

Remove a container:

docker rm <container id>

Remove an image:

docker rmi <image id>
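If you want to clear out many stopped containers or unused images in one go, Docker also ships bulk cleanup commands. They are convenient, but use them with care, since they remove anything Docker considers unused:

# remove all stopped containers
docker container prune

# remove stopped containers, dangling images, unused networks, and build cache
docker system prune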

Debugging Broken Builds

Inevitably, you'll encounter broken builds when working with Docker. A useful debugging tactic is to run an interactive shell inside your broken image to poke around.

For example, if your docker build fails or the resulting image misbehaves, you can start an interactive shell in the image (if the build itself failed, use the ID of the last image that built successfully) with:

docker run -it <image id> /bin/bash

This will drop you into a bash prompt inside your image. From here you can explore the filesystem and manually run commands to identify the issue.
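Note that not every image includes bash; minimal images (alpine-based ones, for example) usually ship only /bin/sh, so if the command above fails, fall back to:

docker run -it <image id> sh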

Introduction to Docker Compose

While running a single container is fine for development, real-world applications often involve multiple interacting services, each running in its own container. Docker Compose is a tool for defining and running these multi-container applications.

With Compose, you use a YAML file to define your app's services along with their configuration and dependencies. Then, with a single command, you create and start all the services from your configuration.

Here's an example docker-compose.yml file that defines a web service (built from the Dockerfile we just wrote) and a database service:

version: "3.9"
services:
  web:
    build: .
    ports:
      - "5000:5000"
  db:
    image: "postgres:14.1"
    ports:
      - "5432:5432"
    environment:
      POSTGRES_PASSWORD: example

To start this app, simply navigate to the directory containing your docker-compose.yml and run:

docker compose up

Compose will pull the necessary images, create containers for each service, and start them up, respecting any startup ordering you declare (for example with depends_on).
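When you are finished, the matching teardown command stops and removes the containers along with the default network Compose created:

docker compose down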

Conclusion

We've covered a lot of ground in this tutorial series! To recap, we:

  • Got a high-level overview of what Docker is and why it's useful
  • Learned key Docker concepts like images, containers, and Dockerfiles
  • Installed Docker on different operating systems
  • Packaged a simple Node.js app into a Docker image
  • Ran containers from our custom image
  • Explored commands for managing images and containers
  • Introduced Docker Compose for multi-container apps

If you've followed along, congrats! You now have a solid foundation in Docker fundamentals. But there's still much more to learn. To continue your Docker journey, I recommend diving deeper into:

  • Deploying your Docker app to the cloud
  • Using Docker volumes for persistent data storage
  • Setting up continuous integration/deployment pipelines with Docker
  • Orchestrating containers at scale with Kubernetes

The official Docker documentation is a great resource for all of these topics and more. Happy containerizing!
