How to Easily Run Any Linux Tool on Any Machine with Docker

As developers and IT professionals, we've all been there before. You're happily coding away on your preferred operating system, when suddenly you realize you need a specific Linux tool to get the job done. Maybe it's a newer version of PHP than what's installed on your Linux workstation. Or perhaps you're on a MacBook and need to run sqlmap from Kali Linux. You might even be a Windows user who needs to spin up a quick NGINX web server.

In the past, this would mean firing up a virtual machine, dual booting Linux, or undergoing a lengthy install process to get the tool you need. But there's now a much simpler solution: Docker.

With Docker, you can easily run just about any Linux command line tool in an isolated container on any operating system. No need to disturb your main working environment or spend hours configuring a VM. Docker makes it trivial to run that crusty old Python 2.7 script or experiment with the latest and greatest build tools, all without cluttering up your system.

In this guide, I'll show you how to unlock the power of Docker to run Linux binaries anywhere. You'll learn how to install Docker on the OS of your choice, master key concepts like images and containers, and walk through several real-world examples. Let's get started!

Installing Docker

The first step is to install Docker on your development machine. The process is a bit different depending on your operating system:

Windows 10 and macOS:

  • Download the appropriate "Docker Desktop" installer for Windows or Mac
  • Run the installer and follow the prompts
  • Once complete, you'll see the Docker "whale" icon in your system tray

Linux:
The install steps vary depending on your Linux distribution. Docker provides detailed installation guides for each major distribution in its official documentation.

In general, the Linux install process involves adding the Docker repository, installing the Docker engine, and starting the Docker daemon.
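As a rough sketch (assuming a recent, systemd-based distribution), Docker's convenience script handles the repository setup and package install in one step:

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Start the daemon now and enable it at boot
sudo systemctl enable --now docker

# Optionally, let your user run docker without sudo (log out and back in afterwards)
sudo usermod -aG docker $USER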

Once installation is complete, open up a terminal and run:

docker --version

If everything is working, you'll see the Docker version number displayed.
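For a fuller smoke test, you can run the tiny hello-world image, which exercises the whole pull-and-run pipeline in one command:

docker run hello-world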

Docker Images and Containers

Before we dive into some examples, let's briefly cover some key Docker concepts.

An image is an immutable file that contains the code, libraries, and dependencies needed to run an application. You can think of an image like a snapshot of a preconfigured Linux system with a specific tool installed. Docker images are created using a special file format called a Dockerfile.
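We won't need to write a Dockerfile in this guide, since we'll be pulling ready-made images, but to give you a feel for the format, here's a minimal, purely illustrative example that packages a single PHP script (the file and image names are just placeholders for this sketch):

FROM php:7.4-cli
COPY code.php /app/code.php
CMD ["php", "/app/code.php"]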

Images are stored in registries like Docker Hub, which hosts official images and allows the community to share their own. When you execute a docker run command, Docker will automatically download the image if it's not already cached locally.

A running instance of an image is called a container. Containers are isolated from the host system: they share the host's kernel but run in their own sandboxed environment. They can be started, stopped, and restarted independently of other containers and won't interfere with your main OS.
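A handful of commands cover the basic container lifecycle; we'll use several of them in the examples below (here <name> is a placeholder for a container name or ID):

docker ps             # list running containers
docker ps -a          # include stopped containers
docker stop <name>    # stop a running container
docker start <name>   # start it again
docker rm <name>      # delete a stopped container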

Example 1: Run PHP 7 Code on a PHP 5 Machine

Let's walk through our first example. Suppose you're on a Linux workstation with PHP 5 installed system-wide. You receive some code from a client that will only run on PHP 7 because it uses new language features. Rather than go through the hassle of upgrading your main PHP install, we can use Docker to execute the script.

First, create a file called code.php with the following:

<?php
echo 1 <=> 2;  

This uses PHP 7's spaceship operator, which doesn't exist in PHP 5.

Next, run this command in the same directory as the script:

docker run -it --rm -v $(pwd):/app php:7.4-cli php /app/code.php

Let's break this down:

  • docker run is the command to run a new container from an image
  • -it keeps STDIN open and allocates a pseudo-TTY, allowing us to interact with the container
  • --rm automatically removes the container when it exits
  • -v $(pwd):/app mounts the current directory on the host to the /app directory inside the container
  • php:7.4-cli is the name of the official PHP 7.4 command-line image on Docker Hub
  • php /app/code.php is the command we want to run inside the container

You should see -1 printed, since the spaceship operator returns -1 when the left operand is less than the right. We've just run a PHP 7 script without touching our default PHP 5 installation!
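If you want to convince yourself that the image tag really controls the PHP version, point the same command at an older image such as php:5.6-cli; the script should now fail with a parse error instead of printing -1:

docker run -it --rm -v $(pwd):/app php:5.6-cli php /app/code.php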

Example 2: Run sqlmap on macOS

For our second example, imagine you're a Mac user who wants to run the popular sqlmap penetration testing tool, which is included in Kali Linux. While you could run Kali in a virtual machine, with Docker we can use sqlmap with a single command.

Since sqlmap doesn't have an official Docker image, we first need to search Docker Hub for a community-contributed one:

docker search sqlmap

This will display a list of available images. The one we want is paoloo/sqlmap, which has over 1000 pulls.
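You can optionally pull the image ahead of time, so the first scan doesn't pause for the download:

docker pull paoloo/sqlmap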

To run it:

docker run --rm -it paoloo/sqlmap -u "http://example.com/?id=1"

Notice how all the arguments after the image name get passed directly to sqlmap, just as if it were installed locally. The -u flag tells sqlmap to test the given URL for SQL injection vulnerabilities. (The URL is quoted so the shell doesn't try to interpret the ? character itself.)

However, there's one problem. By default, sqlmap stores its scan logs in a hidden directory at ~/.sqlmap. When the Docker container exits, that data will be lost. To persist the logs, we need to mount a directory from the host into the running container:

docker run --rm -it -v ~/.sqlmap:/root/.sqlmap paoloo/sqlmap -u "http://example.com/?id=1"

Now ~/.sqlmap on the host will contain sqlmap's log files, allowing us to retain the data between runs.
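A quick way to confirm the mount is working is to list that directory on the host after a scan; you should see the files sqlmap wrote from inside the container:

ls ~/.sqlmap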

Example 3: Serve Files with NGINX on Windows

For the next example, let's say you're on a Windows machine and need to quickly share some files over HTTP using NGINX. With Docker, we can have a web server up and running in seconds.

First, create a directory to hold your files:

mkdir my-website

The official NGINX image is conveniently named nginx on Docker Hub. We'll use an :alpine tag, which is a minimal variant of the image based on Alpine Linux. This keeps the download small.
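If you're curious how much the Alpine variant actually saves, once an image has been pulled you can compare sizes with docker images (the exact numbers vary by tag):

docker images nginx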

To start the web server:

docker run --name my-nginx ^
  -d -p 8080:80 ^
  -v C:\path\to\my-website:/usr/share/nginx/html:ro ^
  nginx:1.19-alpine

(The ^ at the end of each line is the Command Prompt line-continuation character; in PowerShell, use a backtick instead.)

A few new flags here:

  • --name my-nginx assigns a friendly name to the container for easy reference
  • -d runs the container in detached mode in the background
  • -p 8080:80 maps port 8080 on the host to port 80 in the container, so we can access the web server at http://localhost:8080
  • -v C:\path\to\my-website:/usr/share/nginx/html:ro mounts our files as a read-only volume at NGINX's default web root

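Before opening the browser, you can confirm the container is up and watch NGINX's access log, which the official image sends to the container's standard output:

docker ps
docker logs my-nginx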
That's it! You now have a fully functional web server hosting your files. To stop it:

docker stop my-nginx

To start it again:

docker start my-nginx
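Because we didn't use --rm this time, the stopped container stays on disk until you remove it yourself:

docker rm my-nginx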

Example 4: Run a Node.js App Without Installing Node

Finally, let's use Docker to run a Node.js app without actually installing Node on our system.

Suppose we have a simple app.js file:

const http = require('http');

const port = process.env.PORT || 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello World');
});

server.listen(port, () => {
  console.log(`Server running on port ${port}`);
});

To run this with Docker:

docker run -it --rm \
  -v $(pwd):/app \
  -w /app \
  -p 3000:3000 \
  node:14-alpine \
  node app.js

This mounts our current directory as a volume into the /app directory in the container, sets the working directory with -w, maps port 3000, and runs Node 14. We can now access the app at http://localhost:3000.
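From a second terminal (or your browser), a quick request should come back with the greeting:

curl http://localhost:3000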

Conclusion

As you can see, Docker makes it incredibly easy to run Linux tools and apps on any platform without mucking up your main working environment. With a single docker run command, you can execute just about any Linux binary in an isolated, reproducible way.

Some key things to remember:

  • Use :alpine tagged images when possible to keep container sizes small
  • Mount directories with -v to persist data between container runs
  • Map ports with -p to access network services running in containers
  • Leverage Docker Hub's official images and the broader community to find prebuilt images for popular tools

I hope this guide has helped you unlock the power of Docker for running Linux tools on demand. The applications are endless, from quickly testing a new programming language to running one-off security scans. And best of all, when you're done, you can delete the container and reclaim the disk space, leaving your system exactly as you found it.
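When you want to reclaim that space in bulk, Docker's prune commands do the tidying for you (they delete things, so read the prompts carefully):

docker container prune   # remove all stopped containers
docker image prune -a    # remove images not used by any container
docker system prune      # stopped containers, unused networks, dangling images, build cache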

Happy Dockerizing!
