A Complete Guide to End-to-End API Testing with Docker

Testing is an essential part of the software development lifecycle. Properly testing your application can catch bugs early, ensure the system behaves as expected, and give you confidence when making changes. When it comes to testing APIs, there are several approaches you can take – from unit testing individual functions to integration testing how components work together. In this guide, we'll explore end-to-end testing, which tests the full request/response cycle of an API, and see how Docker can help streamline the process.

The Benefits of End-to-End API Testing

With unit testing, you test small pieces of code in isolation to validate they work correctly. Integration testing takes this a step further by testing how those individual units work together. While both of these are important, end-to-end testing provides several key benefits:

• Tests the API from the client's perspective by making actual HTTP requests
• Validates the entire flow, including request format, response status code, headers, and body
• Ensures the API integrates properly with other services like databases and caching layers
• Gives you confidence the API satisfies business requirements and use cases

End-to-end tests most closely mirror how the API will be used in the real world. They are often the final check that catches bugs and confirms the API works as intended before a change ships.

An Introduction to Docker

API tests often require interacting with various services – databases, caches, etc. Standing up these environments manually can be time-consuming. This is where Docker comes in.

Docker is a platform for running applications in isolated environments called containers. Containers bundle an application and its dependencies into a single package that can run consistently across machines. Docker makes it easy to integrate external services into your testing environment.

For example, say your API connects to a PostgreSQL database. With Docker, you can spin up a PostgreSQL container, run your tests against it, then tear it down when finished. You get a fresh database for each test run without having to install or configure Postgres on your machine. This makes tests easier to write and more portable.
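For instance, a single command is enough to stand up a throwaway Postgres instance – a quick sketch with placeholder credentials, using the --rm flag so the container is removed as soon as it stops:

docker run --rm -e POSTGRES_PASSWORD=password -p 5432:5432 postgres:13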

Tools Needed for End-to-End API Testing

To get started with API testing in Docker, you'll need the following tools:

• Docker: For running containers
• Docker Compose: For defining and running multi-container applications
• API Testing Framework: For defining and executing API tests (we'll use Postman in this guide)

We'll also use Node.js for our sample API, but the same concepts apply to any language or framework.

Setting Up the Sample API

Let's create a simple API with two endpoints – one to create a user and another to fetch all users. We'll store the data in a PostgreSQL database. Here's the code:

const express = require('express');
const pg = require('pg');

const app = express();
app.use(express.json());

const pool = new pg.Pool({
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  database: process.env.POSTGRES_DB,
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD
});

app.post('/users', async (req, res) => {
  const { name, email } = req.body;

  // Reject requests that are missing required fields
  if (!name || !email) {
    return res.status(400).json({ error: 'name and email are required' });
  }

  const result = await pool.query(
    'INSERT INTO users(name, email) VALUES($1, $2) RETURNING *',
    [name, email]
  );
  res.json(result.rows[0]);
});

app.get('/users', async (req, res) => {
  const result = await pool.query('SELECT * FROM users');
  res.json(result.rows);
});

const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`Listening on port ${port}`));

This sets up a simple Express app with two routes. The POST /users route checks that name and email are present (returning a 400 if not) and inserts a new user into the database, while the GET /users route fetches all users. The Postgres configuration is pulled from environment variables, which we'll set in our docker-compose file.
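One detail the code above glosses over: it assumes a users table already exists. A minimal way to handle this – a sketch, assuming you're happy creating the schema from the app itself rather than through a separate migration step – is to create the table before the server starts listening, in place of the bare app.listen call above:

// Create the users table if it doesn't exist, then start the server.
// (Mounting a SQL init script into the Postgres container would work too.)
const start = async () => {
  await pool.query(`
    CREATE TABLE IF NOT EXISTS users (
      id SERIAL PRIMARY KEY,
      name TEXT NOT NULL,
      email TEXT NOT NULL
    )
  `);
  app.listen(port, () => console.log(`Listening on port ${port}`));
};

start();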

Defining API Tests

Now let's create some API tests using Postman. Postman is an API testing tool that lets you easily send HTTP requests and validate the responses.

We want to test two main flows:

  1. Creating a user via POST /users and ensuring it can then be fetched via GET /users
  2. Attempting to create a user with invalid data and ensuring the API returns a 400 status code

Here's what those tests would look like in Postman:

pm.test('Create and fetch user', () => {
  const user = {
    name: 'John Doe',
    email: 'john@example.com'
  };

  pm.sendRequest({
    url: `${pm.environment.get('API_URL')}/users`,
    method: 'POST',
    header: { 'Content-Type': 'application/json' },
    body: {
      mode: 'raw',
      raw: JSON.stringify(user)
    }
  }, (err, res) => {
    pm.expect(res.code).to.equal(200);

    const createdUser = res.json();
    pm.expect(createdUser.name).to.equal(user.name);
    pm.expect(createdUser.email).to.equal(user.email);

    pm.sendRequest(`${pm.environment.get('API_URL')}/users`, (err, res) => {
      const users = res.json();
      pm.expect(users).to.be.an('array').that.deep.includes(createdUser);
    });
  });
});

pm.test('Create user validation', () => {
  const invalidUser = { name: 'Invalid' };

  pm.sendRequest({
    url: `${pm.environment.get('API_URL')}/users`,
    method: 'POST',
    header: { 'Content-Type': 'application/json' },
    body: {
      mode: 'raw',
      raw: JSON.stringify(invalidUser)
    }
  }, (err, res) => {
    pm.expect(res.code).to.equal(400);
  });
});

The first test sends a POST request to create a user, validates the response code and body, then sends a GET request to ensure the user is returned. The second test attempts to create an invalid user and validates a 400 status code is returned. Note that both POST requests set a Content-Type: application/json header so that Express's JSON body parser picks up the payload.
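These scripts live in a Postman collection, which we'll export to a JSON file – api-tests.json, kept in a tests/ directory, to match the docker-compose setup below. If you want to try the collection locally before Dockerizing anything, Newman (Postman's command-line collection runner) can execute it directly; its --env-var flag is how the API_URL variable the scripts read gets injected:

npx newman run tests/api-tests.json --env-var "API_URL=http://localhost:3000"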

Dockerizing the API

To run the API and tests, we only need to build one Docker image ourselves – the one for the API. Postgres and the Newman test runner both have official images we can pull as-is. Here's the Dockerfile for the API:

FROM node:14

WORKDIR /app

COPY package*.json ./
RUN npm ci

COPY . .

CMD ["npm", "start"]

This copies the package manifests first (so the npm ci layer can be cached between builds), installs dependencies, copies in the rest of the application code, and specifies the start command.
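Note that CMD ["npm", "start"] assumes package.json defines a start script – something along these lines, assuming the entry point is named index.js:

{
  "scripts": {
    "start": "node index.js"
  }
}

We also need a docker-compose file to define our application's services: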

version: '3'
services:

  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      - POSTGRES_HOST=db
      - POSTGRES_PORT=5432
      - POSTGRES_DB=mydb
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    depends_on:
      - db

  db:
    image: postgres:13
    environment:
      - POSTGRES_DB=mydb
      - POSTGRES_PASSWORD=password
    volumes:
      - db-data:/var/lib/postgresql/data

  tests:
    image: postman/newman
    depends_on:
      - api
    volumes:
      - ./tests:/etc/newman
    command: run api-tests.json --env-var "API_URL=http://api:3000"

volumes:
  db-data:

This defines three services:

• api: Our Node API
• db: The Postgres database
• tests: A container that runs our Postman tests using the Newman CLI

The api service sets the required Postgres environment variables and depends on the db service. The db service uses the official postgres image and defines the POSTGRES_DB and POSTGRES_PASSWORD environment variables.
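One thing to watch: the named db-data volume persists between runs, so a second docker-compose up starts from the previous run's data rather than an empty database. To get the fresh-database-per-run behavior described earlier, remove the volumes along with the containers when tearing down:

docker-compose down -v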

Finally, the tests service uses the postman/newman image to run our tests. It mounts the directory containing our Postman collection and passes the API_URL variable to the collection through Newman's --env-var flag, pointing it at the hostname of our API container. (A plain container environment variable wouldn't work here – pm.environment.get reads from the Postman environment, not the OS environment.) The depends_on attribute ensures the api service is started before the tests run.
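A caveat: depends_on only waits for a container to be started, not for the service inside it to be ready, so the tests can occasionally fire before Postgres accepts its first connection. One way to tighten this up – a sketch, assuming a Docker Compose version that supports healthcheck conditions in depends_on – is to health-check the database and gate the api service on it:

  db:
    image: postgres:13
    environment:
      - POSTGRES_DB=mydb
      - POSTGRES_PASSWORD=password
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      timeout: 5s
      retries: 10

  api:
    # ...as before, but wait for the db healthcheck to pass
    depends_on:
      db:
        condition: service_healthy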

Running the End-to-End Tests

With our Docker Compose file in place, we can run our API and tests with a single command:

docker-compose up --abort-on-container-exit

This spins up our containers, runs the tests, then tears everything down. The --abort-on-container-exit flag stops all containers as soon as any one of them exits – in our case, when the test run finishes.
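If you care about the command's exit status – in CI, you almost certainly do – Compose's --exit-code-from flag makes docker-compose exit with the chosen service's exit code (and implies --abort-on-container-exit):

docker-compose up --exit-code-from tests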

If everything is working correctly, you should see output indicating the tests passed:

 api-tests
┌─────────────────────────┐
│ Create and fetch user   │
├─────────────────────────┤
│ Create user validation  │
└─────────────────────────┘

⠿ 2 passed, 2 total (16ms)

And there you have it! We've successfully run end-to-end tests for our API using Docker. The API and database are spun up, tests are run against them, and everything is torn down, leaving no trace on your machine.

Best Practices for API Testing with Docker

Here are some tips and best practices to keep in mind when testing APIs using Docker:

• Use docker-compose to manage your application's services. This allows you to define your API, database, and test runner in a single file.
• Give each service a descriptive name and define the dependencies between them. This makes it clear how the services relate to each other.
• Use volumes to mount your test files into the test runner container. This keeps your tests independent of the container itself.
• Set the depends_on attribute for your test runner service to ensure the API is available before the tests run.
• Use docker-compose's --abort-on-container-exit flag to clean up containers after the tests complete.
• Push your API images to a Docker registry like Docker Hub. This makes them available to other team members and your CI/CD pipeline.
• Run your tests in a continuous integration environment to catch bugs before they make it to production (see the sketch after this list).
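To make that last point concrete, here's a minimal sketch of a CI job – assuming GitHub Actions; the workflow name and trigger are placeholders, and the idea carries over to any CI system that can run Docker:

name: api-tests

on: [push]

jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run end-to-end tests
        run: docker compose up --abort-on-container-exit --exit-code-from tests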

By following these practices, you can create a reliable and maintainable process for testing your APIs with Docker.

Conclusion

End-to-end testing is a critical part of the API development process. It ensures your API works as expected and can handle the complexities of real-world use cases. While setting up end-to-end tests can be challenging, Docker makes the process much simpler.

By defining your API, database, and tests as Docker services, you can create a portable and reproducible testing environment. This environment can be shared across team members and run as part of your continuous integration pipeline.

The example in this guide demonstrates a basic Node API being tested with Postman, but the same approach can be applied to any language, framework, or testing tool. So give it a try! Incorporate end-to-end testing into your workflow and start shipping more reliable APIs.
