Going Serverless with Rust and API Gateway

Rust and AWS Serverless

Serverless computing lets developers focus on writing code without managing servers or infrastructure. AWS Lambda is one of the most popular serverless platforms, enabling you to run code in response to events or requests while it automatically handles provisioning, scaling, and availability.

Until recently, writing Lambda functions in Rust wasn't officially supported. You either had to use a language like Go as a wrapper around Rust code or rely on third-party libraries. However, at re:Invent 2018, AWS announced the Lambda runtime API, which supports any language, along with an open source Rust runtime for Lambda.

In this post, we'll walk through a complete example of writing, testing, and deploying a serverless Rust application with AWS Lambda and API Gateway. By the end, you'll have a solid foundation for building high-performance, low-cost serverless APIs and web apps with Rust. Let's dive in!

Prerequisites

Before we get started, make sure you have the following:

  • Docker for building and packaging code
  • Git for version control
  • An AWS account (free tier is fine)
  • Rust (optional if using Docker)
  • Terraform (optional) for deploying infrastructure as code

You can find detailed AWS setup instructions in the repo.

Writing a Rust Lambda Function

We'll start by creating a basic Rust function to handle requests from API Gateway. Our function will receive JSON input, deserialize it into a Rust struct, and return a response.

First, create a new cargo project:

cargo new rust-lambda-example
cd rust-lambda-example

Add the following dependencies to your Cargo.toml:

[dependencies]
lambda_runtime = "0.7"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1", features = ["macros"] }
tracing = { version = "0.1", features = ["log"] }
tracing-subscriber = { version = "0.3", default-features = false, features = ["fmt"] }

Here's the code for our Lambda handler in src/main.rs:

use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde::{Deserialize, Serialize};
use serde_json::Value;

#[derive(Deserialize)]
struct Request {
    name: String,
}

#[derive(Serialize)]
struct Response {
    req_id: String,
    msg: String,
}

async fn handler(event: LambdaEvent<Value>) -> Result<Response, Error> {
    // API Gateway (HTTP API, payload format 2.0) wraps the client's JSON in
    // an event envelope; the original request body arrives as a string in
    // the "body" field, which we parse into our Request struct.
    let body = event.payload["body"].as_str().unwrap_or("{}");
    let req: Request = serde_json::from_str(body)?;

    Ok(Response {
        req_id: event.context.request_id,
        msg: format!("Hello, {}!", req.name),
    })
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::INFO)
        .with_target(false)
        .without_time()
        .init();

    run(service_fn(handler)).await
}

The Request struct represents the JSON payload the client sends, which we expect to contain a "name" field. The Response struct is what we'll serialize to JSON and return.

Note that API Gateway does not hand the client's JSON to Lambda directly: with the AWS_PROXY integration and payload format 2.0 we'll configure later, the function receives an event envelope whose "body" field carries the original request body as a string. Inside the handler, we pull that string out of the payload, deserialize it into a Request, and use the name to personalize the response message, including the request ID from the Lambda context. If the body is missing or malformed, deserialization fails and the handler returns an error, which API Gateway surfaces as a 500.

Finally, the main function sets up tracing for logging, and starts the Lambda runtime, passing it our handler wrapped in service_fn.
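
To sanity-check the body-parsing logic before deploying anything, we can mimic the API Gateway envelope in a plain unit test. This is a minimal sketch (the test name is illustrative, and the event is abbreviated to just the fields we use); append it to src/main.rs and run cargo test:

#[cfg(test)]
mod tests {
    use super::*;
    use serde_json::json;

    #[test]
    fn extracts_name_from_api_gateway_event() {
        // Abbreviated API Gateway HTTP API (payload format 2.0) event:
        // the client's JSON arrives as a string in the "body" field.
        let event = json!({
            "version": "2.0",
            "routeKey": "POST /",
            "body": "{\"name\":\"Rustacean\"}"
        });

        let body = event["body"].as_str().unwrap_or("{}");
        let req: Request = serde_json::from_str(body).expect("valid request JSON");
        assert_eq!(req.name, "Rustacean");
    }
}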

That's all the Rust code we need! Now let's see how to package it for deployment.

Packaging and Deployment

To deploy our Rust Lambda, we need to compile the code and produce a zip file containing the binary. We'll use Docker to ensure a consistent build environment.

Create a Dockerfile with the following contents:

FROM rust:1.68 as builder
RUN rustup target add x86_64-unknown-linux-musl
WORKDIR /usr/src/lambda
COPY . .
# Statically link against musl so the binary runs on Amazon Linux 2,
# which ships an older glibc than the Debian-based rust image.
RUN cargo build --release --target x86_64-unknown-linux-musl

FROM public.ecr.aws/lambda/provided:al2
# Custom runtimes must expose their entry point as a binary named "bootstrap".
COPY --from=builder /usr/src/lambda/target/x86_64-unknown-linux-musl/release/rust-lambda-example /var/task/bootstrap
CMD ["bootstrap"]

This multi-stage build:

  1. Compiles the Rust code in a builder stage as a statically linked musl binary, so it runs on Amazon Linux regardless of the builder image's glibc
  2. Copies the resulting binary into the Amazon Linux 2 based image AWS provides for custom runtimes, renaming it to bootstrap as the custom runtime contract requires
  3. Sets the CMD to run our binary

To build the image, extract the compiled binary, and package it as a zip, run:

docker build -t rust-lambda .
docker create --name rust-lambda-build rust-lambda
docker cp rust-lambda-build:/var/task/bootstrap ./bootstrap
docker rm rust-lambda-build
zip rust-lambda.zip bootstrap

We now have a zip file ready to be uploaded to Lambda!

There are several ways to deploy Lambda functions, such as the AWS web console, CLI, SDKs, or infrastructure-as-code tools like AWS SAM or Terraform. We'll use Terraform since it lets us define our full serverless stack.

Here's a basic Terraform config to create our Lambda function and API Gateway API:

provider "aws" {
  region = "us-west-2"
}

resource "aws_lambda_function" "rust_lambda" {
  function_name    = "rust-lambda-example"
  handler          = "bootstrap"
  runtime          = "provided.al2"  
  filename         = "rust-lambda.zip"
  source_code_hash = filebase64sha256("rust-lambda.zip")
  role             = aws_iam_role.lambda_exec.arn
}

resource "aws_apigatewayv2_api" "rust_lambda_api" {
  name          = "rust-lambda-example-api"
  protocol_type = "HTTP"
}

resource "aws_apigatewayv2_stage" "default" {
  api_id      = aws_apigatewayv2_api.rust_lambda_api.id
  name        = "$default"
  auto_deploy = true
}

resource "aws_apigatewayv2_integration" "rust_lambda_integration" {
  api_id                 = aws_apigatewayv2_api.rust_lambda_api.id
  integration_type       = "AWS_PROXY"
  integration_method     = "POST"
  integration_uri        = aws_lambda_function.rust_lambda.invoke_arn
  payload_format_version = "2.0"
}

resource "aws_apigatewayv2_route" "default" {
  api_id    = aws_apigatewayv2_api.rust_lambda_api.id
  route_key = "POST /"
  target    = "integrations/${aws_apigatewayv2_integration.rust_lambda_integration.id}"
}

resource "aws_iam_role" "lambda_exec" {
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action = "sts:AssumeRole"
      Effect = "Allow"
      Sid    = ""
      Principal = {
        Service = "lambda.amazonaws.com"
      }
    }]
  })
}

resource "aws_iam_role_policy_attachment" "lambda_policy" {
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
  role       = aws_iam_role.lambda_exec.name 
}

output "api_endpoint" {
  value = aws_apigatewayv2_api.rust_lambda_api.api_endpoint 
}

This config creates:

  • The Lambda function from our packaged zip file
  • An HTTP API in API Gateway
  • An integration and route to wire up the API to our function
  • The IAM role and policy that let Lambda execute and write logs
  • A resource-based permission allowing API Gateway to invoke the function (without it, every request would fail)

Initialize Terraform and apply the config:

terraform init
terraform apply 

Take note of the api_endpoint output. We'll use that to test our deployed API.

Testing the API

Let's send a request to our serverless Rust API! Using curl:

curl -X POST https://<api-id>.execute-api.us-west-2.amazonaws.com/ \
  -H 'Content-Type: application/json' \
  -d '{"name":"Rustacean"}'

You should get back a response like:

{
  "req_id": "2dcc7e5b-f913-4e58-b0ef-48be487b0f16",
  "msg": "Hello, Rustacean!"
}

Our Rust Lambda function pulled the client's JSON out of the API Gateway event, deserialized it, generated a personalized response, and returned it through API Gateway. We have a working serverless API!
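
If you'd rather exercise the API from Rust itself, here's a hypothetical client sketch. It assumes a separate binary crate with reqwest = { version = "0.11", features = ["json"] } and tokio as dependencies; substitute your api_endpoint output for the placeholder URL:

use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // POST the same payload curl sent above; <api-id> is a placeholder.
    let response = reqwest::Client::new()
        .post("https://<api-id>.execute-api.us-west-2.amazonaws.com/")
        .json(&json!({ "name": "Rustacean" }))
        .send()
        .await?
        .text()
        .await?;

    println!("{response}");
    Ok(())
}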

To clean up the resources created, run:

terraform destroy

Conclusion

In this post, we saw how to write, package, and deploy a serverless Rust application with AWS Lambda and API Gateway. The key steps were:

  1. Write a Rust function to handle Lambda events
  2. Use Docker to build and package the function for deployment
  3. Configure the serverless infrastructure with Terraform
  4. Deploy and test the API

Going serverless with Rust is a great option for building high-performance, low-cost APIs and web apps. Rust's speed, safety, and small footprint are well-suited to serverless environments. With native AWS Lambda support through the runtime API, using Rust is easier than ever.

I encourage you to experiment further with serverless Rust! Some ideas:

  • Add more API routes and HTTP methods
  • Integrate with other AWS services like S3, DynamoDB, or SQS
  • Set up CI/CD to automatically build and deploy on git push

Please let me know if you have any questions or feedback. I'd also love to hear suggestions for future posts! You can find all the code from this example in the GitHub repo.

Happy coding!
