Development Environment

Run Rust natively on your host machine. Run backing services (PostgreSQL, Valkey, Restate, RustFS, MailCrab) in Docker containers. This separation keeps your edit-compile-run cycle fast while giving you disposable, reproducible infrastructure.

Rust Toolchain

Install rustup, which manages your Rust compiler, standard library, and development tools.

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

The default installation profile includes rustc, cargo, clippy, and rustfmt. Add rust-analyzer (the language server) and rust-src (standard library source, needed for full rust-analyzer functionality) separately:

rustup component add rust-analyzer rust-src

Verify the installation:

rustc --version
cargo --version

Keep everything current with rustup update. Rust releases a new stable version every six weeks.

What each tool does

  • rustc compiles Rust source code. You rarely invoke it directly; cargo handles it.
  • cargo builds, tests, runs, and manages dependencies. It is the entry point for nearly every Rust workflow.
  • clippy is the official linter. Run cargo clippy to catch common mistakes and non-idiomatic patterns.
  • rustfmt formats code to a consistent style. Run cargo fmt to format, cargo fmt -- --check to verify without modifying files.
  • rust-analyzer provides IDE features (completions, diagnostics, go-to-definition, refactoring) via the Language Server Protocol. Any editor or AI coding agent with LSP support can use it.

Backing Services with Docker Compose

The application depends on five external services during development. Run them in containers so they are disposable and require no host-level installation.

Service     Image                                          Ports                        Purpose
PostgreSQL  postgres:18-alpine                             5432                         Primary database
Valkey      valkey/valkey:9-alpine                         6379                         Pub/sub and caching
Restate     docker.restate.dev/restatedev/restate:latest   8080, 9070, 9071             Durable execution engine
RustFS      rustfs/rustfs:latest                           9000, 9001                   S3-compatible object storage
MailCrab    marlonb/mailcrab:latest                        1025 (SMTP), 1080 (Web UI)   Email capture for testing

Create compose.yaml at the project root:

services:
  postgres:
    image: postgres:18-alpine
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app_dev
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U app"]
      interval: 5s
      timeout: 5s
      retries: 5

  valkey:
    image: valkey/valkey:9-alpine
    ports:
      - "6379:6379"
    volumes:
      - valkeydata:/data

  restate:
    image: docker.restate.dev/restatedev/restate:latest
    ports:
      - "8080:8080"
      - "9070:9070"
      - "9071:9071"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - restatedata:/target

  rustfs:
    image: rustfs/rustfs:latest
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      RUSTFS_ROOT_USER: minioadmin
      RUSTFS_ROOT_PASSWORD: minioadmin
    volumes:
      - rustfsdata:/data

  mailcrab:
    image: marlonb/mailcrab:latest
    ports:
      - "1080:1080"
      - "1025:1025"

volumes:
  pgdata:
  valkeydata:
  restatedata:
  rustfsdata:

Start all services:

docker compose up -d

Stop containers (data persists in named volumes):

docker compose down

Stop and destroy everything, including data:

docker compose down -v

Service notes

Valkey is the BSD-licensed fork of Redis, maintained by the Linux Foundation. It is fully API-compatible with Redis, so any Redis client library works without changes. The guide uses Valkey because its license is unambiguous.

Restate is a durable execution engine for reliable background work, workflows, and agentic AI. The extra_hosts entry allows Restate (running inside Docker) to reach your application (running on the host) via host.docker.internal. Use this hostname instead of localhost when registering service deployments with the Restate admin API on port 9070.
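As a concrete example, registration can be done with a POST to the admin API's deployments endpoint. Here 9080 is an assumed port for your application's Restate HTTP endpoint (not something this setup prescribes), so substitute whatever port your app actually listens on:

```shell
# Register the app (running on the host) with Restate (running in Docker).
# 9070 is the Restate admin API; 9080 is an assumed application port.
curl -X POST http://localhost:9070/deployments \
  -H 'content-type: application/json' \
  -d '{"uri": "http://host.docker.internal:9080"}'
```

If registration succeeds, the admin API responds with the deployment's metadata; using localhost instead of host.docker.internal here would fail, because localhost inside the Restate container refers to the container itself.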

RustFS is an S3-compatible object storage server written in Rust, licensed under Apache 2.0. It replaces MinIO, which entered maintenance mode in December 2025. RustFS is still in alpha but functional for local development. Its web console is available at http://localhost:9001.

MailCrab captures all email sent to it. Configure your application’s SMTP to point at localhost:1025, then view captured messages at http://localhost:1080. No email leaves your machine.
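In your application's configuration this might look like the following environment variables. The variable names are illustrative, not something MailCrab requires; use whatever keys your SMTP client library reads:

```shell
# Hypothetical SMTP settings for local development.
# MailCrab accepts plain, unauthenticated connections on port 1025.
export SMTP_HOST=localhost
export SMTP_PORT=1025
export SMTP_TLS=off
```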

Docker runtime

Any Docker-compatible runtime works: Docker Desktop, OrbStack (macOS), Colima (macOS/Linux), or Podman. The docker compose commands behave identically across all of them.

cargo xtask

cargo xtask is a convention for writing project automation as a Rust binary inside your workspace. Instead of shell scripts or Makefiles, your build tasks are Rust code: checked by the compiler, cross-platform, and requiring no external tooling beyond cargo.

The pattern works by defining a cargo alias that runs a dedicated crate.

Setup

Create the alias in .cargo/config.toml:

[alias]
xtask = "run --package xtask --"

Add an xtask crate to your workspace. In the root Cargo.toml:

[workspace]
resolver = "2"
members = ["app", "xtask"]
default-members = ["app"]

default-members prevents cargo build and cargo test from compiling the xtask crate unless explicitly requested.

Create xtask/Cargo.toml:

[package]
name = "xtask"
version = "0.1.0"
edition = "2024"
publish = false

[dependencies]
clap = { version = "4", features = ["derive"] }
xshell = "0.2"
anyhow = "1"

xshell provides shell-like command execution without invoking an actual shell. Variable interpolation is safe by construction, preventing injection.
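The safety property can be illustrated with std::process::Command, the standard-library API that xshell builds on: arguments are handed to the child process as literal argv entries, so shell metacharacters in interpolated values are never interpreted. A minimal standalone sketch (not part of the xtask itself):

```rust
use std::process::Command;

/// Run `printf %s <value>` with no shell involved, returning what printf saw.
fn run_printf(value: &str) -> String {
    let out = Command::new("printf")
        .arg("%s")
        .arg(value) // passed as a single argv entry; nothing parses it
        .output()
        .expect("failed to run printf");
    String::from_utf8_lossy(&out.stdout).into_owned()
}

fn main() {
    // A value that would be dangerous if spliced into a shell string.
    let risky = "hello; echo injected";
    // The `;` arrives as a literal character, not a command separator.
    assert_eq!(run_printf(risky), "hello; echo injected");
}
```

xshell's cmd! macro gives you the same guarantee with shell-like ergonomics: an interpolated {var} becomes one argument, never re-parsed.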

Create xtask/src/main.rs:

use std::process::ExitCode;

use anyhow::Result;
use clap::{Parser, Subcommand};
use xshell::{cmd, Shell};

#[derive(Parser)]
#[command(name = "xtask")]
struct Cli {
    #[command(subcommand)]
    command: Command,
}

#[derive(Subcommand)]
enum Command {
    /// Start backing services and the dev server
    Dev,
    /// Run database migrations
    Migrate,
    /// Run all CI checks locally
    Ci,
}

fn main() -> ExitCode {
    let cli = Cli::parse();
    let result = match cli.command {
        Command::Dev => dev(),
        Command::Migrate => migrate(),
        Command::Ci => ci(),
    };
    match result {
        Ok(()) => ExitCode::SUCCESS,
        Err(e) => {
            eprintln!("error: {e:?}");
            ExitCode::FAILURE
        }
    }
}

fn dev() -> Result<()> {
    let sh = Shell::new()?;
    cmd!(sh, "docker compose up -d").run()?;
    cmd!(sh, "bacon run").run()?;
    Ok(())
}

fn migrate() -> Result<()> {
    let sh = Shell::new()?;
    cmd!(sh, "cargo sqlx migrate run").run()?;
    Ok(())
}

fn ci() -> Result<()> {
    let sh = Shell::new()?;
    cmd!(sh, "cargo fmt --all -- --check").run()?;
    cmd!(sh, "cargo clippy --all-targets -- -D warnings").run()?;
    cmd!(sh, "cargo nextest run").run()?;
    Ok(())
}

Run tasks with:

cargo xtask dev       # start services + dev server
cargo xtask migrate   # run database migrations
cargo xtask ci        # fmt check, clippy, tests

Add subcommands as your project grows. Common additions: seed (populate development data), reset (drop and recreate the database), build-css (run lightningcss processing).
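A reset task, for instance, might shell out to sqlx-cli's database reset command (which drops, recreates, and re-migrates the database, and must be installed separately). Sketched here with std::process::Command for illustration; in the actual xtask you would write it with xshell like the other tasks:

```rust
use std::process::Command;

/// Build (but do not yet run) the command a `reset` subcommand would
/// execute. Assumes sqlx-cli is installed (`cargo install sqlx-cli`);
/// `-y` skips the interactive confirmation prompt.
fn reset_command() -> Command {
    let mut cmd = Command::new("cargo");
    cmd.args(["sqlx", "database", "reset", "-y"]);
    cmd
}

fn main() {
    // The real task would run it: reset_command().status().
    // With xshell instead: cmd!(sh, "cargo sqlx database reset -y").run().
    println!("would run: {:?}", reset_command());
}
```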

Editor Configuration

Any editor with Language Server Protocol support works for Rust development. Install the rust-analyzer extension or plugin for your editor of choice.

The following rust-analyzer settings matter for this stack. Apply them through your editor’s LSP configuration.

{
  "rust-analyzer.check.command": "clippy",
  "rust-analyzer.procMacro.enable": true,
  "rust-analyzer.cargo.buildScripts.enable": true,
  "rust-analyzer.check.allTargets": true
}

check.command: "clippy" runs clippy instead of cargo check on save, giving you lint feedback inline. Slightly slower on large workspaces, but the additional warnings are worth it.

procMacro.enable: true is critical for this stack. Maud’s html! macro, serde’s derive macros, and SQLx’s query! macro are all procedural macros. Without this setting, rust-analyzer cannot expand them, resulting in false errors and missing completions inside macro invocations.

cargo.buildScripts.enable: true ensures build scripts run during analysis. SQLx’s compile-time query checking depends on this.

check.allTargets: true includes tests, examples, and benchmarks in diagnostic checking.

Fast Iteration

bacon

bacon watches your source files and runs cargo commands on every change. It replaces the older cargo-watch, which is no longer actively developed (its maintainer recommends bacon).

Install it:

cargo install --locked bacon

Run it:

bacon           # defaults to cargo check
bacon clippy    # run clippy on every change
bacon test      # run tests on every change
bacon run       # build and run on every change

bacon provides a TUI with sorted, filtered diagnostics. Press t to switch to tests, c to switch to clippy, r to run the application. The full set of keyboard shortcuts is shown in the interface.

For project-specific jobs, create a bacon.toml at the workspace root:

[jobs.run]
command = ["cargo", "run"]
watch = ["src"]

[jobs.test-integration]
command = ["cargo", "nextest", "run", "--test", "integration"]
watch = ["src", "tests"]

Linking

On x86_64 Linux with Rust 1.90+, the compiler uses lld (the LLVM linker) by default. This is significantly faster than the traditional system linker and requires no configuration.

On macOS, Apple’s default linker is adequate. No special setup is needed.

Incremental compilation

Cargo enables incremental compilation by default for debug builds. After the initial compile, changing a single file typically triggers a rebuild of only the affected crate and its dependents.

Two practices keep incremental rebuilds fast:

  • Split your workspace into focused crates. A change in one crate does not recompile unrelated crates. The Project Structure section covers this in detail.
  • Keep macro-heavy code in leaf crates. Procedural macro expansion is one of the slower compilation phases. Isolating it limits the rebuild radius.

cargo-nextest

cargo-nextest is a test runner that executes tests in parallel across separate processes. It is noticeably faster than cargo test on projects with more than a handful of tests, and its output is easier to read.

cargo install --locked cargo-nextest
cargo nextest run

Doctests are not supported by nextest. Run them separately with cargo test --doc.