Overview
Relevant Files
README.md, INSTALL.md, src/README.md, Cargo.toml
This is the official source repository for the Rust programming language. It contains the compiler (rustc), standard library, and comprehensive documentation. Rust is a systems programming language that emphasizes performance, reliability, and productivity through its ownership model and type system.
Repository Structure
The repository is organized into three major components:
Compiler (compiler/ directory)
The heart of Rust. Contains 60+ crates implementing the compiler pipeline: lexing, parsing, type checking, borrow checking, and code generation. Key crates include rustc_driver (entry point), rustc_middle (intermediate representation), and rustc_codegen_llvm (LLVM backend).
Standard Library (library/ directory)
Core Rust libraries including core (no-std primitives), std (full standard library), alloc (heap allocation), and proc_macro (procedural macro support). These are bootstrapped and compiled as part of the build process.
Tools & Infrastructure (src/ directory)
Build system, testing infrastructure, and essential tools:
- bootstrap/ - Build orchestration system (Rust code driven by the x.py wrapper)
- tools/ - Clippy (linter), rustfmt (formatter), rustdoc (documentation generator), miri (interpreter), and 40+ other utilities
Build System
The project uses a custom build system, invoked through x.py (a thin Python wrapper around the Rust-based bootstrap tool), that manages the bootstrapping process. This is necessary because Rust is written in Rust—the compiler must be built by a previous version of itself. The build system:
- Downloads or builds a snapshot compiler (stage 0)
- Uses it to compile the current compiler (stage 1)
- Uses stage 1 to compile the final compiler (stage 2)
- Compiles the standard library and tools
Configuration is managed through bootstrap.toml (see bootstrap.example.toml for all options).
Key Characteristics
- Multi-platform: Supports 60+ target platforms with varying tier levels
- Modular: Organized as a Cargo workspace with 50+ member crates
- Dual-licensed: MIT and Apache 2.0 licenses
- Community-driven: Governed by the Rust Foundation with open contribution process
Getting Started
For installation, see the official installation guide. To build from source, refer to INSTALL.md. For compiler development details, consult the rustc dev guide.
Architecture & Compilation Pipeline
Relevant Files
compiler/rustc_driver_impl/src/lib.rs, compiler/rustc_interface/src/lib.rs, compiler/rustc_interface/src/passes.rs, compiler/rustc_middle/src/lib.rs, compiler/rustc_ast/src/lib.rs, compiler/rustc_hir/src/lib.rs, src/doc/rustc-dev-guide/src/compiler-src.md
The Rust compiler is organized as a collection of interdependent crates that work together through a carefully layered architecture. Understanding this structure is essential for navigating and contributing to the compiler.
Dependency Hierarchy
The compiler follows a strict dependency hierarchy designed to enable parallel compilation and reduce rebuild times:
rustc_driver_impl is the entry point that acts as the compiler's main function. It parses command-line arguments and delegates to rustc_interface, which provides a generic interface for driving the entire compilation process. Most other compiler crates depend on rustc_middle, which defines central data structures like the type context (TyCtxt), intermediate representations (HIR, THIR, MIR), and the query system.
Compilation Pipeline Stages
The compiler processes source code through distinct phases, each transforming the code into a more refined representation:
- Parsing - The lexer and parser convert source text into an Abstract Syntax Tree (AST) using rustc_ast. This stage also handles initial attribute processing and error recovery.
- Macro Expansion & Name Resolution - The configure_and_expand phase processes #[cfg] attributes, expands macros (both declarative and procedural), and performs name resolution. This transforms the AST while building the module structure.
- AST Lowering - The AST is lowered to the High-level Intermediate Representation (HIR) in rustc_hir, which is more suitable for semantic analysis. HIR retains source-level information like explicit types and lifetimes.
- Type Checking & Analysis - The type checker (rustc_hir_typeck) performs type inference and trait resolution; borrow checking runs later, on MIR, in rustc_borrowck. This stage uses the query system to lazily compute type information.
- MIR Generation - The Typed HIR (THIR) is lowered to the Mid-level Intermediate Representation (MIR) in rustc_mir_build. MIR is a control-flow graph suitable for optimization and analysis.
- Optimization & Codegen - MIR passes optimize the code, then the codegen backend (LLVM, Cranelift, or GCC) generates machine code or intermediate representations.
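On a nightly toolchain you can inspect several of these intermediate representations directly; a small example with the relevant flags is sketched below (flag names are unstable and may change between versions):

// Inspecting pipeline stages on a nightly compiler (illustrative; exact
// flag names and output formats change between versions):
//   rustc +nightly -Zunpretty=expanded  example.rs   # source after macro expansion
//   rustc +nightly -Zunpretty=hir       example.rs   # HIR
//   rustc +nightly -Zunpretty=thir-tree example.rs   # THIR
//   rustc          --emit=mir           example.rs   # MIR
fn main() {
    let doubled: Vec<i32> = (1..=3).map(|x| x * 2).collect();
    println!("{doubled:?}");
}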
The Query System
The query system is central to the compiler's architecture. Defined in rustc_middle, it enables incremental compilation and parallel execution by treating compilation as a set of interdependent queries. Each query is lazily computed and cached, allowing the compiler to reuse results across compilation runs and parallelize independent computations.
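As a rough sketch of what this looks like from inside the compiler (internal, unstable API; signatures simplified here), a query is just a memoized method call on TyCtxt:

// Simplified sketch of query usage inside the compiler; requires
// #![feature(rustc_private)] and the nightly rustc_middle / rustc_hir crates.
use rustc_hir::def_id::DefId;
use rustc_middle::ty::TyCtxt;

fn describe_item<'tcx>(tcx: TyCtxt<'tcx>, def_id: DefId) {
    // Each call below is a query: computed lazily on first use, cached,
    // and tracked as a dependency for incremental compilation.
    let ty = tcx.type_of(def_id);         // "what is the type of this item?"
    let mir = tcx.optimized_mir(def_id);  // "give me its optimized MIR"
    println!("{:?} has {} basic blocks", ty, mir.basic_blocks.len());
}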
Key Data Structures
- TyCtxt - The type context, the central hub during most of compilation. Contains interners, the query system, and access to all compiler state.
- Session - Represents a compiler session, holding configuration, error handling, and I/O information.
- Compiler - Wraps a Session and includes the codegen backend and query system infrastructure.
- Span - Tracks source code locations for error reporting and debugging.
Crate Organization
The compiler consists of approximately 50 interdependent crates. Early-stage crates like rustc_parse, rustc_expand, and rustc_resolve have minimal dependencies. Mid-stage crates depend on rustc_middle. Late-stage crates like rustc_codegen_llvm depend on most others. This layering allows changes to early stages to be compiled in parallel with other crates, improving build times.
Compiler Crates & Core Components
Relevant Files
compiler/rustc_driver_impl/README.md, compiler/rustc_middle/README.md, compiler/rustc_hir_analysis/README.md, compiler/rustc_codegen_ssa/README.md, compiler/rustc_codegen_llvm/README.md, compiler/rustc_interface/src/passes.rs, compiler/rustc_interface/src/queries.rs
The Rust compiler is organized into specialized crates that work together in a well-defined pipeline. These core components handle different phases of compilation, from orchestration to code generation.
The Compilation Pipeline
The compilation process flows through several key stages:
- Driver & Orchestration (rustc_driver_impl) - Entry point that coordinates the entire compilation
- Parsing & Expansion - Lexical analysis, macro expansion, and name resolution
- Type Analysis (rustc_hir_analysis) - Type checking and semantic analysis
- Code Generation (rustc_codegen_ssa, rustc_codegen_llvm) - MIR to machine code
Core Crates
rustc_driver_impl orchestrates the compilation process. The run_compiler() function is the primary entry point, accepting command-line arguments and callbacks. It manages the overall flow, handling early exits for print-only operations and coordinating with the codegen backend.
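A minimal custom driver might look like the sketch below, assuming the unstable rustc_driver API on a recent nightly (the exact entry-point signature has changed over time; older versions used a RunCompiler builder instead):

// Sketch only: rustc's internal crates are unstable and require a nightly
// toolchain plus the rustc-dev component.
#![feature(rustc_private)]
extern crate rustc_driver;

use rustc_driver::Callbacks;

struct MyCallbacks; // all Callbacks methods have defaults, so an empty impl works
impl Callbacks for MyCallbacks {}

fn main() {
    // Forward our own argv to the compiler and let it run end to end.
    let args: Vec<String> = std::env::args().collect();
    rustc_driver::run_compiler(&args, &mut MyCallbacks);
}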
rustc_middle is the central hub containing shared type definitions used across the compiler. It defines:
- HIR (High-level Intermediate Representation) - Structured AST after macro expansion
- THIR (Typed HIR) - Type-annotated intermediate form
- MIR (Mid-level IR) - Control-flow graph used for optimization and analysis
- TyCtxt - The type context, a central data structure holding interned types and compiler state
rustc_hir_analysis performs type checking in multiple phases. The check_crate() function orchestrates: (1) collect phase determines item types, (2) variance inference computes parameter variance, (3) coherence checks validate trait implementations, (4) check phase verifies function bodies and constraints.
rustc_codegen_ssa provides backend-agnostic code generation. It defines traits that all backends must implement and handles monomorphization, partitioning code into compilation units, and orchestrating the codegen process. The codegen_mir() function converts MIR to backend-specific IR.
rustc_codegen_llvm implements the LLVM backend. It converts MIR to LLVM IR and then to machine code. The compile_codegen_unit() function processes individual compilation units, and LlvmCodegenBackend implements the backend traits.
Data Flow
Key Interfaces
The CodegenBackend trait defines the interface all backends must implement. Backends provide methods for creating codegen contexts, compiling units, and managing target machines. The BuilderMethods trait abstracts instruction generation, allowing different backends to emit their own IR.
The query system in rustc_middle caches compilation results. Queries like collect_and_partition_mono_items() partition code into compilation units, while codegen_unit() retrieves individual units for compilation.
Integration Points
rustc_interface bridges the driver and analysis phases. The create_and_enter_global_ctxt() function creates the global type context, and start_codegen() transitions from analysis to code generation. The Linker struct manages the final linking phase after codegen completes.
Error handling flows through DiagCtxt (diagnostic context), allowing errors to accumulate and be reported at appropriate times. The compiler can continue analysis even after errors to report multiple issues in one pass.
Standard Library & Runtime
Relevant Files
library/core/src/lib.rs, library/std/src/lib.rs, library/alloc/src/lib.rs, library/proc_macro/src/lib.rs, library/test/src/lib.rs
Rust's standard library is organized into a layered architecture, with each layer providing increasingly higher-level abstractions. Understanding this structure is essential for working with the compiler and runtime.
Core Library (core)
The core library is the dependency-free foundation of all Rust code. It defines intrinsic and primitive building blocks without linking to any system libraries or libc. Core provides:
- Ownership and memory management: mem, ptr, pin modules for unsafe operations
- Language traits: Clone, Copy, Default, Eq, Ord, Hash, Debug
- Primitive types: Methods on bool, char, numeric types, slices, and strings
- Collections abstractions: Option, Result, iterators
- Async support: Future, Poll, Waker for async/await
- Intrinsics: Low-level compiler operations and SIMD support
Core is #![no_core] and platform-agnostic, making it suitable for embedded and no_std environments.
Allocation Library (alloc)
The alloc library adds heap allocation support on top of core. It requires an allocator but no OS integration. Key components:
- Smart pointers: Box (unique ownership), Rc (reference-counted), Arc (atomic reference-counted)
- Collections: Vec, String, BTreeMap, LinkedList, VecDeque (HashMap lives in std because it needs OS-seeded hashing)
- Formatting: fmt module for heap-allocated formatting
Alloc is marked #![no_std] and #![needs_allocator], making it ideal for no_std projects that need collections.
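A minimal sketch of what this layering buys you: a library crate that opts out of std but still gets heap-backed collections from alloc (a final binary would additionally need a global allocator and a panic handler from its environment):

// no_std library using only core + alloc.
#![no_std]
extern crate alloc;

use alloc::vec::Vec;

/// Collects the first `n` squares into a heap-allocated Vec.
pub fn collect_squares(n: u32) -> Vec<u32> {
    (0..n).map(|x| x * x).collect()
}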
Standard Library (std)
The standard library is the complete, platform-integrated library available by default. It layers on top of alloc and core, adding:
- I/O and filesystem: fs, io, net for file and network operations
- Concurrency: thread, sync (Mutex, RwLock, Barrier, channels)
- Environment: env, process for system interaction
- Runtime: rt module managing program startup, panic handling, and cleanup
The runtime entry point is the #[lang = "start"] function in std::rt, which initializes the runtime, runs main(), and handles panics.
Procedural Macros (proc_macro)
The proc_macro library provides the API for procedural macro authors. It includes:
- TokenStream: Represents token sequences for macro input/output
- Span: Source code location information
- Diagnostic: Error and warning reporting
- Bridge infrastructure for compiler communication
This crate can only be used from within a procedural macro context; proc_macro::is_available() checks whether the compiler's bridge infrastructure is present.
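A minimal function-like procedural macro looks like the sketch below; it must live in its own crate with proc-macro = true set in Cargo.toml, and the macro name noop here is purely illustrative:

use proc_macro::TokenStream;

// Invoked by the compiler through the proc_macro bridge; the input and
// output are opaque token streams.
#[proc_macro]
pub fn noop(input: TokenStream) -> TokenStream {
    input // echo the tokens back unchanged
}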
Test Framework (test)
The test library powers Rust's built-in testing infrastructure. It provides:
- Test harness and runner
- Benchmark support with statistical analysis
- Multiple output formatters (pretty, terse, JSON, JUnit)
- Test filtering and concurrency control
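The familiar #[test] attribute is the user-facing surface of this crate; cargo test (or ./x test for in-tree crates) compiles tests against the harness provided by library/test. A minimal example:

#[cfg(test)]
mod tests {
    // Collected and executed by the test harness; runs concurrently with
    // other tests unless limited via --test-threads.
    #[test]
    fn arithmetic_still_works() {
        assert_eq!(2 + 2, 4);
    }
}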
Layering and Dependencies
The layering enables flexibility: embedded systems use only core, no_std projects use core + alloc, and standard applications use the full std. Each layer is independently testable and maintains clear boundaries. The runtime (std::rt) bridges the gap between language semantics and OS-specific behavior, handling initialization, panic unwinding, and cleanup.
Build System & Bootstrap
Relevant Files
src/bootstrap/README.md, src/bootstrap/src/lib.rs, src/bootstrap/bootstrap.py, x.py, bootstrap.example.toml
Rust's build system, called bootstrap, orchestrates a sophisticated multi-stage build in which the compiler is built by an earlier version of itself. It solves the classic chicken-and-egg problem: you need a Rust compiler to build Rust, so bootstrap uses a pre-compiled stage0 compiler to build newer stages.
The Three-Stage Build Process
Stage 0 is a pre-compiled beta compiler downloaded from CI. It's used only to compile the bootstrap system itself and the stage1 compiler. The stage0 compiler always uses its own bundled standard library.
Stage 1 is built from the current source code using the stage0 compiler. It links against the stage0 standard library, making it a transitional compiler that understands the latest language features but uses older libraries.
Stage 2 is the truly current compiler, built using stage1 and linking against a freshly-built stage1 standard library. This is the compiler you get when you run ./x build.
Entry Points and Execution Flow
The build process starts with one of three entry point scripts:
- x (Unix/Linux shell script)
- x.ps1 (Windows PowerShell)
- x.py (Cross-platform Python script)
These scripts handle downloading stage0 binaries, then compile the bootstrap binary itself (written in Rust). Finally, they invoke the compiled bootstrap binary to orchestrate the actual build.
The bootstrap binary reads bootstrap.toml configuration, performs sanity checks, and executes build steps in dependency order. Most heavy lifting is delegated to Cargo, which handles incremental compilation and parallelization.
Build Directory Structure
Bootstrap organizes output under the build/ directory:
- build/cache/ – Downloaded stage0 compiler tarballs
- build/bootstrap/ – Bootstrap binary build artifacts
- build/x86_64-unknown-linux-gnu/ – Host-specific outputs:
  - stage0/ – Extracted stage0 compiler
  - stage0-sysroot/ – Temporary sysroot for stage0
  - stageN-std/, stageN-rustc/, stageN-tools/ – Cargo output directories
  - stageN/ – Final assembled compiler sysroots
Configuration and Customization
Bootstrap is configured via bootstrap.toml. Key options include:
- llvm.download-ci-llvm – Download pre-built LLVM or build from source
- build.verbose – Enable verbose output
- build.keep-stage – Skip rebuilding specific stages
- build.download-rustc – Use CI-built artifacts instead of local builds
Run ./x setup to generate a default configuration, or ./configure for interactive setup.
Key Concepts
Uplifting is the process of assembling a new compiler stage from artifacts produced by the previous stage. Hard links copy binaries from cargo output directories into the final stage directory.
Sysroot is the directory containing a compiler's standard library and other runtime artifacts. Stage0 uses a special stage0-sysroot to isolate its libraries from the ones being built.
Incremental builds are enabled by Cargo's dependency tracking. Bootstrap itself is incremental—only changed steps re-execute.
For detailed information on extending bootstrap or understanding its internals, see src/bootstrap/README.md and the Rust Compiler Development Guide.
Developer Tools
Relevant Files
src/tools/rustdoc/main.rs, src/tools/clippy/README.md, src/tools/rustfmt/README.md, src/tools/rust-analyzer/README.md, src/tools/miri/README.md
The Rust project includes a comprehensive suite of developer tools that enhance code quality, productivity, and safety. These tools are maintained as part of the core repository and distributed through rustup.
Core Tools Overview
Rustfmt is the official code formatter that enforces consistent style across Rust projects. It reads configuration from rustfmt.toml or .rustfmt.toml files and can be invoked via cargo fmt. The tool supports both stable and nightly-only configuration options, allowing teams to enforce style guidelines automatically in CI/CD pipelines.
Clippy is a linter providing over 800 lints organized into categories (correctness, suspicious, style, complexity, perf, pedantic, restriction, nursery, cargo). Each lint has a default level, and developers can configure behavior through clippy.toml files or command-line flags. It integrates seamlessly with cargo clippy and supports automatic fixes via cargo clippy --fix.
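Lint levels can also be tuned in source with tool attributes; a small illustration using two real Clippy lints:

// clippy::needless_return is a style-group lint; clippy::unwrap_used is an
// opt-in restriction-group lint.
#[allow(clippy::needless_return)]
pub fn answer() -> i32 {
    return 42; // would normally trigger needless_return
}

#[deny(clippy::unwrap_used)]
pub fn parse_port(s: &str) -> u16 {
    s.parse().unwrap_or(8080) // calling unwrap() here would now be a hard error under clippy
}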
Rust-analyzer is a language server implementing the Language Server Protocol (LSP), enabling IDE features across multiple editors (VS Code, Vim, Emacs, Zed). It provides code completion, go-to-definition, refactoring, inlay hints, and semantic syntax highlighting. Rust-analyzer integrates with rustfmt for formatting and clippy for diagnostics.
Miri is an undefined behavior detection tool that interprets Rust code to catch memory safety violations. It detects out-of-bounds accesses, use-after-free, uninitialized data, alignment violations, and data races. Developers run cargo miri test or cargo miri run to execute code through Miri's interpreter, which provides deterministic execution with optional host isolation.
Rustdoc generates documentation from Rust source code comments. It processes doc comments and produces HTML documentation, supporting markdown and code examples. The tool is essential for maintaining API documentation across the ecosystem.
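Doc comments can embed examples that double as tests (doctests), which rustdoc compiles and runs; a brief sketch, where my_crate stands in for the enclosing crate's name:

/// Adds one to `x`.
///
/// ```
/// // This fenced block is compiled and run by rustdoc as a doctest.
/// assert_eq!(my_crate::add_one(1), 2);
/// ```
pub fn add_one(x: i32) -> i32 {
    x + 1
}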
Integration and Workflow
These tools work together in a typical development workflow: developers write code, use rust-analyzer for IDE support, run clippy for linting, apply rustfmt for formatting, and use miri to verify unsafe code correctness. All tools are distributed as rustup components and can be installed via rustup component add <tool-name>.
Testing Infrastructure
Relevant Files
src/tools/compiletest/src/lib.rs, src/tools/tidy/Readme.md, src/tools/miropt-test-tools/src/lib.rs, tests/COMPILER_TESTS.md, src/bootstrap/src/core/build_steps/test.rs
The Rust project uses a sophisticated multi-layered testing infrastructure to ensure compiler correctness, performance, and stability. Tests are organized into distinct suites, each targeting different aspects of the compiler and standard library.
Test Modes and Suites
The primary testing framework is compiletest, which supports 16 different test modes:
- UI Tests (tests/ui, tests/ui-fulldeps, tests/rustdoc-ui) - Verify compiler error messages and diagnostics match expected output
- Codegen Tests (tests/codegen-llvm, tests/assembly-llvm) - Check generated LLVM IR and assembly code
- MIR Optimization Tests (tests/mir-opt) - Validate Mid-level Intermediate Representation transformations
- Incremental Tests (tests/incremental) - Ensure incremental compilation caching works correctly
- Debuginfo Tests (tests/debuginfo) - Verify debug information generation for GDB, LLDB, and CDB
- Coverage Tests (tests/coverage) - Test code coverage instrumentation and reporting
- Run-Make Tests (tests/run-make) - Execute complex build recipes driven by rmake.rs programs (formerly Makefiles)
- Rustdoc Tests (tests/rustdoc, tests/rustdoc-json, tests/rustdoc-js) - Validate documentation generation
Test Discovery and Execution
Compiletest performs recursive directory scanning to discover test files (.rs files not starting with ., #, or ~). The framework:
- Collects tests from the test suite directory, creating one test structure per revision
- Filters tests based on command-line arguments and git-tracked modifications
- Executes tests in parallel using rayon for performance
- Validates output by comparing actual results against expected .stderr and .stdout files
Tests can be marked as ignored using directives like //@ ignore-* and support multi-revision testing for testing across different compiler configurations.
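A small UI test might look like the sketch below; the expected diagnostic is recorded both with an inline //~ annotation and in a companion .stderr snapshot that --bless regenerates:

// tests/ui/mismatched-types-example.rs (illustrative file name)
//@ edition: 2021
fn main() {
    let x: u32 = "hello"; //~ ERROR mismatched types
}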
Tidy: Code Quality Enforcement
Tidy is a custom linter that runs automatically during ./x test and CI. It enforces:
- Style checks - Alphabetical ordering, file naming, line length, trailing whitespace
- Infrastructure checks - License compliance, dependency validation, documentation synchronization
- Testing checks - Test placement, debug artifact cleanup, revision consistency
Tidy directives (e.g., // ignore-tidy-linelength) allow selective suppression of checks.
Ecosystem and Integration Testing
Beyond compiler tests, the project includes:
- Package tests - Standard #[test] unit tests in library/ and compiler/ crates
- Cargotest - Runs cargo test on real-world projects (servo, ripgrep, tokei) to catch regressions
- Crater - Large-scale ecosystem testing on thousands of public projects (separate infrastructure)
Running Tests
Execute tests with ./x test followed by a path or mode:
./x test tests/ui # Run all UI tests
./x test tests/mir-opt --bless # Update MIR test expectations
./x test tidy # Run code quality checks
./x test tests/ui --pass check # Force check-pass mode
The --bless flag updates expected output files, useful after intentional compiler changes.
MIR & Optimization Passes
Relevant Files
compiler/rustc_mir_build/src/lib.rs, compiler/rustc_mir_transform/src/lib.rs, compiler/rustc_mir_dataflow/src/lib.rs, compiler/rustc_const_eval/src/lib.rs, compiler/rustc_monomorphize/src/lib.rs
MIR (Mid-level Intermediate Representation) is rustc's primary intermediate form for code analysis and optimization. It sits between the high-level HIR (after type checking) and low-level LLVM IR, providing a control-flow graph representation where all expressions are flattened and types are fully explicit.
MIR Construction & Pipeline
MIR is built from THIR (Typed High-level IR) via the mir_built query in rustc_mir_build. The construction process converts HIR into a control-flow graph of basic blocks, where each block contains statements ending with a terminator (branch, call, return, etc.). This flattened structure makes it ideal for data-flow analysis and optimization.
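As a rough illustration, here is a tiny function together with a hand-simplified schematic of the MIR it lowers to (real --emit=mir output is more verbose, and its exact shape varies by compiler version):

fn max2(a: i32, b: i32) -> i32 {
    if a > b { a } else { b }
}

// Schematic MIR (simplified by hand):
//
// fn max2(_1: i32, _2: i32) -> i32 {
//     bb0: {
//         _3 = Gt(copy _1, copy _2);                       // statement
//         switchInt(move _3) -> [0: bb2, otherwise: bb1];  // terminator
//     }
//     bb1: { _0 = copy _1; goto -> bb3; }
//     bb2: { _0 = copy _2; goto -> bb3; }
//     bb3: { return; }
// }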
The MIR pipeline consists of five major query stages:
- mir_built – Initial MIR after construction from THIR
- mir_const – Applies simple passes for const qualification
- mir_promoted – Extracts promotable temporaries into separate bodies
- mir_drops_elaborated_and_const_checked – Runs borrow checking and major transformations (drop elaboration, etc.)
- optimized_mir – Final MIR after all enabled optimizations
Optimization Passes
The rustc_mir_transform crate implements 50+ optimization passes organized by phase. Key passes include:
- Inlining – Inlines function calls based on heuristics and #[inline] attributes
- Copy Propagation – Eliminates redundant copies of values
- Dead Store Elimination – Removes assignments to unused variables
- Constant Propagation – Evaluates constant expressions at compile time
- GVN (Global Value Numbering) – Eliminates redundant computations
- Jump Threading – Simplifies control flow by threading jumps
- Destination Propagation – Optimizes move patterns by reusing storage
Passes are conditionally enabled based on optimization level (-O, -C opt-level). Some passes, such as CheckAlignment and CheckNull, are not optimizations at all: they insert runtime checks that catch undefined behavior (misaligned or null pointer dereferences) at the point where it would occur.
Data-Flow Analysis
The rustc_mir_dataflow crate provides a framework for computing data-flow facts across the MIR control-flow graph. It supports both forward and backward analyses using a lattice-based approach:
- Forward analyses – Track facts flowing from predecessors (e.g., reachability)
- Backward analyses – Track facts flowing from successors (e.g., liveness)
- GenKill framework – Efficient representation for set-based analyses
Common analyses include move tracking, borrow checking, and liveness. Results are cached and can be queried at specific program points via ResultsCursor.
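The idea behind the framework can be shown with a self-contained miniature; this is deliberately not the rustc_mir_dataflow API, and the block/fact types below are made up for illustration. A forward gen/kill analysis iterates transfer functions to a fixed point over the control-flow graph:

use std::collections::BTreeSet;

type Fact = u32; // stand-in for "a definition", "a borrow", etc.

struct Block {
    gen_facts: BTreeSet<Fact>,  // facts this block introduces
    kill_facts: BTreeSet<Fact>, // facts this block invalidates
    succs: Vec<usize>,          // indices of successor blocks
}

/// Computes the set of facts reaching the entry of each block.
fn forward_dataflow(blocks: &[Block]) -> Vec<BTreeSet<Fact>> {
    let mut entry: Vec<BTreeSet<Fact>> = vec![BTreeSet::new(); blocks.len()];
    let mut changed = true;
    while changed {
        changed = false;
        for (i, b) in blocks.iter().enumerate() {
            // Transfer function: out = (in - kill) ∪ gen.
            let mut out: BTreeSet<Fact> =
                entry[i].difference(&b.kill_facts).copied().collect();
            out.extend(&b.gen_facts);
            // Join (set union) into each successor's entry state.
            for &s in &b.succs {
                let before = entry[s].len();
                entry[s].extend(&out);
                changed |= entry[s].len() != before;
            }
        }
    }
    entry
}

fn main() {
    // bb0 generates fact 1 and flows into bb1, which kills it and generates 2.
    let blocks = vec![
        Block { gen_facts: BTreeSet::from([1]), kill_facts: BTreeSet::new(), succs: vec![1] },
        Block { gen_facts: BTreeSet::from([2]), kill_facts: BTreeSet::from([1]), succs: vec![] },
    ];
    let reaching = forward_dataflow(&blocks);
    assert!(reaching[1].contains(&1)); // fact 1 reaches bb1's entry
}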
Const Evaluation & Monomorphization
rustc_const_eval interprets MIR at compile time for const evaluation, const generics, and static initializers. It uses a virtual machine (InterpCx) that executes MIR instructions in a controlled environment.
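For example, any expression required in a const context is run through this interpreter at compile time; a small sketch:

// Evaluated entirely at compile time by the MIR interpreter.
const fn fib(n: u32) -> u64 {
    let mut a: u64 = 0;
    let mut b: u64 = 1;
    let mut i = 0;
    while i < n {
        let next = a + b;
        a = b;
        b = next;
        i += 1;
    }
    a
}

const TENTH: u64 = fib(10);                                 // const initializer
const TABLE: [u8; fib(5) as usize] = [0; fib(5) as usize];  // array lengths must be const

fn main() {
    assert_eq!(TENTH, 55);
    assert_eq!(TABLE.len(), 5);
}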
rustc_monomorphize collects all concrete instantiations of generic functions and types needed for code generation, walking the MIR to discover transitive dependencies.
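As a rough illustration of what the collector sees, each distinct set of type arguments used at a call site becomes its own mono item for codegen:

// One generic function in the source...
fn identity<T>(x: T) -> T {
    x
}

fn main() {
    // ...but two concrete instantiations reach codegen:
    // identity::<i32> and identity::<&str>.
    let a = identity(5_i32);
    let b = identity("five");
    println!("{a} {b}");
}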
MIR Phases & Validation
Each MIR body has a phase field tracking its transformation stage. The pass manager validates MIR after each pass (when -Z validate-mir is enabled) to catch bugs. Phases prevent invalid transformations—for example, certain optimizations only apply after borrow checking completes.
The Steal mechanism optimizes memory by allowing downstream queries to take ownership of intermediate MIR bodies rather than cloning them, but requires careful dependency ordering to avoid use-after-steal panics.
Type System & Trait Resolution
Relevant Files
compiler/rustc_hir_typeck/src/lib.rs, compiler/rustc_trait_selection/src/lib.rs, compiler/rustc_infer/src/lib.rs, compiler/rustc_type_ir/src/lib.rs, compiler/rustc_next_trait_solver/src/lib.rs
Overview
Rust's type system is enforced through a multi-phase process: type inference, trait resolution, and type checking. These systems work together to verify type safety and coherence while automatically inferring types where possible.
The compiler uses an inference context (InferCtxt) to manage type variables and constraints during compilation. Two trait solvers coexist: the legacy solver (still default) and the next-generation solver (enabled with -Znext-solver), which is more modular and better suited for future improvements.
Type Inference Engine
The type inference engine lives in rustc_infer and handles low-level equality and subtyping operations. It maintains:
- Type variables (TyVid) representing unknown types to be resolved
- Region variables for lifetime constraints
- Unification logic to equate types and propagate constraints
- Variance tracking (covariant, contravariant, invariant) for type relationships
The InferCtxt is the central context that coordinates inference across a function or item. It stores:
pub struct InferCtxt<'tcx> {
    tcx: TyCtxt<'tcx>,
    typing_mode: TypingMode<'tcx>,
    universe: UniverseIndex,
    // ... type and region variables
}
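What the inference machinery resolves is visible in ordinary code: a fresh type variable stands in for an unknown type until later constraints pin it down.

fn main() {
    // `v` starts as Vec<?T>, where ?T is a fresh type variable.
    let mut v = Vec::new();
    // This call adds the constraint ?T = String, which unification resolves.
    v.push(String::from("hello"));
    // Writeback records Vec<String> as the final type of `v`.
    assert_eq!(v.len(), 1);
}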
Trait Resolution
Trait resolution pairs trait references with concrete implementations. The process involves:
- Obligation collection - gathering trait bounds that must be satisfied
- Candidate assembly - finding potential impl blocks that could satisfy each obligation
- Confirmation - verifying the candidate works and applying any constraints
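In user-facing terms, each generic call site produces obligations like the one sketched here (HasLen and print_len are illustrative names), which candidate assembly then matches against the available impls:

trait HasLen {
    fn length(&self) -> usize;
}

impl<T> HasLen for Vec<T> {
    fn length(&self) -> usize {
        self.len()
    }
}

fn print_len<L: HasLen>(value: L) {
    println!("{}", value.length());
}

fn main() {
    // Obligation collected here: `Vec<i32>: HasLen`.
    // Candidate assembly finds `impl<T> HasLen for Vec<T>`; confirmation
    // unifies T = i32 and the obligation is satisfied.
    print_len(vec![1, 2, 3]);
}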
The rustc_trait_selection crate provides two implementations:
Legacy Solver (SelectionContext):
- Depth-first search through candidate impls
- Handles auto traits, built-in traits, and user-defined impls
- Used by default; stable but less flexible
Next-Generation Solver (rustc_next_trait_solver):
- Goal-oriented evaluation with a search graph
- Better handling of complex trait interactions
- Modular design supporting future enhancements
- Enabled per-context or globally with -Znext-solver
Type Checking Pipeline
Type checking in rustc_hir_typeck operates in phases:
- Collect phase - determine types of all items without examining their bodies
- Check phase - type-check function bodies, expressions, and patterns
- Writeback phase - record inferred types back to the HIR
The FnCtxt (function context) manages type checking within a function:
pub struct FnCtxt<'a, 'tcx> {
    infcx: &'a InferCtxt<'tcx>,
    param_env: ParamEnv<'tcx>,
    // ... local type information
}
Type Representation
rustc_type_ir provides the core type system abstraction:
- Ty<'tcx> - the main type representation (interned for efficiency)
- TyKind - enum of concrete type variants (int, struct, function, etc.)
- Predicate - trait bounds and other constraints
- Region - lifetime information
- Binder - higher-ranked types (e.g., for<'a> fn(&'a T))
Types are interned (deduplicated) to reduce memory and speed up comparisons.
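Compiler code typically inspects types by matching on the interned TyKind; a simplified internal sketch (unstable API, nightly-only, and the exact variant shapes change between versions):

// Requires #![feature(rustc_private)] and the nightly rustc_middle crate.
use rustc_middle::ty::{self, Ty};

/// Returns true for `()` and for integer types.
fn is_unit_or_integer<'tcx>(ty: Ty<'tcx>) -> bool {
    match ty.kind() {
        ty::TyKind::Tuple(fields) => fields.is_empty(), // the unit type `()`
        ty::TyKind::Int(_) | ty::TyKind::Uint(_) => true,
        _ => false,
    }
}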
Key Concepts
Universes: Represent scopes for type variables. Higher universes can reference lower ones, but not vice versa. This prevents unsoundness in higher-ranked trait bounds.
Variance: Determines how type parameters relate in subtyping:
- Covariant - T <: U implies Container<T> <: Container<U>
- Contravariant - T <: U implies Container<U> <: Container<T>
- Invariant - no subtyping relationship
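Variance is what makes everyday lifetime coercions work; for example, shared references are covariant in their lifetime while mutable references are invariant in their referent, as the sketch below shows:

// Covariance: a longer-lived &str can be used where a shorter one is expected.
fn shorten<'short, 'long: 'short>(s: &'long str) -> &'short str {
    s
}

fn main() {
    let owned = String::from("hello");
    let shortened: &str = shorten(&owned);
    println!("{shortened}");
    // By contrast, &mut Vec<&'long str> cannot be treated as
    // &mut Vec<&'short str>: &mut T is invariant in T.
}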
Canonical Forms: Trait goals are canonicalized (variables renamed) to enable caching and avoid redundant work.
Data Flow
The type system is designed to be sound (well-typed safe code cannot invoke undefined behavior), but it is not complete: some valid programs are rejected or require explicit annotations to compile.