rust-docs-mcp-server
by Govcraft
MCP (Model Context Protocol) server that turns any Rust crate's current documentation into an LLM-queryable knowledge base. It downloads the crate's docs, embeds them with OpenAI, caches the result, and exposes a `query_rust_docs` tool over stdio.
query_rust_docs
Query documentation for the specific Rust crate the server was started for, using semantic search and LLM summarization.
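For illustration only, a client-side tools/call request over MCP's JSON-RPC framing might look like the line below; the argument name "question" is an assumption here, and the authoritative input schema is whatever the server reports via tools/list.
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"query_rust_docs","arguments":{"question":"How do I make a POST request with this crate?"}}}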
Installation
1. Prerequisites
• Rust 1.75+ with cargo (install via https://rustup.rs)
• A PostgreSQL (preferred) or SQLite database for storing embeddings
• An OpenAI-compatible API key exported as OPENAI_API_KEY
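For example, on Linux or macOS the toolchain can be checked and the key exported for the current shell session like this (the key value is a placeholder):
rustc --version                       # confirm Rust 1.75 or newer
export OPENAI_API_KEY="sk-your-key"   # placeholder, substitute your real key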
2. Clone the repository
git clone https://github.com/Govcraft/rust-docs-mcp-server.git
cd rust-docs-mcp-server
3. Build the binary
cargo build --release # generates target/release/rust-docs-mcp-server
4. Database preparation (PostgreSQL example)
createdb rust_docs_mcp
export DATABASE_URL=postgres://user:password@localhost/rust_docs_mcp
# The server will create the required tables on first launch.
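A quick way to confirm the connection string works before the first launch (this uses the standard psql client, which is assumed to be installed and is not required by the server itself):
psql "$DATABASE_URL" -c 'SELECT version();'   # should print the PostgreSQL version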
5. Run migrations & start the server
cargo run --release -- \
  --addr 0.0.0.0:8421 \
  --db-url "$DATABASE_URL" \
  --openai-api-key "$OPENAI_API_KEY"
6. (Optional) Systemd service snippet ([Service] section only)
[Service]
ExecStart=/opt/rust-docs-mcp-server/target/release/rust-docs-mcp-server --addr 0.0.0.0:8421 --db-url=${DATABASE_URL}
Environment=DATABASE_URL=postgres://user:password@localhost/rust_docs_mcp
Environment=OPENAI_API_KEY=<your_key>
Restart=always
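Assuming the snippet is completed into a full unit (with [Unit] and [Install] sections) and saved as /etc/systemd/system/rust-docs-mcp-server.service (the path is an example, not something the project mandates), it can be enabled with:
sudo systemctl daemon-reload
sudo systemctl enable --now rust-docs-mcp-server
sudo systemctl status rust-docs-mcp-server    # should report "active (running)"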
7. Verify
curl http://localhost:8421/healthz # returns {"status":"ok"}
License: MIT
Updated 7/30/2025