docker-mcp
by QuantGeekDev
MCP (Model Context Protocol) server that lets Claude interact with Docker: create standalone containers, deploy Docker Compose stacks, retrieve container logs, and list containers.
Tools
01 create-container – Creates a standalone Docker container
02 deploy-compose – Deploys a Docker Compose stack
03 get-logs – Retrieves logs from a specific container
04 list-containers – Lists all Docker containers
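These map roughly onto the Docker CLI operations below. The commands are illustrative equivalents only; the image name nginx:latest and container name my-nginx are placeholders, and the actual arguments each tool accepts are defined by the server.
# Rough CLI equivalents of the four tools (illustrative placeholders)
docker run -d --name my-nginx nginx:latest      # create-container
docker compose -f docker-compose.yml up -d      # deploy-compose
docker logs my-nginx                            # get-logs
docker ps -a                                    # list-containers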
Installation
1. Prerequisites
• Docker Engine 20.10+ (or a compatible Podman setup)
• (Optional) Docker Compose v2 if you prefer to use compose files
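Both can be checked quickly before proceeding:
docker --version           # expect Docker Engine 20.10 or newer
docker compose version     # expect Compose v2.x if you plan to use compose files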
2. Clone the repository
git clone https://github.com/QuantGeekDev/docker-mcp.git
cd docker-mcp
3. Build the image (if you want a local build)
docker build -t mcp-server:latest .
Or pull the pre-built image (if one is published):
docker pull quantgeekdev/mcp:latest
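Either way, you can confirm the image is present locally before starting it (use whichever name applies to your setup):
docker images mcp-server            # if you built locally
docker images quantgeekdev/mcp      # if you pulled the pre-built image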
4. Start the server
# Stand-alone container
docker run -d --name mcp \
  -p 8080:8080 \
  -e MCP_MODEL_PATH="/models/llama3" \
  -v /path/to/your/models:/models \
  mcp-server:latest

# With docker-compose (docker-compose.yml)
version: "3.9"
services:
  mcp:
    image: mcp-server:latest
    ports:
      - "8080:8080"
    environment:
      MCP_MODEL_PATH: "/models/llama3"
    volumes:
      - ./models:/models
Then run:
docker compose up -d
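The service name mcp comes from the compose file above, so the stack status can be checked with:
docker compose ps      # the mcp service should be listed as running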
5. Verify
curl http://localhost:8080/healthz # should return 200/OK
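If the health check fails, the container status and logs are the first places to look. The commands below assume the stand-alone container named mcp from step 4; for the compose stack, use docker compose logs mcp instead.
docker ps --filter name=mcp      # the container should show as Up
docker logs mcp                  # startup output and any errors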
6. Configuration
• MCP_MODEL_PATH – absolute path to a local GGUF/GGML model folder
• MCP_CONTEXT_SIZE – maximum tokens in context (default 4096)
• MCP_THREADS – number of inference threads (defaults to host CPU count)
• Set OPENAI_API_KEY if you want the server to proxy requests to OpenAI in fallback mode.
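As a sketch, these variables can be passed the same way MCP_MODEL_PATH is passed in step 4; the values below are placeholders, not recommended settings:
docker run -d --name mcp \
  -p 8080:8080 \
  -e MCP_MODEL_PATH="/models/llama3" \
  -e MCP_CONTEXT_SIZE=8192 \
  -e MCP_THREADS=8 \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \   # only needed for fallback mode
  -v /path/to/your/models:/models \
  mcp-server:latest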
7. Updating
docker pull quantgeekdev/mcp:latest
docker compose down && docker compose up -d
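If you built the image locally in step 3 instead of pulling it, rebuild before restarting the stack:
docker build -t mcp-server:latest .
docker compose down && docker compose up -d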
License: MIT
Updated 7/30/2025