
MCP-LLM Bridge
Author: patruff
Description: A TypeScript bridge that connects local LLMs running via Ollama to Model Context Protocol (MCP) servers, translating LLM tool-call output into MCP JSON-RPC so local/open models can use Claude-like tools (filesystem, web search, GitHub, Gmail/Drive, memory, image generation).
Stars: 970
Forks: 113
License: MIT License
Category: Open Source
Installation
ollama pull qwen2.5-coder:7b-instruct
npm install -g @modelcontextprotocol/server-filesystem
npm install -g @modelcontextprotocol/server-brave-search
npm install -g @modelcontextprotocol/server-github
npm install -g @modelcontextprotocol/server-memory
npm install -g @patruff/server-flux
npm install -g @patruff/server-gmail-drive
node path/to/gmail-drive/index.js auth
npm run start
list-tools
Show the available tools currently registered/accessible through the bridge (from connected MCP servers).
quit
Exit the bridge program/CLI.
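The project description says the bridge translates an LLM's tool-call output into MCP JSON-RPC. A minimal sketch of that translation step, assuming Ollama's chat-API tool-call shape and MCP's standard `tools/call` method (the interface and function names here are illustrative, not the bridge's actual code):

```typescript
// Shape of a tool call as returned by Ollama's chat API.
interface OllamaToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

// Wrap the tool call in an MCP JSON-RPC 2.0 "tools/call" request,
// ready to be written to a connected MCP server's stdin.
function toMcpRequest(call: OllamaToolCall, id: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name: call.function.name, arguments: call.function.arguments },
  };
}

const req = toMcpRequest(
  { function: { name: "read_file", arguments: { path: "notes.txt" } } },
  1,
);
console.log(JSON.stringify(req));
```

The reverse direction (an MCP result fed back to the model as a tool-role message) follows the same pattern; the bridge's job is mostly this bidirectional reshaping plus routing the request to the right server process.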
FAQs
What is the difference between MCP-LLM Bridge and the jonigl ollama-mcp-bridge, and which one should I use?
The key difference is architecture: patruff's bridge wraps Ollama's native API in a standalone agent loop, while jonigl's acts as a transparent proxy that drops into existing Ollama API workflows. Choose jonigl if you need better documentation and active maintenance for production use, or patruff if you want to study multi-server routing architecture as a learning exercise before forking.
How do I connect multiple MCP servers to a single Ollama model using MCP-LLM Bridge?
Edit the bridge_config.json file to add multiple server entries under the mcpServers object, each with its own command, args, and service-specific parameters. Each server runs as a separate process that the bridge's Tool Router component coordinates simultaneously, allowing the single Ollama model instance to select from all available tools across servers based on the user's request.
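A bridge_config.json with two servers might look roughly like the sketch below (field names follow the common MCP server-config convention of `command`/`args`/`env`; the exact schema and the API-key variable names should be checked against the project's own example config):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "mcp-server-filesystem",
      "args": ["/home/user/allowed-dir"]
    },
    "brave-search": {
      "command": "mcp-server-brave-search",
      "args": [],
      "env": { "BRAVE_API_KEY": "your-key-here" }
    }
  }
}
```

Each entry spawns one child process, and the model's tool selection is resolved against the union of tools advertised by all of them.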
What are the most common issues when setting up MCP-LLM Bridge on Windows?
Windows path handling remains incomplete (tracked in issue #19), causing configuration errors when specifying MCP server locations. The bridge expects Unix-style forward slashes in bridge_config.json paths, so Windows users must either use forward slashes throughout or properly escape backslashes. The absolute-path requirement compounds this, because Windows drive letters such as C: can confuse the parser without careful formatting.
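Both workarounds produce the same path once JSON-parsed; the failure mode is a bare backslash, since a sequence like `\U` is not a valid JSON escape and makes the whole config file unparseable. A sketch with illustrative paths (not taken from the project's docs):

```json
{
  "mcpServers": {
    "filesystem-forward-slashes": {
      "command": "node",
      "args": ["C:/Users/me/servers/filesystem/index.js"]
    },
    "filesystem-escaped-backslashes": {
      "command": "node",
      "args": ["C:\\Users\\me\\servers\\filesystem\\index.js"]
    }
  }
}
```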
Which open-source LLM models besides Qwen 2.5 7B work reliably with MCP tool calling through the bridge?
The project documents only Qwen 2.5 7B as the recommended model. Larger models such as Llama 3.1 70B, Mistral Large, or DeepSeek Coder 33B should in principle handle structured output better, but there is no published testing evidence for them with this bridge specifically. Smaller models consistently fail JSON parsing, per documented issues.
How does MCP-LLM Bridge compare to LangChain's MCP integration for local agent workflows?
LangChain's MCP integration offers institutional backing, active maintenance, and seamless integration with existing LangChain workflows through MultiServerMCPClient, making it production-ready. The patruff bridge provides a simpler standalone agent loop ideal for understanding MCP client-server patterns without framework dependencies. Choose LangChain for teams already invested in its ecosystem; choose patruff's bridge for lightweight experiments or learning the protocol mechanics directly.
What are the common challenges when setting up an MCP server?
Common challenges include dependency version conflicts, permission errors on filesystem paths, environment variable misconfiguration for API keys, and tooling differences between operating systems. Path resolution issues with relative versus absolute paths, JSON-RPC protocol translation bugs, and lack of standardized error messages from individual servers further complicate debugging during initial setup.