
MCP-LLM Bridge

Author: patruff

Description: A TypeScript bridge that connects local LLMs running via Ollama to Model Context Protocol (MCP) servers, translating LLM tool-call output into MCP JSON-RPC so local/open models can use Claude-like tools (filesystem, web search, GitHub, Gmail/Drive, memory, image generation).

Stars: 970

Forks: 113

License: MIT License

Category: Open Source


Installation

## Setup (from README)
1. Install the Ollama model:
   ```shell
   ollama pull qwen2.5-coder:7b-instruct
   ```
2. Install the MCP servers (global installs):
   ```shell
   npm install -g @modelcontextprotocol/server-filesystem
   npm install -g @modelcontextprotocol/server-brave-search
   npm install -g @modelcontextprotocol/server-github
   npm install -g @modelcontextprotocol/server-memory
   npm install -g @patruff/server-flux
   npm install -g @patruff/server-gmail-drive
   ```
3. Configure credentials:
   - Set `BRAVE_API_KEY` for Brave Search
   - Set `GITHUB_PERSONAL_ACCESS_TOKEN` for GitHub
   - Set `REPLICATE_API_TOKEN` for Flux
   - Run the Gmail/Drive MCP auth flow:
     ```shell
     node path/to/gmail-drive/index.js auth
     ```
     Example path shown in the README: `node C:\Users\patru\AppData\Roaming\npm\node_modules\@patruff\server-gmail-drive\dist\index.js auth`
4. Configure the bridge via `bridge_config.json`:
   - Define MCP server commands, args, and settings
   - Define LLM settings (model, baseUrl, etc.)
5. Start the bridge:
   ```shell
   npm run start
   ```
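Step 4's `bridge_config.json` pairs the MCP server definitions (command, args) with the LLM settings (model, baseUrl). A minimal sketch, assuming one filesystem server; the server name, allowed directory, and exact field names are illustrative, so check the repository's sample config for the authoritative schema:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "/home/user/workspace"]
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct",
    "baseUrl": "http://localhost:11434"
  }
}
```

The `baseUrl` shown is Ollama's default local endpoint (port 11434).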

Commands

1. `list-tools`: Show the tools currently registered and accessible through the bridge (from connected MCP servers).

2. `quit`: Exit the bridge CLI.

FAQs

What is the difference between MCP-LLM Bridge and the jonigl ollama-mcp-bridge, and which one should I use?

The key difference is architecture: patruff's bridge wraps Ollama's native API in a standalone agent loop, while jonigl's acts as a transparent proxy that drops into existing Ollama API workflows. Choose jonigl if you need better documentation and active maintenance for production use, or patruff if you want to study multi-server routing architecture as a learning exercise before forking.

How do I connect multiple MCP servers to a single Ollama model using MCP-LLM Bridge?

Edit the bridge_config.json file to add multiple server entries under the mcpServers object, each with its own command, args, and service-specific parameters. Each server runs as a separate process that the bridge's Tool Router component coordinates simultaneously, allowing the single Ollama model instance to select from all available tools across servers based on the user's request.
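Concretely, the multi-server setup described above amounts to additional keys under `mcpServers`. A hedged sketch, where the server names and args are illustrative and the `env` key is an assumption borrowed from typical MCP client configs rather than confirmed from this project's schema:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "/home/user/workspace"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "YOUR_KEY_HERE" }
    },
    "github": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN_HERE" }
    }
  }
}
```

Each entry becomes a separate child process; the bridge's Tool Router then exposes the union of their tools to the model.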

What are the most common issues when setting up MCP-LLM Bridge on Windows?

Windows path handling remains incomplete according to issue #19, causing configuration errors when specifying MCP server locations. The bridge expects Unix-style forward slashes in `bridge_config.json` paths, so Windows users must either use forward slashes throughout or escape backslashes properly. The absolute-path requirement compounds this, because Windows drive letters like `C:` can confuse the parser without careful formatting.
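In practice, either of these two spellings of a Windows path is valid JSON for `bridge_config.json`, while a single-backslash path is not (the path itself is illustrative, not from the README):

```json
{
  "withForwardSlashes": "C:/Users/me/AppData/Roaming/npm/node_modules/@patruff/server-gmail-drive/dist/index.js",
  "withEscapedBackslashes": "C:\\Users\\me\\AppData\\Roaming\\npm\\node_modules\\@patruff\\server-gmail-drive\\dist\\index.js"
}
```

A bare `"C:\Users\..."` fails JSON parsing outright, since `\U` is not a valid JSON escape sequence.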

Which open-source LLM models besides Qwen 2.5 7B work reliably with MCP tool calling through the bridge?

The project's documentation covers only Qwen 2.5 7B as the recommended model. Larger models such as Llama 3.1 70B, Mistral Large, or DeepSeek Coder 33B should in theory handle structured output better, but there is no testing evidence for them with this bridge specifically. Smaller models consistently fail JSON parsing, per documented issues.

How does MCP-LLM Bridge compare to LangChain's MCP integration for local agent workflows?

LangChain's MCP integration offers institutional backing, active maintenance, and seamless integration with existing LangChain workflows through MultiServerMCPClient, making it production-ready. The patruff bridge provides a simpler standalone agent loop ideal for understanding MCP client-server patterns without framework dependencies. Choose LangChain for teams already invested in its ecosystem; choose patruff's bridge for lightweight experiments or learning the protocol mechanics directly.

What are the common challenges when setting up an MCP server?

Common challenges include dependency version conflicts, permission errors on filesystem paths, environment variable misconfiguration for API keys, and tooling differences between operating systems. Path resolution issues with relative versus absolute paths, JSON-RPC protocol translation bugs, and lack of standardized error messages from individual servers further complicate debugging during initial setup.
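Several of these failure modes, particularly missing API keys, can be caught before starting the bridge. A small bash pre-flight sketch: the variable names are the ones the README requires, and `check_env` is a hypothetical helper written for this example, not part of the bridge:

```shell
#!/usr/bin/env bash
# check_env: print any named environment variables that are unset or empty.
# Returns nonzero if at least one is missing.
check_env() {
  local v status=0
  for v in "$@"; do
    # ${!v} is bash indirect expansion: the value of the variable named by $v
    if [ -z "${!v}" ]; then
      echo "missing: $v"
      status=1
    fi
  done
  return $status
}

check_env BRAVE_API_KEY GITHUB_PERSONAL_ACCESS_TOKEN REPLICATE_API_TOKEN \
  || echo "set the variables above before 'npm run start'"
```

Running this before `npm run start` turns a vague JSON-RPC failure deep in a server process into an explicit message about which credential is absent.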

Updated 4/20/2025