
llm-context.py

by cyberchitta

CLI & MCP server that streams relevant code/text snippets from a project into an LLM session. Provides rule-based, gitignore-aware file selection, smart code outlining, and clipboard helpers.


Installation

1. Prerequisites
• Python ≥3.9
• Git (if installing from source)
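Both prerequisites can be checked from a shell before proceeding (standard commands, nothing project-specific):
python3 --version  # should report 3.9 or newer
git --version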
2. Install from PyPI (recommended)
pip install llm-context
# or, for users with multiple Python versions:
python3 -m pip install llm-context
If the package is not yet published on PyPI, install directly from GitHub:
pip install git+https://github.com/cyberchitta/llm-context.py.git
3. Verify installation
llm-context --help
4. Optional configuration
• Create/modify the config file at `~/.config/llm-context/config.toml` (XDG path)
• Set your preferred LLM endpoint and authentication token, e.g.:
[provider]
name = "openai"
api_key = "sk-..."
• Define rule-sets for different tasks (code-review, docs, etc.) in the same file; see the sketch below.
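As an illustration only, a rule-set entry might look like the following; the table and key names here are assumptions rather than the documented schema, so consult the project docs for the exact format:
# hypothetical rule-set entry; section and key names are illustrative
[rules.code-review]
include = ["src/**/*.py", "tests/**/*.py"]
exclude = ["**/__pycache__/**"]
outline = ["docs/**"]  # include these files as outlines rather than full text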
5. Run the MCP server (exposes a local HTTP endpoint by default)
llm-context serve --port 8848
If no MCP client connects, the CLI falls back to clipboard mode.
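A minimal end-to-end sketch, assuming the commands above; the subcommand in the second command is hypothetical, so substitute whatever `llm-context --help` actually lists for clipboard mode:
# terminal 1: start the MCP server on its local HTTP endpoint
llm-context serve --port 8848
# terminal 2: hypothetical clipboard-mode invocation -- select files per a
# rule-set and copy the assembled context for pasting into an LLM chat
llm-context copy --rule code-review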


License: Apache License 2.0
Updated 7/30/2025
