vibe-check-mcp-server

by PV-Bhat

Metacognitive MCP server that injects Critical Path Interrupts (CPI) to keep autonomous LLM agents aligned, reflective and safe.

Tools

1. vibe_check: Challenge assumptions and prevent tunnel vision
2. vibe_learn: Capture mistakes, preferences and successes
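
To see how an agent-side client drives these tools, here is a minimal sketch using the official MCP TypeScript SDK over stdio. The build path (build/index.js) and the vibe_check argument names are illustrative assumptions, not the server's confirmed schema; use tools/list (shown below) to discover what the server actually exposes.

// sketch.mts: minimal MCP client that calls vibe_check on this server.
// Assumptions: the server was built with `npm run build` to build/index.js,
// and the API key variables from .env are present in this process's environment.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the built server as a child process and speak MCP over stdio.
const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"],
});

const client = new Client({ name: "example-agent", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// Discover the tool names and input schemas the server actually exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // expected to include vibe_check and vibe_learn

// Ask vibe_check to interrupt and challenge the current plan.
// The argument names below are placeholders for illustration only.
const result = await client.callTool({
  name: "vibe_check",
  arguments: {
    goal: "Migrate the billing service to the new API",
    plan: "Rewrite everything in one pass and deploy on Friday",
  },
});
console.log(result);

await client.close();

Run it with a TypeScript runner such as npx tsx sketch.mts after installing @modelcontextprotocol/sdk in the client project.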

Installation

1. Prerequisites:
• Node.js ≥ 18 and npm (or pnpm/yarn)
• Git
• An API key that can reach LearnLM-2.0-Flash (or an OpenAI-compatible endpoint)
2. Clone the repository
git clone https://github.com/PV-Bhat/vibe-check-mcp-server.git
cd vibe-check-mcp-server
3. Install dependencies
npm install # or: pnpm i | yarn install
4. Configure environment
• Copy the sample environment file and edit it
cp .env.example .env
• Fill in required variables (example)
OPENAI_API_KEY=your_learnlm_or_openai_key
MODEL_NAME=learnlm-2.0-flash
PORT=3000
5. Build (for production)
npm run build
6. Start the MCP server
• Development (hot-reload)
npm run dev
• Production
npm start
7. Verify
The server should be listening on http://localhost:3000 (or the PORT you set). Use the health-check route, e.g. `GET /healthz`, to confirm it is up.
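
For a scriptable version of that check, here is a small TypeScript sketch. It assumes the `/healthz` route responds with HTTP 200 when the server is healthy, as described in step 7, and that you are on Node 18 or newer so the global fetch API is available.

// healthcheck.mts: exits non-zero when the server is unreachable or unhealthy.
const port = process.env.PORT ?? "3000"; // mirror the PORT value from .env
try {
  const res = await fetch(`http://localhost:${port}/healthz`);
  if (!res.ok) {
    console.error(`Health check failed: HTTP ${res.status}`);
    process.exit(1);
  }
  console.log("vibe-check-mcp-server is up");
} catch (err) {
  console.error("Server not reachable:", err);
  process.exit(1);
}

A plain curl http://localhost:3000/healthz from the terminal gives the same information.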

Documentation

License: MIT
Updated 7/30/2025