
vercel-ai-docs-mcp

by IvanAmador

An MCP server that lets AI assistants (e.g., Claude Desktop, Cursor) semantically search and query the Vercel AI SDK documentation using Google Gemini, FAISS, and LangChain.

Open Source

01

agent-query

Query the Vercel AI SDK documentation using an AI agent that searches and synthesizes information
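Conceptually, agent-query retrieves the most relevant documentation chunks and then synthesizes an answer from them. A minimal pure-Python sketch of that retrieve-then-synthesize pattern (the real server uses Gemini via LangChain for both steps; `DOCS`, `score`, and the join-based "synthesis" below are illustrative stand-ins, not the server's actual code):

```python
# Toy sketch of the retrieve-then-synthesize pattern behind agent-query.
# Everything here is illustrative; the real tool uses Gemini + LangChain.

DOCS = [
    "streamText streams model output token by token.",
    "generateObject returns structured JSON from a model.",
    "useChat is a React hook for chat UIs.",
]

def score(query: str, doc: str) -> int:
    """Crude lexical-overlap score (stand-in for embedding similarity)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def agent_query(query: str, top_k: int = 2) -> str:
    # 1. Retrieve: rank chunks by relevance and keep the best matches.
    ranked = sorted(DOCS, key=lambda d: score(query, d), reverse=True)
    ranked = [d for d in ranked if score(query, d) > 0][:top_k]
    # 2. Synthesize: a real agent would prompt an LLM with these chunks.
    return " ".join(ranked) if ranked else "No relevant documentation found."

print(agent_query("how do I streamText output"))
```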

02

direct-query

Perform a direct similarity search against the Vercel AI SDK documentation index
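direct-query skips the agent and performs a raw nearest-neighbor lookup against the index. The core operation is vector similarity search, sketched here in pure Python with hand-made three-dimensional "embeddings" (a stand-in for real embedding vectors and the FAISS index):

```python
import math

# Tiny stand-in "index": doc text -> embedding vector.
# Real embeddings come from an embedding model and live in a FAISS index.
INDEX = {
    "streamText streams tokens":   [0.9, 0.1, 0.0],
    "generateObject returns JSON": [0.1, 0.9, 0.0],
    "useChat builds chat UIs":     [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def direct_query(query_vec, k=1):
    """Return the k index entries most similar to the query vector."""
    ranked = sorted(INDEX.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

print(direct_query([1.0, 0.0, 0.1]))  # nearest neighbor of the query vector
```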

03

clear-memory

Clear the conversation memory for a specific session or for all sessions if no session ID is supplied
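The clear-memory contract (wipe one session when an ID is given, wipe everything otherwise) can be sketched as a dict-backed store. This is an illustrative sketch of the semantics, not the server's actual memory implementation:

```python
class SessionMemory:
    """Toy per-session conversation store illustrating clear-memory semantics."""

    def __init__(self):
        self._sessions = {}  # session_id -> list of messages

    def append(self, session_id, message):
        self._sessions.setdefault(session_id, []).append(message)

    def clear(self, session_id=None):
        # With an ID: drop only that session. Without: drop all sessions.
        if session_id is None:
            self._sessions.clear()
        else:
            self._sessions.pop(session_id, None)

memory = SessionMemory()
memory.append("a", "hi")
memory.append("b", "hello")
memory.clear("a")   # only session "a" is gone
memory.clear()      # everything is gone
```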

Installation

## Prerequisites
• Node.js ≥ 18
• npm, pnpm, or yarn package manager
• (Optional) Vercel CLI if you plan to deploy on Vercel
## Clone the repository
git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
cd vercel-ai-docs-mcp
## Install dependencies (choose one)
npm install
# or: pnpm install
# or: yarn
## Build TypeScript → JavaScript
npm run build   # compiles TypeScript (see the build script in package.json)
## Configure environment variables (create .env)
OPENAI_API_KEY=<your-openai-key>
# Note: this server is described as using Google Gemini, so a Google API key may
# be required instead; check the repository README for the exact variable names.
# If using a vector DB (e.g., Pinecone, Supabase, Postgres + pgvector), add its credentials:
VECTOR_DB_URL=<db-url>
VECTOR_DB_API_KEY=<db-key>
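A startup check that fails fast on missing configuration saves debugging time later. A minimal Python sketch, assuming the variable names from the sample .env above (the actual names are defined by the repository, and a Google key may be required instead of an OpenAI one):

```python
import os

# Illustrative startup check. REQUIRED/OPTIONAL mirror the sample .env above;
# the real variable names are defined by the repository, not by this sketch.
REQUIRED = ["OPENAI_API_KEY"]
OPTIONAL = ["VECTOR_DB_URL", "VECTOR_DB_API_KEY"]

def check_env(environ=os.environ):
    """Raise early if required configuration is missing."""
    missing = [name for name in REQUIRED if not environ.get(name)]
    if missing:
        raise RuntimeError("Missing required env vars: " + ", ".join(missing))
    return {name: environ.get(name) for name in REQUIRED + OPTIONAL}
```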
## Start the MCP server locally
npm start
# or: npm run dev   (hot reload)
By default the server listens on http://localhost:3000 (check the scripts section of package.json).
## Deploy (examples)
# Vercel
vercel --prod
# Docker
docker build -t vercel-ai-docs-mcp .
docker run -p 3000:3000 vercel-ai-docs-mcp
## Verify
GET http://localhost:3000/health
# or
GET http://localhost:3000/.well-known/ai-plugin.json
The service should now be ready to accept MCP requests.
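Once running, the server can be registered with an MCP client such as Claude Desktop. As a hedged example, a `claude_desktop_config.json` entry for a stdio-based build might look like the fragment below; the `node dist/index.js` entry point and path are assumptions, so confirm the actual launch command in the repository README:

```json
{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",
      "args": ["/path/to/vercel-ai-docs-mcp/dist/index.js"]
    }
  }
}
```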

Documentation

License: Unknown (no license file or metadata found)
Updated 7/15/2025