
vercel-ai-docs-mcp

by IvanAmador

MCP server that lets AI assistants (e.g., Claude Desktop, Cursor) semantically search and query the Vercel AI SDK documentation using Google Gemini, FAISS and LangChain.

Open Source

Tools

1. agent-query: Query the Vercel AI SDK documentation using an AI agent that searches and synthesizes information.

2. direct-query: Perform a direct similarity search against the Vercel AI SDK documentation index.

3. clear-memory: Clear the conversation memory for a specific session, or for all sessions if no session ID is supplied.
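
The server publishes the exact input schema for each tool, which a client can read from the MCP tools/list response. The sketch below only illustrates what calls to the three tools might look like; the query and sessionId argument names are assumptions, not the server's published schemas.

```typescript
// Illustrative tools/call request bodies for the three tools.
// The argument names (query, sessionId) are assumptions; read the real
// input schemas from the server's tools/list response.
const exampleCalls = [
  {
    method: "tools/call",
    params: {
      name: "agent-query",
      arguments: { query: "How do I stream text with streamText?", sessionId: "demo" },
    },
  },
  {
    method: "tools/call",
    params: {
      name: "direct-query",
      arguments: { query: "generateText options" },
    },
  },
  {
    method: "tools/call",
    params: {
      name: "clear-memory",
      arguments: { sessionId: "demo" }, // omit sessionId to clear every session
    },
  },
];
```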

Installation

1. Prerequisites
• Node.js ≥ 18
• npm, pnpm, or yarn package manager
• (Optional) Vercel CLI if you plan to deploy on Vercel
2. Clone the repository
git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
cd vercel-ai-docs-mcp
3. Install dependencies (choose one)
npm install # or
pnpm install # or
yarn
4. Build TypeScript → JavaScript
npm run build # compiles the TypeScript sources to dist/
5. Configure environment variables (create .env)
GOOGLE_GENERATIVE_AI_API_KEY=<your-gemini-api-key>
# The documentation index is stored locally with FAISS, so no external vector-database credentials are needed
6. Start the MCP server locally
npm start # or: npm run dev for hot-reload
If the project ships a script for building the documentation index (check the scripts section of package.json), run it once beforehand so the FAISS index exists locally. The server speaks MCP over stdio, so it is normally launched by your MCP client rather than reached over HTTP.
7. Deploy (optional examples)
• Vercel: vercel --prod
• Docker: docker build -t vercel-ai-docs-mcp . && docker run vercel-ai-docs-mcp
Most setups simply run the server locally through the MCP client, so deployment is optional.
8. Verify
From a connected MCP client, call the direct-query tool with a simple question (a minimal client sketch follows these steps).
If a documentation snippet comes back, the service is ready to accept MCP requests.
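
For step 8, a small script can stand in for a full MCP client. The sketch below assumes the @modelcontextprotocol/sdk client package, that the built entry point is dist/main.js (as in the configuration example further down), that the tool accepts a query argument, and that the Gemini key from step 5 is available to the spawned process (for example via the .env file).

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launches the built server over stdio and runs one direct-query to confirm it answers.
// Assumes GOOGLE_GENERATIVE_AI_API_KEY is available to the server (e.g. via .env).
async function verify() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["path/to/vercel-ai-docs-mcp/dist/main.js"],
  });

  const client = new Client({ name: "install-check", version: "0.1.0" });
  await client.connect(transport);

  // The three tool names come from the listing above; argument names are assumptions.
  const { tools } = await client.listTools();
  console.log("tools:", tools.map((t) => t.name));

  const answer = await client.callTool({
    name: "direct-query",
    arguments: { query: "What does streamText return?" },
  });
  console.log(answer);

  await client.close();
}

verify().catch(console.error);
```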

Documentation

# Supercharge Your Vercel AI SDK Development with Augment Code

Transform Augment Code into your personal Vercel AI SDK documentation expert with this powerful MCP server that brings semantic search and AI-powered documentation querying directly into your coding workflow.

## Instant Documentation Access in Your IDE

When you're deep in development with the Vercel AI SDK, context switching to search documentation kills your flow. This MCP server integrates seamlessly with Augment Code, giving you instant access to comprehensive Vercel AI SDK documentation without leaving your editor. Ask Augment natural language questions like "How do I implement streaming with the generateText function?" or "What's the difference between streamText and generateText?" and get precise, contextual answers with code examples pulled directly from the official documentation.

The server provides three powerful tools that extend Augment's capabilities: `agent-query` for AI-powered documentation synthesis, `direct-query` for fast similarity searches, and `clear-memory` for session management. With session persistence, Augment can maintain context across multiple related questions, making it perfect for exploring complex implementation patterns or debugging specific SDK features.
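
If you drive the server from your own MCP client instead of Augment, a session-based exploration might look like the sketch below. The query and sessionId argument names are assumptions about the tool schemas; clear-memory is called at the end to drop the stored context.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Explore related questions within one session, then clear its memory.
// Argument names (query, sessionId) are assumptions, not the server's published schemas.
async function exploreStreaming(client: Client) {
  const sessionId = "vercel-ai-sdk-exploration";

  const first = await client.callTool({
    name: "agent-query",
    arguments: { query: "How do I implement streaming with the generateText function?", sessionId },
  });

  // Reusing the same sessionId lets the agent treat this as a follow-up question.
  const followUp = await client.callTool({
    name: "agent-query",
    arguments: { query: "What's the difference between streamText and generateText?", sessionId },
  });

  // Drop the stored conversation once the exploration is done.
  await client.callTool({ name: "clear-memory", arguments: { sessionId } });

  return { first, followUp };
}
```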

## Real-World Development Workflows

Picture this: you're building a chat application and need to implement function calling with the Vercel AI SDK. Instead of juggling browser tabs and documentation searches, simply ask Augment "Show me how to implement function calling with tool definitions" and get an instant, comprehensive answer with working code examples. The AI agent searches through the entire SDK documentation, synthesizes the information, and provides contextual guidance tailored to your specific use case.

This MCP server particularly shines when working on AI-powered features where you need to understand nuanced differences between SDK methods, configuration options, or integration patterns. Whether you're implementing streaming responses, setting up model providers, or handling complex prompt engineering scenarios, Augment Code becomes your knowledgeable pair programming partner with deep Vercel AI SDK expertise.

## Setup with Augment Code

Add this server to your Augment Code MCP configuration to unlock these capabilities:

```json
{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",
      "args": ["path/to/vercel-ai-docs-mcp/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}
```

The result? Augment Code becomes your AI-powered documentation assistant, capable of answering complex questions about the Vercel AI SDK with the same depth and accuracy as if you had a Vercel SDK expert sitting right next to you.
License: Unknown (no license file or metadata found)
Updated 7/15/2025
