
Nx Console

Author: nrwl

Description: Nx Console is the UI for Nx & Lerna (VS Code + JetBrains). It also ships an MCP server (for VS Code/Copilot and Cursor) to enrich AI chats with Nx workspace context (architecture, generators, up-to-date docs). The MCP server can be installed separately via the `nx-mcp` npm package (see `apps/nx-mcp/README.md`).

Stars: 1.4k

Forks: 239

License: MIT License

Category: Specialized


## Install Nx Console (Editor Extension)
You can download Nx Console from:
1. Visual Studio Code:
- Visual Studio Marketplace: [https://marketplace.visualstudio.com/items?itemName=nrwl.angular-console](https://marketplace.visualstudio.com/items?itemName=nrwl.angular-console)
- OpenVSX Registry: [https://open-vsx.org/extension/nrwl/angular-console](https://open-vsx.org/extension/nrwl/angular-console)
2. JetBrains:
- JetBrains Marketplace: [https://plugins.jetbrains.com/plugin/21060-nx-console](https://plugins.jetbrains.com/plugin/21060-nx-console)
## Requirements (from README)
- Use Nx Console inside an Nx or Lerna workspace.
- Node.js must be installed.
If you need a workspace:
1. Create a new Nx workspace:
   ```shell
   npx create-nx-workspace@latest my-workspace
   ```
2. Or install Nx into an existing repository:
   ```shell
   npx nx init
   ```
## JetBrains WSL note (from README)
If using JetBrains with WSL, configure **Languages & Frameworks > Node.js** to use the Node executable within the WSL distribution:
- [https://www.jetbrains.com/help/webstorm/how-to-use-wsl-development-environment-in-product.html#ws_wsl_node_interpreter_configure](https://www.jetbrains.com/help/webstorm/how-to-use-wsl-development-environment-in-product.html#ws_wsl_node_interpreter_configure)

FAQs

How do I configure nx-mcp to work with multiple MCP servers in a monorepo setup?

Configure nx-mcp alongside other MCP servers by adding separate entries in your client's MCP config file, ensuring each server has a unique name. Use the cwd property to scope servers to specific subdirectories, though note that some clients have limited support for this approach. For workspace-level nx-mcp options, create .nx/nx-mcp-config.json to control tool visibility without affecting other servers' configurations.
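As a rough sketch of the multi-server setup described above: the snippet below assumes a Cursor-style `mcpServers` client config, and the subdirectory paths (`./apps/backend`, `./apps/frontend`) are placeholders for your own workspace layout. Note that, as mentioned, `cwd` support varies by client, so verify it against your client's documentation.

```json
{
  "mcpServers": {
    "nx-backend": {
      "command": "npx",
      "args": ["nx-mcp@latest"],
      "cwd": "./apps/backend"
    },
    "nx-frontend": {
      "command": "npx",
      "args": ["nx-mcp@latest"],
      "cwd": "./apps/frontend"
    }
  }
}
```

Each entry gets a unique name so the client can route tool calls to the right server, and workspace-level options in `.nx/nx-mcp-config.json` apply per workspace without touching the other entries.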

What is the difference between minimal mode and full mode in nx-mcp, and how do I decide which one to use?

Minimal mode hides workspace-analysis tools that duplicate the functionality of Nx skills files, while full mode exposes the complete tool set. Use full mode with the --no-minimal flag if skills files aren't installed or your client doesn't support them, ensuring the assistant can access all workspace-analysis capabilities. Otherwise, use minimal mode to avoid tool duplication and conflicting responses.
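A minimal sketch of passing the `--no-minimal` flag mentioned above, assuming a Cursor-style `mcpServers` client config (the server entry name is arbitrary):

```json
{
  "mcpServers": {
    "nx-mcp": {
      "command": "npx",
      "args": ["nx-mcp@latest", "--no-minimal"]
    }
  }
}
```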

How does nx-mcp expose CI pipeline data from Nx Cloud to AI assistants, and what specific information is available?

When Nx Cloud is enabled, nx-mcp surfaces CI pipeline data as MCP resources. The system exposes recent pipeline executions, allowing assistants to inspect failed tasks and terminal output from CI runs. This lets AI analyze build failures, test errors, and deployment issues directly from the cloud execution environment without manually forwarding logs.

Can I use nx-mcp with AI coding assistants other than the officially supported ones like Copilot, Cursor, and Claude?

Yes. Any AI assistant that supports MCP stdio transport can connect to nx-mcp using a standard npx-based configuration. The only requirement is that the client implements the Model Context Protocol, not that it appears in the official documentation. Community users have successfully integrated undocumented clients, though manual JSON configuration and troubleshooting may be needed without first-party guides or guaranteed compatibility.

How do I fix the @parcel/watcher connection failure error when running nx-mcp on Windows?

Install Node.js directly at the system level rather than through a shell-level version manager like nvm-windows. The error typically occurs because MCP clients launched through GUI contexts cannot access Node.js installations scoped only to shell environments. After system-level installation, restart your editor to pick up the updated PATH.

What MCP tools does nx-mcp provide and how do they help AI assistants understand monorepo project dependencies?

The package provides workspace-aware tools including nx_workspace for project graphs, nx_project_details for configuration, nx_generators for code generators, and nx_generator_schema for parameter schemas. These transform AI assistants from file-level autocomplete into architecture-aware collaborators by exposing structural relationships, enabling accurate dependency tracing, generator command construction, and configuration guidance aligned to the installed Nx version.
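To illustrate what a call to one of these tools looks like on the wire, here is a sketch of an MCP `tools/call` JSON-RPC request for `nx_project_details`. The envelope follows the standard MCP shape; the exact argument name (`projectName` here) is an assumption for illustration and may differ in the actual tool schema, which a client can discover via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "nx_project_details",
    "arguments": { "projectName": "my-app" }
  }
}
```

In practice the MCP client constructs these requests automatically; the assistant only chooses the tool and arguments.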

Updated 3/17/2026