TL;DR
- Codex CLI is OpenAI's terminal-based coding agent, now at 67K stars, 9K forks, and 400 contributors on GitHub.
- Version 0.116.0, released March 19, 2026, brings ChatGPT device-code sign-in, a new user prompt hook, and smoother plugin setup.
- Recent development targets corporate engineering teams with custom CA certificate support, sandbox policies, and a Python SDK.

Introduction
OpenAI's Codex CLI, a terminal-based coding agent, has surged to 67K GitHub stars, 9K forks, and 400 contributors according to its GitHub repository, making it one of the most popular open-source AI developer tools right now. The project has already shipped 640 tagged releases, with the latest (v0.116.0) landing on March 19, 2026. Here's what developers working in terminal-heavy workflows should know about it.
What Happened
OpenAI open-sourced Codex CLI in April 2025 as a lightweight coding agent that runs locally on your machine. Since then, the project has grown fast: over 4,000 commits, 9K forks, and a Rust-based rewrite (codex-rs) that has made Rust the dominant language in the repo at 95.6% of the codebase.
The tool installs via npm (npm i -g @openai/codex), Homebrew (brew install --cask codex), or direct binary download for macOS and Linux. Users can authenticate with their ChatGPT plan (Plus, Pro, Team, Edu, or Enterprise) or use an API key.
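The install-and-start flow can be sketched as a short shell session. The `codex login` subcommand name is an assumption based on the device-code sign-in mentioned above; check `codex --help` for the exact commands:

```shell
# Install globally via npm (or: brew install --cask codex)
npm i -g @openai/codex

# Sign in with a ChatGPT plan (assumed subcommand; triggers the device-code flow),
# or export an API key instead:
codex login
# export OPENAI_API_KEY=sk-...

# Start an interactive session in the current repository
codex
```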
Recent development has focused on enterprise-grade features: custom CA certificate support for corporate proxies, an extensible hooks system, a CI-friendly sandbox with remote test workflows, and a Python SDK for programmatic access. Version 0.116.0 adds ChatGPT device-code sign-in, a new user prompt hook, smoother plugin setup, and improved real-time session behavior.
Key Features
- Terminal-native agent: Runs directly in your shell. No browser, no IDE plugin required. Just type codex and start prompting.
- Multiple auth paths: Sign in with your ChatGPT subscription or use an API key. ChatGPT plan users get access through their existing tier.
- Cross-platform binaries: Pre-built releases for macOS (ARM and x86) and Linux (x86 and ARM), with musl-linked Linux builds for maximum portability.
- Sandboxed execution: Built-in shell escalation controls and network proxy policies that block or allow outbound requests based on configurable allowlists.
- Enterprise proxy support: Custom CA certificates via environment variables such as SSL_CERT_FILE and related Codex-specific settings, covering HTTPS and WebSocket connections. Critical for teams behind TLS inspection.
- SDK and app-server: A Python SDK and app-server let you embed Codex in scripts, CI pipelines, or custom tooling beyond the interactive TUI.
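The sandbox and approval behavior described above is typically driven by a TOML config file. A minimal sketch follows; the file location (`~/.codex/config.toml`) and key names are assumptions, so confirm them against the official docs before relying on them:

```toml
# ~/.codex/config.toml -- key names and values are illustrative assumptions
approval_policy = "on-request"    # ask before escalating shell commands
sandbox_mode   = "workspace-write" # allow edits inside the repo, read-only elsewhere

[sandbox_workspace_write]
network_access = false             # deny outbound network from sandboxed commands
```

A setup like this lets a team commit a shared policy to the repo docs so every developer runs the agent with the same escalation and network rules.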
Why It Matters
Terminal-based AI agents are becoming the default interface for developers who spend most of their day in a shell. Codex CLI fits that workflow without requiring a context switch to a browser or IDE.
The enterprise proxy work (custom CA certs, structured network policies) signals that OpenAI is targeting corporate engineering teams, not just individual developers. Teams behind corporate firewalls can now use Codex without fighting TLS interception errors.
The new hooks system, including a user prompt hook, lets teams intercept or augment prompts and wire Codex into auditing or policy workflows. That opens the door to automated context injection, audit logging, or policy enforcement at the agent level.
Example Use Case
A backend team maintains a large Rust monorepo with strict linting rules. A developer opens their terminal, runs codex, and asks it to add a new API endpoint to their actix-web service. Codex reads the existing route definitions, generates the handler, adds the route registration, and runs the project's cargo clippy and cargo test commands to verify correctness, all within the sandboxed shell environment.
Because the team's office network uses TLS inspection, they configure their corporate root CA bundle via the appropriate environment variable. Codex respects that across all outbound connections, including the WebSocket channel to OpenAI's API. No manual proxy workarounds needed.
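The proxy setup above amounts to pointing the standard certificate environment variable at the corporate bundle before launching the CLI. A sketch, with an illustrative bundle path:

```shell
# Point TLS verification at the corporate root CA bundle (path is illustrative)
export SSL_CERT_FILE=/etc/ssl/certs/corp-root-ca.pem

# Codex picks this up for HTTPS and WebSocket connections alike
codex
```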
Competitive Context
OpenAI built Codex CLI as the terminal counterpart to Codex Web (the cloud-based agent at chatgpt.com/codex) and Codex IDE extensions for VS Code, Cursor, and Windsurf. Where the cloud agent handles long-running tasks asynchronously, the CLI targets fast, interactive coding sessions on local repos.
The open-source, Apache-2.0-licensed approach differentiates it from proprietary terminal AI tools. Written almost entirely in Rust (95.6% of the codebase), it ships as a fast, self-contained binary. The Python SDK also positions it as an embeddable component for teams building custom AI workflows, not just a standalone chat interface.
Bottom Line
Codex CLI is OpenAI's bet that developers want AI coding agents in their terminal, not just their editor. If your team writes code in a shell-heavy workflow, especially behind a corporate proxy, it's worth trying. Install it with npm i -g @openai/codex or brew install --cask codex and see if it fits your loop.
Written by

Molisha Shah
GTM and Customer Champion