## Documentation Index

Fetch the complete documentation index at: https://docs.nogic.dev/llms.txt
Use this file to discover all available pages before exploring further.
## Commands

All commands live under the Nogic: prefix in the Command Palette (Cmd+Shift+P / Ctrl+Shift+P).

| Command | What it does |
|---|---|
| Nogic: Open Visualizer | Open (or focus) the visualizer panel |
| Nogic: Open Diff Board | Open the live current-diff board |
| Nogic: Show MCP Setup | Reopen the MCP setup overlay |
| Nogic: Sign In | Sign in to Nogic |
| Nogic: Sign Out | Sign out of Nogic |
| Add to Nogic Board | (Explorer context menu) Add a file or folder to the active board |
## Keybindings

| Action | Mac | Win/Linux | Scope |
|---|---|---|---|
| Focus search | Cmd+F | Ctrl+F | Visualizer panel |
| Open AI cursor | Cmd+K | Ctrl+K | Visualizer panel |
| Open file browser | Cmd+I | Ctrl+I | Visualizer panel |

The visualizer-only scope keeps these bindings from clashing with VS Code’s editor-level shortcuts elsewhere.
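If you prefer different keys, VS Code lets you remap any contributed command in keybindings.json. A sketch follows; the command id and when-clause are hypothetical placeholders, not confirmed identifiers (copy the real ones from Preferences: Open Keyboard Shortcuts, filtering for Nogic):

```jsonc
// keybindings.json — remap "Focus search" inside the visualizer panel.
// "nogic.focusSearch" and "nogic.visualizerFocused" are assumed names;
// look up the actual id and context key in the Keyboard Shortcuts UI.
{
  "key": "ctrl+shift+f",
  "command": "nogic.focusSearch",
  "when": "nogic.visualizerFocused"
}
```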
## Settings

Configure under Settings → Extensions → Nogic, or in settings.json:

| Key | Type | Default | Description |
|---|---|---|---|
| nogic.openOnStartup | boolean | true | Auto-open the visualizer when a workspace opens |
| nogic.telemetry.enabled | boolean | true | Anonymous usage telemetry. Respects VS Code’s global telemetry setting |
| nogic.walkthrough.preferredCli | "auto" \| "claude" \| "codex" | "auto" | Which CLI handles Cmd+K. auto probes both at startup |
| nogic.walkthrough.enabledClis | string[] | ["claude", "codex"] | Which CLIs are allowed to handle Cmd+K. Toggle via the MCP Setup overlay |
| nogic.diffAnalyze.modelTier | "cheap" \| "medium" | "medium" | Model tier for Diff Analyze canvas authoring. See Diff Analyze tuning below |
| nogic.diffAnalyze.concurrency | number (1-16) | 8 | How many canvas authoring runs in flight simultaneously. See Diff Analyze tuning below |
| nogic.walkthrough.agentBaseUrl | string | https://api.nogic.dev | Deprecated. Legacy backend URL — Cmd+K and canvas authoring now run locally via your CLI |
| nogic.walkthrough.apiKey | string | "" | Deprecated. Legacy backend key — see above |
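For instance, overriding several of the defaults above in settings.json looks like this (values chosen purely for illustration, not recommendations):

```jsonc
// settings.json — illustrative overrides
{
  "nogic.openOnStartup": false,               // don't auto-open the visualizer
  "nogic.walkthrough.preferredCli": "claude", // skip the startup auto-probe
  "nogic.walkthrough.enabledClis": ["claude"],
  "nogic.telemetry.enabled": false
}
```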
## Diff Analyze tuning

Two knobs control the cost/quality and cost/speed trade-offs for Diff Analyze canvas authoring. Most users never need to change these — the defaults work well for Claude Pro/Max and Codex Plus/Pro subscribers.

### nogic.diffAnalyze.modelTier

Controls which model tier authors the per-group canvases.

| Value | Models used | Trade-off |
|---|---|---|
| medium (default) | Sonnet (Claude Code) / codex-grade (Codex) | Reliable connection authoring, accurate hunk anchoring, supports Before/After diagrams. Higher cost per group. |
| cheap | Haiku (Claude Code) / mini (Codex) | ~3-5× cheaper, ~30% faster. Sparser hunks; multi-file diagrams may miss connections. |
Switch to cheap when:

- you’re on a pay-as-you-go API tier and want to control cost
- your typical diffs are small or single-file
- you’re iterating quickly and quality variance is acceptable

Stay on medium when:

- you’re on Claude Pro/Max or Codex Plus/Pro (flat-rate plans — no per-call cost)
- your diffs span multiple files
- you’re using Diff Analyze for actual code review, not just exploration

Cmd+K is unaffected by this setting; it always uses cheap (foreground, single-turn — Haiku is plenty for that use case).
### nogic.diffAnalyze.concurrency

How many canvas authoring runs the parallel dispatcher allows in flight at once.

| Value | When to pick |
|---|---|
| 1-2 | Anthropic API tier 1 (default for new accounts) or aggressive rate limits |
| 4 | Anthropic API tier 1 with light usage |
| 8 (default) | Claude Pro/Max, Codex Plus/Pro, Anthropic API tier 2+ |
| 12-16 | Anthropic API tier 3+ — benchmark first to confirm your tier handles the burst |
Higher values finish a 30-group diff faster (~15s vs ~60s typical), but multiply peak in-flight calls. If your AI provider rate-limits, you’ll see silent failures and need to use Refresh All Stale in the Diff Analyze toolbar to recover. The setting takes effect immediately on the next canvas authoring run — no extension restart needed.
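Put together, a cost-conscious profile for a pay-as-you-go tier-1 account might combine both knobs like this (illustrative; both keys are documented in the Settings table above):

```jsonc
// settings.json — dial both Diff Analyze knobs down for tight rate limits
{
  "nogic.diffAnalyze.modelTier": "cheap", // Haiku / mini authoring
  "nogic.diffAnalyze.concurrency": 2      // stay under tier-1 burst limits
}
```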
## Local data layout

```
~/.nogic/
  workspaces/
    <workspace-hash>/
      nogic.db             # SQLite index for this workspace
      saved_walkthroughs/  # AI canvas walkthroughs saved per workspace
~/.codex/
  config.toml              # Auto-managed [mcp_servers.nogic] block (Codex only)
  config.toml.nogic.bak    # Backup of original codex config (created on first write)
```
Nothing here is sent off your machine. Delete ~/.nogic to wipe all extension data.
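For reference, an [mcp_servers.nogic] block in ~/.codex/config.toml generally has the shape below. This is an assumed sketch: the actual command, args, and any extra fields are written and maintained by the extension, so there is no need to edit it by hand.

```toml
# Illustrative shape only — Nogic manages this block automatically.
[mcp_servers.nogic]
command = "node"
args = ["/path/to/nogic-mcp-server.js"]  # hypothetical path
```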
## Privacy & telemetry

Nogic collects anonymous usage metrics (feature use counts, performance timings) to help prioritize what to work on. It never collects:
- Code, file contents, or paths
- Symbol names
- Prompts you type into the AI cursor
- AI responses
Disable telemetry with nogic.telemetry.enabled = false, or globally by setting VS Code’s telemetry.telemetryLevel = "off". See nogic.dev/telemetry for the full list of events.
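Concretely, either line below (in settings.json) turns telemetry off; the second is VS Code’s global switch and silences every extension that honors it, not just Nogic:

```jsonc
{
  "nogic.telemetry.enabled": false,  // Nogic only
  "telemetry.telemetryLevel": "off"  // all of VS Code
}
```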
## Language support

Nogic has two surfaces and they support different languages.

### Code graph (parsed)

These features depend on Nogic’s symbol index:

- Visualizer (the connected graph view, boards, inspect mode)
- Diff Analyze (smart ordering, blast radius, change groups)
- The MCP render_code_tour tool

Supported languages:

- JavaScript / TypeScript (including JSX/TSX): full call graph and import detection, including tsconfig / jsconfig path aliases, TS ESM (.js → .ts), and JSON/YAML config imports
- Python: class / function / method extraction, imports, basic call resolution

Files in other languages appear in the file tree but don’t contribute symbols or edges.
### AI canvas (any language)
Cmd+K and every MCP canvas tool except render_code_tour work on any language. The AI client reads source with its own native Read / Glob / Grep, so the render, patch, render_dataflow, render_sequence, render_state_machine, and render_er_diagram tools don’t depend on the symbol index.
More parsed languages are planned. AI canvas language coverage is automatic: anything the AI client can read, it can diagram.
## Engine compatibility

- VS Code 1.88.0+
- VSCodium and other Code OSS forks at the same engine version
The extension uses the VS Code MCP provider API (vscode.lm.registerMcpServerDefinitionProvider) when available (1.102+). On older VS Code versions, the local MCP server still runs and external CLIs can still connect; only the auto-registration into VS Code’s built-in AI ecosystem is skipped.
## Support