Documentation Index

Fetch the complete documentation index at: https://docs.nogic.dev/llms.txt

Use this file to discover all available pages before exploring further.

Commands

All commands live under the Nogic: prefix in the Command Palette (Cmd+Shift+P / Ctrl+Shift+P).
| Command | What it does |
| --- | --- |
| `Nogic: Open Visualizer` | Open (or focus) the visualizer panel |
| `Nogic: Open Diff Board` | Open the live current-diff board |
| `Nogic: Show MCP Setup` | Reopen the MCP setup overlay |
| `Nogic: Sign In` | Sign in to Nogic |
| `Nogic: Sign Out` | Sign out of Nogic |
| `Add to Nogic Board` | (Explorer context menu) Add a file or folder to the active board |

Keybindings

| Action | Mac | Win/Linux | Scope |
| --- | --- | --- | --- |
| Focus search | Cmd+F | Ctrl+F | Visualizer panel |
| Open AI cursor | Cmd+K | Ctrl+K | Visualizer panel |
| Open file browser | Cmd+I | Ctrl+I | Visualizer panel |
The visualizer-only scope keeps these bindings from clashing with VS Code’s editor-level shortcuts elsewhere.

Settings

Configure under Settings → Extensions → Nogic, or in settings.json:
| Key | Type | Default | Description |
| --- | --- | --- | --- |
| `nogic.openOnStartup` | boolean | `true` | Auto-open the visualizer when a workspace opens |
| `nogic.telemetry.enabled` | boolean | `true` | Anonymous usage telemetry. Respects VS Code's global telemetry setting |
| `nogic.walkthrough.preferredCli` | `"auto" \| "claude" \| "codex"` | `"auto"` | Which CLI handles Cmd+K. `auto` probes both at startup |
| `nogic.walkthrough.enabledClis` | `string[]` | `["claude", "codex"]` | Which CLIs are allowed to handle Cmd+K. Toggle via the MCP Setup overlay |
| `nogic.diffAnalyze.modelTier` | `"cheap" \| "medium"` | `"medium"` | Model tier for Diff Analyze canvas authoring. See Diff Analyze tuning below |
| `nogic.diffAnalyze.concurrency` | number (1-16) | `8` | How many canvas authoring runs in flight simultaneously. See Diff Analyze tuning below |
| `nogic.walkthrough.agentBaseUrl` | string | `https://api.nogic.dev` | Deprecated. Legacy backend URL; Cmd+K and canvas authoring now run locally via your CLI |
| `nogic.walkthrough.apiKey` | string | `""` | Deprecated. Legacy backend key; see above |
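As a sketch, a settings.json that overrides a few of these keys might look like the following (the values are illustrative, not recommendations; VS Code's settings.json accepts comments):

```json
{
  // Don't auto-open the visualizer when a workspace opens
  "nogic.openOnStartup": false,
  // Route Cmd+K through the Claude CLI instead of probing both at startup
  "nogic.walkthrough.preferredCli": "claude",
  "nogic.walkthrough.enabledClis": ["claude"]
}
```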

Diff Analyze tuning

Two knobs control the cost/quality and cost/speed trade-offs for Diff Analyze canvas authoring. Most users never need to change these — the defaults work well for Claude Pro/Max and Codex Plus/Pro subscribers.

nogic.diffAnalyze.modelTier

Controls which model tier authors the per-group canvases.
| Value | Models used | Trade-off |
| --- | --- | --- |
| `medium` (default) | Sonnet (Claude Code) / codex-grade (Codex) | Reliable connection authoring, accurate hunk anchoring, supports Before/After diagrams. Higher cost per group. |
| `cheap` | Haiku (Claude Code) / mini (Codex) | ~3-5× cheaper, ~30% faster. Sparser hunks; multi-file diagrams may miss connections. |
Switch to `cheap` when:
  • you're on a pay-as-you-go API tier and want to control cost
  • your typical diffs are small or single-file
  • you're iterating quickly and quality variance is acceptable

Stay on `medium` when:
  • you're on Claude Pro/Max or Codex Plus/Pro (flat-rate plans with no per-call cost)
  • your diffs span multiple files
  • you're using Diff Analyze for actual code review, not just exploration
Cmd+K is unaffected by this setting: it always uses the `cheap` tier (foreground, single-turn; Haiku is plenty for that use case).

nogic.diffAnalyze.concurrency

How many canvas authoring runs the parallel dispatcher allows in flight at once.
| Value | When to pick |
| --- | --- |
| 1-2 | Anthropic API tier 1 (the default for new accounts) or aggressive rate limits |
| 4 | Anthropic API tier 1 with light usage |
| 8 (default) | Claude Pro/Max, Codex Plus/Pro, Anthropic API tier 2+ |
| 12-16 | Anthropic API tier 3+; benchmark first to confirm your tier handles the burst |
Higher values finish a 30-group diff faster (~15s vs ~60s typical), but multiply peak in-flight calls. If your AI provider rate-limits you, affected runs fail silently; use Refresh All Stale in the Diff Analyze toolbar to recover. The setting takes effect on the next canvas authoring run; no extension restart is needed.
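Putting the two knobs together: on a pay-as-you-go Anthropic API tier 1 account, a conservative settings.json sketch might look like this (values illustrative):

```json
{
  // Cheaper model tier: ~3-5× cheaper per group, fine for small diffs
  "nogic.diffAnalyze.modelTier": "cheap",
  // Low concurrency to stay under tier 1 rate limits
  "nogic.diffAnalyze.concurrency": 2
}
```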

Local data layout

```
~/.nogic/
  workspaces/
    <workspace-hash>/
      nogic.db                  # SQLite index for this workspace
      saved_walkthroughs/       # AI canvas walkthroughs saved per workspace
~/.codex/
  config.toml                   # Auto-managed [mcp_servers.nogic] block (Codex only)
  config.toml.nogic.bak         # Backup of original codex config (created on first write)
```
Nothing here is sent off your machine. Delete ~/.nogic to wipe all extension data.

Privacy & telemetry

Nogic collects anonymous usage metrics (feature use counts, performance timings) to help prioritize what to work on. It never collects:
  • Code, file contents, or paths
  • Symbol names
  • Prompts you type into the AI cursor
  • AI responses
Disable telemetry with nogic.telemetry.enabled = false, or globally by setting VS Code’s telemetry.telemetryLevel = "off". See nogic.dev/telemetry for the full list of events.
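For example, either of the following settings.json entries opts out (the first disables only Nogic's telemetry, the second disables it for VS Code and every extension that honors the global setting):

```json
{
  // Disable Nogic's own telemetry only
  "nogic.telemetry.enabled": false,
  // Or turn off telemetry globally for VS Code and conforming extensions
  "telemetry.telemetryLevel": "off"
}
```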

Language support

Nogic has two surfaces and they support different languages.

Code graph (parsed)

These features depend on Nogic’s symbol index:
  • Visualizer (the connected graph view, boards, inspect mode)
  • Diff Analyze (smart ordering, blast radius, change groups)
  • The MCP render_code_tour tool
Supported languages:
  • JavaScript / TypeScript (including JSX/TSX): full call graph and import detection, including tsconfig / jsconfig path aliases, TS ESM specifiers (`.js` imports resolving to `.ts` files), and JSON/YAML config imports
  • Python: class / function / method extraction, imports, basic call resolution
Files in other languages appear in the file tree but don’t contribute symbols or edges.

AI canvas (any language)

Cmd+K and every MCP canvas tool except render_code_tour work on any language. The AI client reads source with its own native Read / Glob / Grep, so the render, patch, render_dataflow, render_sequence, render_state_machine, and render_er_diagram tools don't depend on the symbol index. AI canvas language coverage is therefore automatic: anything the AI client can read, it can diagram. More parsed languages are planned for the code graph.

Engine compatibility

  • VS Code 1.88.0+
  • VSCodium and other Code OSS forks at the same engine version
The extension uses the VS Code MCP provider API (vscode.lm.registerMcpServerDefinitionProvider) when available (1.102+). On older VS Code versions, the local MCP server still runs and external CLIs can still connect; only the auto-registration into VS Code’s built-in AI ecosystem is skipped.

Support