CodeAlive-AI/codealive-mcp
CodeAlive MCP: Deepest Context Engine for your projects (especially for large codebases)

CodeAlive Logo

Connect your AI assistant to CodeAlive's powerful code understanding platform in seconds!

This MCP (Model Context Protocol) server enables AI clients like Claude Code, Cursor, Claude Desktop, Continue, VS Code (GitHub Copilot), Cline, Codex, OpenCode, Qwen Code, Gemini CLI, Roo Code, Goose, Kilo Code, Windsurf, Kiro, Qoder, n8n, and Amazon Q Developer to access CodeAlive's advanced semantic code search and codebase interaction features.

What is CodeAlive?

The most accurate and comprehensive Context Engine as a service, optimized for large codebases, powered by advanced GraphRAG and accessible via MCP. It enriches the context for AI agents like Cursor, Claude Code, Codex, etc., making them 35% more efficient and up to 84% faster.

It's like Context7, but for your (large) codebases.

It allows AI coding agents to:

  • Find relevant code faster with semantic search
  • Understand the bigger picture beyond isolated files
  • Provide better answers with full project context
  • Reduce costs and time by removing guesswork

🛠️ Available Tools

Once connected, you'll have access to these powerful tools:

  1. get_data_sources - List your indexed repositories and workspaces
  2. semantic_search - Canonical semantic search across indexed artifacts
  3. grep_search - Exact text or regex search with line-level matches
  4. fetch_artifacts - Load the full source for relevant search hits
  5. get_artifact_relationships - Expand call graph, inheritance, and reference relationships for one artifact
  6. chat - Slower synthesized codebase Q&A, typically only after search
  7. codebase_search - Deprecated legacy semantic search alias kept for backward compatibility
  8. codebase_consultant - Deprecated alias for chat

🎯 Usage Examples

After setup, try these commands with your AI assistant:

  • "Show me all available repositories" → Uses get_data_sources
  • "Find authentication code in the user service" → Uses semantic_search
  • "Find the exact regex that matches JWT tokens" → Uses grep_search
  • "Explain how the payment flow works in this codebase" → Usually starts with semantic_search/grep_search, then optionally uses chat

semantic_search and grep_search should be the default tools for most agents. chat is a slower synthesis fallback that can take up to 30 seconds and is usually unnecessary when an agent can run a multi-step workflow combining search, fetch, relationships, and local file reads. If your agent supports subagents, the highest-confidence path is to delegate to a focused subagent that orchestrates semantic_search and grep_search first.
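The search-first workflow above can be sketched as the JSON-RPC tools/call requests an MCP client would issue. This is a sketch only: real clients first perform the MCP initialize handshake, and the argument names shown here are illustrative assumptions, not the server's documented schema.

```python
import json

def tool_call(call_id, name, arguments):
    """Build an MCP `tools/call` JSON-RPC request envelope."""
    return {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Search first, fetch the matching sources, expand relationships,
# and only fall back to the slower `chat` tool if synthesis is needed.
# Argument names are assumptions for illustration.
workflow = [
    tool_call(1, "semantic_search", {"query": "payment flow"}),
    tool_call(2, "fetch_artifacts", {"ids": ["<artifact-id-from-search>"]}),
    tool_call(3, "get_artifact_relationships", {"id": "<artifact-id-from-search>"}),
    tool_call(4, "chat", {"message": "Explain how the payment flow works"}),
]

for request in workflow:
    print(json.dumps(request))
```

The ordering mirrors the guidance above: cheap, precise lookups first, synthesis last.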

📚 Agent Skill

For an even better experience, install the CodeAlive Agent Skill alongside the MCP server. The MCP server gives your agent access to CodeAlive's tools; the skill teaches it the best workflows and query patterns to use them effectively.

For most agents (Cursor, Copilot, Gemini CLI, Codex, and 30+ others), install the skill:

npx skills add CodeAlive-AI/codealive-skills@codealive-context-engine

For Claude Code, install the plugin (recommended), which includes the skill plus Claude-specific enhancements:

/plugin marketplace add CodeAlive-AI/codealive-skills
/plugin install codealive@codealive-marketplace


🚀 Quick Start (Remote)

The fastest way to get started - no installation required! Our remote MCP server at https://mcp.codealive.ai/api provides instant access to CodeAlive's capabilities.

Step 1: Get Your API Key

  1. Sign up at https://app.codealive.ai/
  2. Navigate to MCP & API
  3. Click "+ Create API Key"
  4. Copy your API key immediately - you won't see it again!

Step 2: Choose Your AI Client

Select your preferred AI client below for instant setup:

🚀 Quick Start (Agentic Installation)

You may ask your AI agent to install the CodeAlive MCP server for you.

  1. Copy-Paste the following prompt into your AI agent (remember to insert your API key):
Here is the CodeAlive API key: PASTE_YOUR_API_KEY_HERE

Add the CodeAlive MCP server by following the installation guide from the README at https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/README.md

Find the section "AI Client Integrations" and locate your client (Claude Code, Cursor, Gemini CLI, etc.). Each client has specific setup instructions:
- For Gemini CLI: Use the one-command setup with `gemini mcp add`
- For Claude Code: Use `claude mcp add` with the --transport http flag
- For other clients: Follow the configuration snippets provided

Prefer the Remote HTTP option when available. If no API key is provided above, help me issue a CodeAlive API key first.

Then allow execution.

  2. Restart your AI agent.

🤖 AI Client Integrations

Claude Code

Option 1: Remote HTTP (Recommended)

claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Option 2: Docker (STDIO)

claude mcp add codealive-docker /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:main

Replace YOUR_API_KEY_HERE with your actual API key.

Cursor

Option 1: Remote HTTP (Recommended)

  1. Open Cursor → Settings (Cmd+, or Ctrl+,)
  2. Navigate to "MCP" in the left panel
  3. Click "Add new MCP server"
  4. Paste this configuration:
{
  "mcpServers": {
    "codealive": {
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  5. Save and restart Cursor

Option 2: Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Codex

OpenAI Codex CLI supports MCP via ~/.codex/config.toml.

~/.codex/config.toml (Docker stdio – recommended)

[mcp_servers.codealive]
command = "docker"
args = ["run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"]

Experimental: Streamable HTTP

Note: Streamable HTTP support requires rmcp_client = true under a [features] section in your Codex configuration.

[mcp_servers.codealive]
url = "https://mcp.codealive.ai/api"
headers = { Authorization = "Bearer YOUR_API_KEY_HERE" }
Gemini CLI

One command setup (complete):

gemini mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Replace YOUR_API_KEY_HERE with your actual API key. That's it - no config files needed! 🎉

Continue

Option 1: Remote HTTP (Recommended)

  1. Create/edit .continue/config.yaml in your project or ~/.continue/config.yaml
  2. Add this configuration:
mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"
  3. Restart VS Code

Option 2: Docker (STDIO)

mcpServers:
  - name: CodeAlive
    type: stdio
    command: docker
    args:
      - run
      - --rm
      - -i
      - -e
      - CODEALIVE_API_KEY=YOUR_API_KEY_HERE
      - ghcr.io/codealive-ai/codealive-mcp:main
Visual Studio Code with GitHub Copilot

Option 1: Remote HTTP (Recommended)

Note: VS Code supports both Streamable HTTP and SSE transports, with automatic fallback to SSE if Streamable HTTP fails.

  1. Open Command Palette (Ctrl+Shift+P or Cmd+Shift+P)
  2. Run "MCP: Add Server"
  3. Choose "HTTP" server type
  4. Enter this configuration:
{
  "servers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  5. Restart VS Code

Option 2: Docker (STDIO)

Create .vscode/mcp.json in your workspace:

{
  "servers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Claude Desktop

Option 1: Extension bundle .mcpb (Recommended)

The .mcpb bundle gives you one-click install, secure token storage, and self-hosted baseUrl configuration, with no Docker or CLI required.

  1. Download codealive-mcp.mcpb from the latest GitHub Release
  2. In Claude Desktop, open Settings → Extensions → Install Extension...
  3. Select the downloaded .mcpb file and configure:
    • CodeAlive API Key: your bearer token
    • CodeAlive Base URL: defaults to https://app.codealive.ai; for self-hosted, use your deployment origin (e.g. https://codealive.yourcompany.com)
    • Ignore TLS Errors: only for dev/self-signed environments

Option 2: Docker (STDIO)

  1. Edit your config file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Add this configuration:

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
  3. Restart Claude Desktop
Cline

Option 1: Remote HTTP (Recommended)

  1. Open Cline extension in VS Code
  2. Click the MCP Servers icon to configure
  3. Add this configuration to your MCP settings:
{
  "mcpServers": {
    "codealive": {
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  4. Save and restart VS Code

Option 2: Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
OpenCode

Add CodeAlive as a remote MCP server in your opencode.json.

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "codealive": {
      "type": "remote",
      "url": "https://mcp.codealive.ai/api",
      "enabled": true,
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Qwen Code

Qwen Code supports MCP via mcpServers in its settings.json and multiple transports (stdio/SSE/streamable-http). Use streamable-http when available; otherwise use Docker (stdio).

~/.qwen/settings.json (Streamable HTTP)

{
  "mcpServers": {
    "codealive": {
      "type": "streamable-http",
      "url": "https://mcp.codealive.ai/api",
      "requestOptions": {
        "headers": {
          "Authorization": "Bearer YOUR_API_KEY_HERE"
        }
      }
    }
  }
}

Fallback: Docker (stdio)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": ["run", "--rm", "-i",
               "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
               "ghcr.io/codealive-ai/codealive-mcp:main"]
    }
  }
}
Roo Code

Roo Code reads a JSON settings file similar to Cline.

Global config: mcp_settings.json (Roo) or cline_mcp_settings.json (Cline-style)

Option A: Remote HTTP

{
  "mcpServers": {
    "codealive": {
      "type": "streamable-http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

Option B: Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}

Tip: If your Roo build doesn't honor HTTP headers, use the Docker/STDIO option.

Goose

UI path: Settings → MCP Servers → Add → choose Streamable HTTP

Streamable HTTP configuration:

  • Name: codealive
  • Endpoint URL: https://mcp.codealive.ai/api
  • Headers: Authorization: Bearer YOUR_API_KEY_HERE

Docker (STDIO) alternative:

Add a STDIO extension with:

  • Command: docker
  • Args: run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:main
Kilo Code

UI path: Manage → Integrations → Model Context Protocol (MCP) → Add Server

HTTP

{
  "mcpServers": {
    "codealive": {
      "type": "streamable-http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Windsurf (Codeium)

File: ~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "codealive": {
      "type": "streamable-http",
      "serverUrl": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Kiro

Note: Kiro does not yet support remote MCP servers natively. Use the mcp-remote workaround to connect to remote HTTP servers.

Prerequisites:

npm install -g mcp-remote

UI path: Settings → MCP → Add Server

Global file: ~/.kiro/settings/mcp.json
Workspace file: .kiro/settings/mcp.json

Remote HTTP (via mcp-remote workaround)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.codealive.ai/api",
        "--header",
        "Authorization: Bearer ${CODEALIVE_API_KEY}"
      ],
      "env": {
        "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}

Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Qoder

UI path: User icon → Qoder Settings → MCP → My Servers → + Add (Agent mode)

SSE (remote HTTP)

{
  "mcpServers": {
    "codealive": {
      "type": "sse",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Amazon Q Developer (CLI & IDE)

Q Developer CLI

Config file: ~/.aws/amazonq/mcp.json or workspace .amazonq/mcp.json

HTTP server

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}

Q Developer IDE (VS Code / JetBrains)

Global: ~/.aws/amazonq/agents/default.json
Local (workspace): .aws/amazonq/agents/default.json

Minimal entry (HTTP):

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      },
      "timeout": 310000
    }
  }
}

Use the IDE UI: Q panel → Chat → tools icon → Add MCP Server → choose http or stdio.

JetBrains AI Assistant

Note: JetBrains AI Assistant requires the mcp-remote workaround for connecting to remote HTTP MCP servers.

Prerequisites:

npm install -g mcp-remote

UI path: Settings/Preferences → AI Assistant → Model Context Protocol → Configure

Add this configuration:

{
  "mcpServers": {
    "codealive": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.codealive.ai/api",
        "--header",
        "Authorization: Bearer ${CODEALIVE_API_KEY}"
      ],
      "env": {
        "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}

For self-hosted deployments, replace the URL:

{
  "mcpServers": {
    "codealive": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://your-server:8000/api",
        "--header",
        "Authorization: Bearer ${CODEALIVE_API_KEY}"
      ],
      "env": {
        "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}

See JetBrains MCP Documentation for more details.

n8n

Using AI Agent Node with MCP Tools

  1. Add an AI Agent node to your workflow

  2. Configure the agent with MCP tools:

    Server URL: https://mcp.codealive.ai/api
    Authorization Header: Bearer YOUR_API_KEY_HERE
    
  3. The server automatically handles n8n's extra parameters (sessionId, action, chatInput, toolCallId)

  4. Use the available tools:

    • get_data_sources - List available repositories
    • semantic_search - Search code semantically
    • grep_search - Search by exact text or regex
    • get_artifact_relationships - Expand relationships for one artifact
    • chat - Slower synthesized codebase Q&A, usually after search
    • codebase_search - Legacy semantic search alias
    • codebase_consultant - Deprecated alias for chat

Example Workflow:

Trigger → AI Agent (with CodeAlive MCP tools) → Process Response

Note: n8n middleware is built-in, so no special configuration is needed. The server will automatically strip n8n's extra parameters before processing tool calls.
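The stripping behavior can be sketched as follows. This is a hypothetical illustration of the idea, not the server's actual middleware; the parameter names come from the note above.

```python
# Hypothetical sketch: drop n8n-injected parameters before a tool
# handler sees the arguments. Not the server's actual implementation.
N8N_EXTRA_PARAMS = {"sessionId", "action", "chatInput", "toolCallId"}

def strip_n8n_params(arguments: dict) -> dict:
    """Return a copy of the tool arguments without n8n's extra keys."""
    return {k: v for k, v in arguments.items() if k not in N8N_EXTRA_PARAMS}

cleaned = strip_n8n_params({
    "query": "authentication code",
    "sessionId": "abc-123",
    "action": "sendMessage",
})
print(cleaned)  # only "query" survives
```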


🔧 Advanced: Local Development

For developers who want to customize or contribute to the MCP server.

Prerequisites

  • Python 3.11+
  • uv (recommended) or pip

Installation

# Clone the repository
git clone https://github.com/CodeAlive-AI/codealive-mcp.git
cd codealive-mcp

# Setup with uv (recommended)
uv venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
uv pip install -e .

# Or setup with pip
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate  
pip install -e .

Local Server Configuration

Once installed locally, configure your AI client to use the local server:

Claude Code (Local)

claude mcp add codealive-local /path/to/codealive-mcp/.venv/bin/python /path/to/codealive-mcp/src/codealive_mcp_server.py --env CODEALIVE_API_KEY=YOUR_API_KEY_HERE

Other Clients (Local)

Replace the Docker command and args with:

{
  "command": "/path/to/codealive-mcp/.venv/bin/python",
  "args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"],
  "env": {
    "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
  }
}

Running HTTP Server Locally

# Start local HTTP server
export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000

# Test health endpoint
curl http://localhost:8000/health

Testing Your Local Installation

After making changes, quickly verify everything works:

# Quick smoke test (recommended)
make smoke-test

# Or run directly
python smoke_test.py

# With your API key for full testing
CODEALIVE_API_KEY=your_key python smoke_test.py

# Run unit tests
make unit-test

# Run all tests
make test

The smoke test runs in about 5 seconds and verifies:

  • Server starts and connects correctly
  • All tools are registered
  • Each tool responds appropriately
  • Parameter validation works

Smithery Installation

Auto-install for Claude Desktop via Smithery:

npx -y @smithery/cli install @CodeAlive-AI/codealive-mcp --client claude

🌐 Community Plugins

Gemini CLI: CodeAlive Extension

Repo: https://github.com/akolotov/gemini-cli-codealive-extension

Gemini CLI extension that wires CodeAlive into your terminal with prebuilt slash commands and MCP config. It includes:

  • GEMINI.md guidance so Gemini knows how to use CodeAlive tools effectively
  • Slash commands: /codealive:chat, /codealive:find, /codealive:search
  • Easy setup via Gemini CLI's extension system

Install

gemini extensions install https://github.com/akolotov/gemini-cli-codealive-extension

Configure

# Option 1: .env next to where you run `gemini`
CODEALIVE_API_KEY="your_codealive_api_key_here"

# Option 2: environment variable
export CODEALIVE_API_KEY="your_codealive_api_key_here"
gemini

🚒 HTTP Deployment (Self-Hosted & Cloud)

Deploy the MCP server as an HTTP service for team-wide access or integration with self-hosted CodeAlive instances.

Deployment Options

The CodeAlive MCP server can be deployed as an HTTP service using Docker. This allows multiple AI clients to connect to a single shared instance, and enables integration with self-hosted CodeAlive deployments.

Docker Compose (Recommended)

Create a docker-compose.yml file based on our example:

# Download the example
curl -O https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/docker-compose.example.yml
mv docker-compose.example.yml docker-compose.yml

# Edit configuration (see below)
nano docker-compose.yml

# Start the service
docker compose up -d

# Check health
curl http://localhost:8000/health

Configuration Options:

  1. For CodeAlive Cloud (default):

    • Remove the CODEALIVE_BASE_URL environment variable (defaults to https://app.codealive.ai)
    • Clients must provide their API key via Authorization: Bearer YOUR_KEY header
  2. For Self-Hosted CodeAlive:

    • Set CODEALIVE_BASE_URL to your CodeAlive instance URL (e.g., https://codealive.yourcompany.com)
    • Clients must provide their API key via Authorization: Bearer YOUR_KEY header

See docker-compose.example.yml for the complete configuration template.
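For orientation, a minimal compose file might look like the sketch below. The service name, exposed port, and command-line flags are assumptions based on the local HTTP server instructions elsewhere in this README; treat docker-compose.example.yml as the source of truth.

```yaml
services:
  codealive-mcp:
    image: ghcr.io/codealive-ai/codealive-mcp:main
    # Assumed flags, mirroring the local HTTP server example in this README.
    command: ["--transport", "http", "--host", "0.0.0.0", "--port", "8000"]
    ports:
      - "8000:8000"
    environment:
      # Omit for CodeAlive Cloud; set only for self-hosted deployments.
      CODEALIVE_BASE_URL: "https://codealive.yourcompany.com"
    restart: unless-stopped
```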

Connecting AI Clients to Your Deployed Instance

Once deployed, configure your AI clients to use your HTTP endpoint:

Claude Code:

claude mcp add --transport http codealive http://your-server:8000/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

VS Code:

code --add-mcp "{\"name\":\"codealive\",\"type\":\"http\",\"url\":\"http://your-server:8000/api\",\"headers\":{\"Authorization\":\"Bearer YOUR_API_KEY_HERE\"}}"

Cursor / Other Clients:

{
  "mcpServers": {
    "codealive": {
      "url": "http://your-server:8000/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

Replace your-server:8000 with your actual deployment URL and port.


🪟 Windows & WSL

Overview

Claude Code on Windows runs inside WSL (Windows Subsystem for Linux). Claude Desktop runs natively on Windows. This creates specific challenges when configuring MCP servers, especially around binary paths, environment variables, and networking.

The simplest solution for both Claude Code and Claude Desktop on Windows is Remote HTTP: it avoids all subprocess spawning and path issues entirely.

Claude Code in WSL

When Claude Code runs inside WSL, it operates as a normal Linux environment. Use the same commands as on Linux/macOS:

Remote HTTP (Recommended; avoids all WSL issues):

claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Docker STDIO (if Docker Desktop WSL integration is enabled):

claude mcp add codealive-docker /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:main

Note: If docker is not found, ensure Docker Desktop has WSL integration enabled for your distro, or use the full path /usr/bin/docker.

Claude Desktop on Windows (Docker STDIO)

Claude Desktop on Windows cannot connect to MCP servers running inside WSL directly. Use one of these approaches:

Option 1: Docker Desktop (Recommended)

If Docker Desktop is installed on Windows, docker.exe is in the Windows PATH:

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}

Option 2: wsl.exe Proxy

If Docker is only available inside WSL, use wsl.exe as a bridge:

{
  "mcpServers": {
    "codealive": {
      "command": "wsl.exe",
      "args": [
        "--", "docker", "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:main"
      ]
    }
  }
}
Self-Hosted MCP Server in WSL2

If you run the MCP HTTP server inside WSL2 and connect from Windows-side clients:

WSL2 NAT networking issue: By default, localhost inside WSL2 is not reachable from Windows. Two workarounds:

  1. Enable mirrored networking (Windows 11 22H2+): add to %USERPROFILE%\.wslconfig:

    [wsl2]
    networkingMode=mirrored

    Then restart WSL: wsl --shutdown. After this, localhost is shared between Windows and WSL2.

  2. Use the WSL2 VM IP: run hostname -I inside WSL to get the IP, then connect to http://<WSL_IP>:8000/api instead of http://localhost:8000/api.
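Workaround 2 can be scripted. This sketch, run inside WSL, prints the endpoint URL a Windows-side client should use, assuming the server listens on port 8000 as in the examples above:

```shell
#!/bin/sh
# Take the first address reported by `hostname -I` (the WSL2 VM IP)
# and print the endpoint for Windows-side MCP clients.
WSL_IP=$(hostname -I | awk '{print $1}')
echo "http://${WSL_IP}:8000/api"
```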

Common WSL Pitfalls
Problem Cause Fix
docker: command not found Docker not in WSL PATH Enable Docker Desktop WSL integration for your distro, or use full path /usr/bin/docker
ENOENT / spawn error for npx or python Binary not in non-interactive shell PATH Use absolute path (e.g., /home/user/.nvm/versions/node/v20/bin/npx)
Environment variables missing WSL non-login shell doesn't source .bashrc Add vars explicitly in MCP config env block
Connection refused to self-hosted server WSL2 NAT isolates localhost Enable mirrored networking or use WSL2 VM IP
Claude Desktop can't reach WSL MCP server Claude Desktop doesn't support WSL subprocess spawning Use Remote HTTP, Docker Desktop, or wsl.exe proxy

🐞 Troubleshooting

Quick Diagnostics

  1. Test the hosted service:

    curl https://mcp.codealive.ai/health
  2. Check your API key:

    curl -H "Authorization: Bearer YOUR_API_KEY" https://app.codealive.ai/api/v1/data_sources
  3. Enable debug logging: Add --debug to local server args

Common Issues

  • "Connection refused" → Check internet connection
  • "401 Unauthorized" → Verify your API key
  • "No repositories found" → Check API key permissions in the CodeAlive dashboard
  • Client-specific logs → See your AI client's documentation for MCP logs

Windows / WSL Issues

  • docker: command not found in WSL → Enable Docker Desktop WSL integration for your distro (Settings → Resources → WSL integration), or use the full path /usr/bin/docker
  • ENOENT or spawn error for npx/python → Non-interactive WSL shells don't inherit nvm/pyenv paths. Use absolute paths in MCP configs
  • Connection refused to self-hosted server in WSL2 → WSL2 uses NAT networking; localhost differs between Windows and WSL2. Enable mirrored networking in .wslconfig or use the WSL2 VM IP (hostname -I)
  • Claude Desktop can't connect to WSL MCP server → Claude Desktop doesn't support WSL subprocess spawning. Use Remote HTTP (https://mcp.codealive.ai/api), Docker Desktop, or the wsl.exe proxy pattern (see the Windows & WSL section)



📦 Publishing to MCP Registry

For maintainers: see DEPLOYMENT.md for instructions on publishing new versions to the MCP Registry.


Privacy Policy

CodeAlive processes the repositories and queries you send through this extension in order to provide semantic search and codebase analysis. For complete privacy details, see CodeAlive Privacy Policy.


📄 License

MIT License - see LICENSE file for details.


Ready to supercharge your AI assistant with deep code understanding?
Get started now →
