Installation

Engrams runs as an MCP server and requires Python 3.10+. The recommended way to run it is uvx, which manages the Python environment automatically.

Prerequisites

  • Python 3.10+ (download from python.org)
  • uv (recommended) — fast Python package manager. Install with:
    curl -LsSf https://astral.sh/uv/install.sh | sh
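After installing uv, confirm it is on your PATH by checking its version:

uv --version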

Option 1 — uvx (recommended)

uvx runs Engrams directly from PyPI without a manual install step. Add the following to your MCP client's configuration file:

{
  "mcpServers": {
    "engrams": {
      "command": "uvx",
      "args": [
        "--from", "engrams-mcp",
        "engrams-mcp",
        "--mode", "stdio",
        "--log-level", "INFO"
      ]
    }
  }
}
💡 Workspace detection is automatic. You do not need to pass --workspace_id. Engrams detects the active project directory per-call using common project indicators (.git, package.json, etc.).
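Before restarting your MCP client, you can optionally run the same command from a terminal to confirm the package resolves from PyPI. In stdio mode the server should simply start and wait for input; stop it with Ctrl+C:

uvx --from engrams-mcp engrams-mcp --mode stdio --log-level INFO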

Option 2 — pip install

pip install engrams-mcp
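If you installed with pip, you can confirm the package and its engrams-mcp console command are available before editing the config:

pip show engrams-mcp      # prints package metadata if the install succeeded
command -v engrams-mcp    # prints the path of the installed entry point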

Then reference the installed command in your MCP config:

{
  "mcpServers": {
    "engrams": {
      "command": "engrams-mcp",
      "args": ["--mode", "stdio"]
    }
  }
}

Option 3 — Developer install (from source)

git clone https://github.com/stevebrownlee/engrams-mcp.git
cd engrams-mcp

# Create virtual environment
uv venv
source .venv/bin/activate   # macOS/Linux
# .venv\Scripts\activate   # Windows

# Install dependencies
uv pip install -r requirements.txt

Then in your MCP config, point to the local checkout:

{
  "mcpServers": {
    "engrams": {
      "command": "uv",
      "args": [
        "--directory", "/path/to/engrams-mcp",
        "run", "engrams-mcp",
        "--mode", "stdio"
      ]
    }
  }
}
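To test the source checkout directly, run the same invocation the config uses from a terminal (substitute the real path to your clone):

uv --directory /path/to/engrams-mcp run engrams-mcp --mode stdio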

Where is the config file?

Tool              Config file location
Roo Code          ~/.roo/mcp.json or workspace .roo/mcp.json
Cline             VS Code settings → Cline → MCP Servers
Cursor            ~/.cursor/mcp.json
Windsurf          ~/.codeium/windsurf/mcp_config.json
Claude Code       ~/.claude/mcp.json
Claude Desktop    ~/Library/Application Support/Claude/claude_desktop_config.json
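MCP config files are plain JSON, and a stray comma is a common reason a server silently fails to load. After editing, you can validate the file with Python's built-in JSON tool (the Cursor path is shown as an example; substitute the path for your tool from the table above):

python3 -m json.tool ~/.cursor/mcp.json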

Install the AI strategy

After the MCP server is configured, run the engrams init command from your project root to scaffold the correct custom-instructions file for your AI tool:

# See all supported tools
engrams init --list

# Scaffold for your tool
engrams init --tool roo          # → .roo/rules/engrams_strategy
engrams init --tool cline        # → .clinerules
engrams init --tool cursor       # → .cursorrules
engrams init --tool windsurf     # → .windsurfrules
engrams init --tool claude-code  # → CLAUDE.md
engrams init --tool generic      # → engrams_strategy.md
What does this do? The strategy file tells your AI how and when to use Engrams tools — when to log decisions, how to retrieve context, and how to format responses. Without it, the AI won't know Engrams exists.
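After running init, the strategy file should exist at the path listed for your tool. A quick check, using the generic output as an example:

ls engrams_strategy.md
head engrams_strategy.md   # preview the first lines of the generated instructions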

Verify the installation

In a new chat, ask your AI:

What is the Engrams schema? List all available MCP tools.

If Engrams is connected, the AI will respond with a full list of tools.
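If your client does not pick the server up, you can also launch it under the MCP Inspector (an optional, separate Node.js tool) to browse the tool list directly; the invocation below assumes the Inspector's usual pattern of wrapping the server command:

npx @modelcontextprotocol/inspector uvx --from engrams-mcp engrams-mcp --mode stdio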


Next: Quick Start →