# MCP Protocol

The Model Context Protocol (MCP) is an open standard for connecting AI models to external tools and data sources. Engrams is an MCP server that exposes project memory as a set of callable tools.

## What is MCP?

MCP defines a standard interface between AI clients (such as your IDE's AI extension) and servers (such as Engrams). The client discovers the tools a server offers, calls them with structured arguments, and receives structured responses. This decouples the AI model from any particular storage backend.
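
Concretely, MCP messages are JSON-RPC 2.0. A minimal discovery exchange looks roughly like this; the tool entry shown is a generic placeholder, not an actual Engrams definition.

Request:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

Response:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "example_tool",
        "description": "What the tool does",
        "inputSchema": {
          "type": "object",
          "properties": { "query": { "type": "string" } },
          "required": ["query"]
        }
      }
    ]
  }
}
```

The `inputSchema` is a JSON Schema describing the arguments the client must supply when it calls the tool.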

## How Engrams uses MCP

Every Engrams capability is exposed as an MCP tool: `log_decision`, `get_relevant_context`, `bind_code_to_item`, and so on. Your AI assistant calls these tools automatically as it works, without you having to prompt it. The strategy file (installed via `engrams init`) teaches the AI when and how to call each tool.
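
As a sketch, a call to `log_decision` travels as a `tools/call` request. The argument names below are hypothetical; the actual schema is defined by the Engrams server and reported via `tools/list`.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "log_decision",
    "arguments": {
      "summary": "Use PostgreSQL for persistence",
      "rationale": "Team already operates Postgres in production"
    }
  }
}
```

The server replies with a result the assistant can read back into its context:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [ { "type": "text", "text": "Decision logged." } ],
    "isError": false
  }
}
```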

## Supported clients

| Client | Config method | Init command |
| --- | --- | --- |
| Roo Code | `mcp.json` | `engrams init --tool roo` |
| Cline | VS Code settings | `engrams init --tool cline` |
| Cursor | `~/.cursor/mcp.json` | `engrams init --tool cursor` |
| Windsurf | `mcp_config.json` | `engrams init --tool windsurf` |
| Claude Code | `CLAUDE.md` | `engrams init --tool claude-code` |
| Claude Desktop | `claude_desktop_config.json` | `engrams init --tool claude-desktop` |
| Any MCP client | Varies | `engrams init --tool generic` |
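
Most of the JSON-based clients above end up with an entry along these lines. This is a sketch of the common `mcpServers` format; the exact keys `engrams init` writes for your client may differ.

```json
{
  "mcpServers": {
    "engrams": {
      "command": "engrams",
      "args": ["serve", "--mode", "stdio"]
    }
  }
}
```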

## Communication modes

### stdio

The default and recommended mode for local IDE use. The client spawns the Engrams process and communicates via stdin/stdout. Zero network overhead.

```bash
engrams serve --mode stdio
```

### HTTP

Runs Engrams as a FastAPI web server. Useful when multiple agents need to share one Engrams instance, or for remote access.

```bash
engrams serve --mode http --host 0.0.0.0 --port 8000
```

> ⚠️ **HTTP mode security:** The HTTP server has no authentication by default. Only expose it on trusted networks.

## Further reading