# MCP Protocol
The Model Context Protocol (MCP) is an open standard for connecting AI models to external tools and data sources. Engrams is an MCP server that exposes project memory as a set of callable tools.
## What is MCP?
MCP defines a standard interface between AI clients (your IDE extension) and servers (Engrams). The client discovers available tools, calls them with structured arguments, and receives structured responses. This decouples the AI from any particular storage backend.
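On the wire, MCP messages are JSON-RPC 2.0: the client lists tools with `tools/list`, then invokes one with `tools/call`. A minimal sketch of that exchange is below — the `log_decision` tool name comes from this page, but the argument shown is illustrative, not Engrams' actual schema.

```python
import json

# 1. The client asks the server what tools it offers.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# 2. The client invokes a discovered tool with structured arguments.
#    (The "decision" argument here is a placeholder, not the real schema.)
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "log_decision",
        "arguments": {"decision": "Use Postgres for persistence"},
    },
}

# Both messages travel as serialized JSON.
wire = json.dumps(call_request)
print(json.loads(wire)["params"]["name"])  # log_decision
```

Because the client only sees tool names and JSON schemas, it never needs to know how Engrams stores anything — that is the decoupling the protocol buys.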
## How Engrams uses MCP

Every Engrams capability is exposed as an MCP tool — `log_decision`, `get_relevant_context`, `bind_code_to_item`, and so on. Your AI assistant calls these tools automatically as it works, without you having to prompt it. The strategy file (installed via `engrams init`) teaches the AI when and how to call each tool.
## Supported clients

| Client | Config method | Init command |
|---|---|---|
| Roo Code | `mcp.json` | `engrams init --tool roo` |
| Cline | VS Code settings | `engrams init --tool cline` |
| Cursor | `~/.cursor/mcp.json` | `engrams init --tool cursor` |
| Windsurf | `mcp_config.json` | `engrams init --tool windsurf` |
| Claude Code | `CLAUDE.md` | `engrams init --tool claude-code` |
| Claude Desktop | `claude_desktop_config.json` | `engrams init --tool claude-desktop` |
| Any MCP client | Varies | `engrams init --tool generic` |
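For clients configured through an `mcp.json`-style file, the registered entry generally looks like the following. This is a sketch of the common MCP client convention — the exact keys and the server name are assumptions, not verbatim `engrams init` output; run the init command for your client to get the real entry.

```json
{
  "mcpServers": {
    "engrams": {
      "command": "engrams",
      "args": ["serve", "--mode", "stdio"]
    }
  }
}
```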
## Communication modes

### stdio

The default and recommended mode for local IDE use. The client spawns the Engrams process and communicates via stdin/stdout. Zero network overhead.

```shell
engrams serve --mode stdio
```

### HTTP

Runs as a FastAPI web server. Useful when multiple agents need to share one Engrams instance, or for remote access.
```shell
engrams serve --mode http --host 0.0.0.0 --port 8000
```
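In HTTP mode, any client that can POST JSON-RPC can talk to the shared instance. A minimal sketch using only the standard library is below — the `/mcp` path and port are assumptions about the server's endpoint, not documented Engrams behavior; check the server's startup output for the actual URL.

```python
import json
import urllib.request

# Assumed endpoint of a locally running `engrams serve --mode http` instance.
ENDPOINT = "http://localhost:8000/mcp"

# Build a JSON-RPC tools/list request — the same message a stdio client
# would write to stdin, just delivered over HTTP instead.
payload = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()

request = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json"},
)

def list_tools():
    # Performs the actual round trip; requires the server to be running.
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```

The transport changes, but the messages do not — which is why the same strategy file and tools work in either mode.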