Drop-in Model Context Protocol server. Seven deterministic tools. Rust core, stdio transport, zero-panic in production. Persistent memory for agents you build — without paying tokens on the retrieval path.
If you've surveyed the MCP memory landscape you've seen the same pattern: the tool is named recall, but under the hood it embeds the query, calls a cloud model to summarize, or proxies through OpenAI. Every turn your agent takes, the memory layer taxes it. The token bill compounds.
The other half of the landscape is flat-file: CLAUDE.md, markdown drops, prose injected into the prompt. That's fine for "the user prefers async/await," useless for "enumerate every caller of DatabaseClient.connect."
If you're shipping an MCP-based agent to real users, neither is acceptable infrastructure.
Does foo.bar exist? The query path answers member-access questions deterministically. Plus the write path: remember, learn, verify, connect, strengthen, forget, consolidate. Hebbian reinforcement and decay run in a background tokio task.
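The engine's source is not published, so the following is only an illustrative sketch of the general pattern named above: Hebbian-style reinforcement on co-activation plus exponential decay for unused edges. The learning rate and decay constant are invented for illustration.

```python
import math

# Hypothetical constants -- the real engine's parameters are not public.
LEARN_RATE = 0.3      # assumed reinforcement step per co-activation
DECAY_LAMBDA = 0.01   # assumed decay constant per idle time unit

def strengthen(weight: float) -> float:
    """Co-activation pushes an edge weight toward 1.0 (Hebbian rule)."""
    return weight + LEARN_RATE * (1.0 - weight)

def decay(weight: float, dt: float) -> float:
    """Unused edges decay exponentially toward 0 over elapsed time dt."""
    return weight * math.exp(-DECAY_LAMBDA * dt)

w = 0.5
w = strengthen(w)            # reinforced on recall: 0.5 -> ~0.65
w_idle = decay(w, dt=10.0)   # ten idle time units later, weight shrinks
```

Running this kind of update in a background task means the retrieval path itself stays a pure read: no model call, no token spend.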
{
  "mcpServers": {
    "argosbrain": {
      "command": "argosbrain-mcp",
      "args": ["--project", "/path/to/repo"]
    }
  }
}
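The config above is ordinary MCP client configuration: the host reads command and args and spawns the server as a child process speaking JSON-RPC over stdio. A minimal sketch of that step, using the exact values from the example (the spawn itself is commented out, since the binary may not be on your PATH):

```python
import json

# The same JSON block a host application would read (from the example above).
config = json.loads("""
{
  "mcpServers": {
    "argosbrain": {
      "command": "argosbrain-mcp",
      "args": ["--project", "/path/to/repo"]
    }
  }
}
""")

server = config["mcpServers"]["argosbrain"]
argv = [server["command"], *server["args"]]

# A host would now spawn the process and drive it over stdin/stdout:
# proc = subprocess.Popen(argv, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
print(argv)
```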
Stdio transport. No network. No API key. The binary starts in under 100ms and holds the full symbol graph in ~50MB RAM for typical repos.
If you're building a custom agent with the Claude Agent SDK, TypeScript MCP SDK, or Python SDK — the server speaks standard MCP. No vendor lock-in. No bespoke protocol.
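Because the transport is plain JSON-RPC over stdio, any MCP SDK, or a few lines of hand-rolled code, can drive the server. A hedged sketch of the two requests every session starts with: the standard initialize handshake, then a tools/call. The tool name remember comes from the write-path list above, but the argument shape shown is an assumption; consult the server's tools/list response for the real input schema.

```python
import json

def rpc(method: str, params: dict, msg_id: int) -> str:
    """Frame one JSON-RPC 2.0 request as a newline-delimited message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    }) + "\n"

# Standard MCP handshake (protocolVersion per the MCP spec).
init = rpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
}, msg_id=1)

# A write-path call. The "note" argument is a hypothetical shape.
call = rpc("tools/call", {
    "name": "remember",
    "arguments": {"note": "user prefers async/await"},
}, msg_id=2)

# Both strings are written to the server's stdin; replies arrive on stdout.
```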
The benchmark (LongMemCode) is MIT-licensed, and every number on this page is reproducible on your laptop using it. The engine itself is commercial — the source is not published, but adapter stubs for competitors in the benchmark repo let anyone run the comparison.
Benchmark and adapter stubs: github.com/CataDef/neurogenesis