Now Your AI Will Remember Everything.
Persistent memory for Claude, Cursor, Windsurf, and any MCP client. Runs locally. Zero cloud dependencies. Automatically deprioritizes outdated knowledge so your AI never hallucinates from stale decisions.
$ pip install git+https://github.com/Arkya-AI/anchor-mcp.git
$ anchor-mcp init
# That's it. Your AI now has permanent memory.
Tired of Re-Explaining Your Stack?
Every new chat window starts from zero. You explain your tech stack, your architecture decisions, your naming conventions — again. Anchor remembers everything across sessions, clients, and projects.
Getting Suggestions for Code You Abandoned?
You migrated from JWT to OAuth last week, but your AI still suggests JWT patterns. Without drift detection, old memories are just as “relevant” as new ones. Anchor detects the evolution and deprioritizes the old.
Context Doesn't Follow You?
Debug in Claude Desktop during the day, refactor in Cursor at night — but each tool starts fresh. Anchor shares memory across all MCP clients, so you pick up exactly where you left off.
Key Features
Everything you need for AI that actually remembers.
Cross-Session Memory
Close your laptop Friday, open a fresh chat Monday — the AI picks up exactly where you left off. No summaries needed.
100% Private & Local
Runs on CPU (~500MB disk, ~200MB RAM). No API keys, no cloud. Paste NDAs, proprietary code — nothing leaves your machine.
Temporal Intelligence
Memories ranked by recency and access frequency. Upgrade from React 17 to 19? It stops suggesting the old syntax.
Drift Detection
Automatically detects when knowledge regions shift. Migrate REST to GraphQL? Anchor notices and stops suggesting REST endpoints.
Source Linking
Every memory traces back to its origin. When the AI says “we chose Kubernetes,” it points to the exact doc where that was recorded.
Zero Configuration
Two commands to install. No vector databases to spin up, no embeddings to configure, no memory dashboard to prune.
Why Drift Detection Matters
Standard vector stores suffer from “semantic collision” — old memories match new queries because they’re semantically similar. Anchor detects the shift.
You tell Claude your project uses PostgreSQL.
Anchor stores memories about schemas and SQL drivers in the “databases” region of the vector space.
You migrate to MongoDB.
New memories about documents and collections flow into the same region.
Without Anchor
The old “we use PostgreSQL” memory has high similarity to “database queries.” Claude confidently gives you SQL syntax for a database you no longer use.
Hallucination based on stale memory.
With Anchor
Anchor detects the “databases” region has shifted significantly. PostgreSQL memories are flagged stale and ranked 10x lower.
Claude retrieves only MongoDB context.
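To make this concrete, here is a minimal sketch of region-level drift detection in Python. The Memory record, the 0.35 threshold, and the single centroid comparison are simplifying assumptions; Anchor's actual pipeline uses Laplacian-smoothed statistics over Voronoi cells, as described under How It Works below.

# Minimal sketch: flag a region's older memories as stale once its center of
# mass has drifted, then rank them 10x lower at retrieval time.
import numpy as np
from dataclasses import dataclass

@dataclass
class Memory:               # hypothetical record, not Anchor's schema
    vector: np.ndarray      # 384-dim embedding
    timestamp: float        # unix seconds
    stale: bool = False

def flag_stale_if_drifted(region_memories, split_time, threshold=0.35):
    """Compare the region's 'before' and 'after' centroids; if they have
    drifted past the threshold, flag the older memories as stale."""
    old = [m for m in region_memories if m.timestamp < split_time]
    new = [m for m in region_memories if m.timestamp >= split_time]
    if not old or not new:
        return 0.0
    c_old = np.mean([m.vector for m in old], axis=0)
    c_new = np.mean([m.vector for m in new], axis=0)
    drift = 1.0 - float(np.dot(c_old, c_new) /
                        (np.linalg.norm(c_old) * np.linalg.norm(c_new)))
    if drift > threshold:
        for m in old:
            m.stale = True          # the PostgreSQL-era memories
    return drift

def ranking_score(similarity, memory, stale_penalty=10.0):
    # Stale memories stay retrievable, just ranked well below fresh ones.
    return similarity / stale_penalty if memory.stale else similarity

In this sketch, the PostgreSQL-era memories keep their raw similarity scores but drop far enough in the ranking that MongoDB context wins.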
How It Works
You just talk to your AI normally. Behind the scenes, Anchor clusters your conversations by topic, tracks freshness, and injects only relevant context. A rough sketch of how the pieces below fit together follows the list.
Local Embeddings
all-MiniLM-L6-v2 generates 384-dim vectors on CPU. No API calls.
Voronoi Partitioning
16 frozen centroids cluster knowledge by topic automatically.
FAISS Search
Meta's FAISS handles high-speed similarity search with temporal re-ranking.
Drift Pipeline
Statistical monitoring with Laplacian smoothing auto-flags stale memories.
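A rough sketch of how these four pieces could fit together, assuming the sentence-transformers and faiss-cpu packages. The centroids, the half-life constant, and the scoring formula are illustrative stand-ins, not Anchor's actual code.

# Illustrative pipeline: embed locally, assign a Voronoi cell, search FAISS,
# then re-rank by recency. Names and constants here are assumptions.
import time
import numpy as np
import faiss                                             # pip install faiss-cpu
from sentence_transformers import SentenceTransformer    # pip install sentence-transformers

model = SentenceTransformer("all-MiniLM-L6-v2")           # 384-dim vectors, CPU only
centroids = np.random.rand(16, 384).astype("float32")     # stand-in for 16 frozen centroids
index = faiss.IndexFlatIP(384)                            # inner-product similarity
texts, timestamps = [], []

def store(text):
    vec = model.encode([text], normalize_embeddings=True).astype("float32")
    cell = int(np.argmax(centroids @ vec[0]))             # nearest centroid = Voronoi cell
    index.add(vec)
    texts.append(text)
    timestamps.append(time.time())
    return cell

def recall(query, k=5, half_life_days=30.0):
    if index.ntotal == 0:
        return []
    vec = model.encode([query], normalize_embeddings=True).astype("float32")
    sims, ids = index.search(vec, min(k * 3, index.ntotal))   # over-fetch, then re-rank
    now = time.time()
    scored = []
    for sim, i in zip(sims[0], ids[0]):
        age_days = (now - timestamps[i]) / 86400.0
        freshness = 0.5 ** (age_days / half_life_days)         # exponential recency decay
        scored.append((float(sim) * freshness, texts[i]))
    return sorted(scored, reverse=True)[:k]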
Quick Start
Two commands. No config files to edit.
$ pip install git+https://github.com/Arkya-AI/anchor-mcp.git
$ anchor-mcp init
Anchor MCP — Setup
Checking for MCP clients...
✓ Claude Desktop found
✓ Claude Code found
✓ Cursor found
Registering Anchor with detected clients...
✓ Claude Desktop — registered
✓ Claude Code — registered
✓ Cursor — registered
Initializing storage at ~/.anchor/...
✓ Created directories
Downloading embedding model (first time only)...
✓ Model ready
Setup complete! Restart your AI clients to activate Anchor.
Manual setup for unsupported clients — see docs.
11 Tools for Your AI
Exposed via the Model Context Protocol to any connected LLM. An example call is sketched after the list.
Store & Recall
anchor_store
Save a named memory with importance level and tags
anchor_recall
Semantic search with temporal scoring
anchor_deep_recall
Recall + read source files for full context
anchor_learn
Auto-capture facts, preferences, decisions
Management
anchor_list
List all stored memories, filter by tag
anchor_delete
Remove a memory by ID
anchor_contradict
Mark outdated memory stale, store corrected version
Intelligence
anchor_auto
Auto-retrieve relevant context at conversation start
anchor_inspect
View Voronoi cell distribution and density
anchor_save_session
Save session summary with source linking
anchor_drift_check
Run drift detection to flag stale memories
Frequently Asked Questions
What is Anchor MCP?
A local MCP server that gives Claude, Cursor, Windsurf, and any other MCP client persistent memory across sessions and projects, with drift detection to keep that memory current.
Which AI tools does it work with?
Any MCP client. The anchor-mcp init command auto-detects and registers with all installed clients; unsupported clients can be configured manually (see docs).
Does my data leave my machine?
No. Everything stays in ~/.anchor/ on your filesystem. No API keys, no cloud services, no external calls.
What is drift detection?
Statistical monitoring of the regions where related memories cluster. When a region shifts significantly (say, after a PostgreSQL-to-MongoDB migration), the older memories are flagged stale and ranked far lower.
How much disk space and RAM does it need?
About 500MB of disk and 200MB of RAM. It runs entirely on CPU, with no GPU required.
How do I install it?
pip install git+https://github.com/Arkya-AI/anchor-mcp.git then anchor-mcp init. The init command auto-detects clients, registers Anchor, creates storage, and downloads the embedding model.
Do I need to manually manage memories?
No. Anchor auto-captures facts, preferences, and decisions, and drift detection deprioritizes stale entries on its own. anchor_list and anchor_delete are available if you want manual control.
What happens when I correct outdated information?
The AI uses anchor_contradict to mark the old memory as stale and store the corrected version. The old information stays retrievable but is ranked much lower.