Open Source · MIT Licensed · 100% Local

Now Your AI Will Remember
Everything.

Persistent memory for Claude, Cursor, Windsurf, and any MCP client. Runs locally. Zero cloud dependencies. Automatically deprioritizes outdated knowledge so your AI doesn't hallucinate based on stale decisions.


$ pip install git+https://github.com/Arkya-AI/anchor-mcp.git

$ anchor-mcp init

# That's it. Your AI now has permanent memory.

Tired of Re-Explaining Your Stack?

Every new chat window starts from zero. You explain your tech stack, your architecture decisions, your naming conventions — again. Anchor remembers everything across sessions, clients, and projects.

Getting Suggestions for Code You Abandoned?

You migrated from JWT to OAuth last week, but your AI still suggests JWT patterns. Without drift detection, old memories are just as “relevant” as new ones. Anchor detects the evolution and deprioritizes the old.

Context Doesn't Follow You?

Debug in Claude Desktop during the day, refactor in Cursor at night — but each tool starts fresh. Anchor shares memory across all MCP clients, so you pick up exactly where you left off.

Key Features

Everything you need for AI that actually remembers.

Cross-Session Memory

Close your laptop Friday, open a fresh chat Monday — the AI picks up exactly where you left off. No summaries needed.

100% Private & Local

Runs on CPU (~500MB disk, ~200MB RAM). No API keys, no cloud. Paste NDAs, proprietary code — nothing leaves your machine.

Temporal Intelligence

Memories ranked by recency and access frequency. Upgrade from React 17 to 19? It stops suggesting the old syntax.

Drift Detection

Automatically detects when knowledge regions shift. Migrate from REST to GraphQL? Anchor notices and stops suggesting REST endpoints.

Source Linking

Every memory traces back to its origin. When the AI says “we chose Kubernetes,” it points to the exact doc where that was recorded.

Zero Configuration

Two commands to install. No vector databases to spin up, no embeddings to configure, no memory dashboard to prune.

Why Drift Detection Matters

Standard vector stores suffer from “semantic collision” — old memories match new queries because they’re semantically similar. Anchor detects the shift.

January

You tell Claude your project uses PostgreSQL.

Anchor stores memories about schemas and SQL drivers in the “databases” region of the vector space.

April

You migrate to MongoDB.

New memories about documents and collections flow into the same region.

Without Anchor

The old “we use PostgreSQL” memory has high similarity to “database queries.” Claude confidently gives you SQL syntax for a database you no longer use.

Hallucination based on stale memory.

With Anchor

Anchor detects the “databases” region has shifted significantly. PostgreSQL memories are flagged stale and ranked 10x lower.

Claude retrieves only MongoDB context.

How It Works

You just talk to your AI normally. Behind the scenes, Anchor clusters your conversations by topic, tracks freshness, and injects only relevant context.

1

Local Embeddings

all-MiniLM-L6-v2 generates 384-dim vectors on CPU. No API calls.

2

Voronoi Partitioning

16 frozen centroids cluster knowledge by topic automatically.
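Voronoi partitioning here amounts to nearest-centroid assignment: each memory vector belongs to the cell of whichever frozen centroid it sits closest to. A pure-Python sketch (two made-up 2-D centroids for readability; Anchor's real cells live in 384-dim space with 16 centroids):

```python
import math

# Hypothetical frozen centroids -- illustrative 2-D stand-ins.
centroids = [(0.0, 1.0), (1.0, 0.0)]

def assign_cell(vector):
    """Return the index of the nearest centroid, i.e. the Voronoi cell."""
    return min(
        range(len(centroids)),
        key=lambda i: math.dist(vector, centroids[i]),
    )

print(assign_cell((0.1, 0.9)))  # 0 -- closer to (0.0, 1.0)
print(assign_cell((0.8, 0.2)))  # 1 -- closer to (1.0, 0.0)
```

Because the centroids are frozen, a cell's identity is stable over time, which is what makes per-region drift statistics meaningful.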

3

FAISS Search

Meta's FAISS handles high-speed similarity search with temporal re-ranking.
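FAISS returns raw similarity scores; temporal re-ranking then blends in freshness and usage. One plausible scoring rule, sketched in plain Python (the exponential half-life and log-frequency bonus are assumptions for illustration, not Anchor's published formula):

```python
import math, time

def temporal_score(similarity, last_access_ts, access_count,
                   half_life_days=30.0, now=None):
    """Blend raw vector similarity with recency and access frequency.

    Illustrative weighting: recency halves every `half_life_days`,
    and frequently-accessed memories get a logarithmic bonus.
    """
    now = now or time.time()
    age_days = (now - last_access_ts) / 86400
    recency = 0.5 ** (age_days / half_life_days)
    frequency = 1 + math.log1p(access_count)
    return similarity * recency * frequency

now = time.time()
fresh = temporal_score(0.80, now - 2 * 86400, access_count=5, now=now)
stale = temporal_score(0.85, now - 120 * 86400, access_count=1, now=now)
print(fresh > stale)  # the fresher memory outranks a slightly closer stale one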

4

Drift Pipeline

Statistical monitoring with Laplacian smoothing auto-flags stale memories.
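One way to picture this: track how memories distribute across the Voronoi cells over time, and compare the old and new occupancy distributions with a divergence measure. The KL divergence and the threshold idea below are illustrative assumptions, but the Laplace (add-alpha) smoothing is the standard trick for keeping empty cells from producing divide-by-zero:

```python
import math

def kl_drift(old_counts, new_counts, alpha=1.0):
    """KL divergence between old and new cell-occupancy distributions,
    with Laplace (add-alpha) smoothing for empty cells."""
    k = len(old_counts)
    old_total = sum(old_counts) + alpha * k
    new_total = sum(new_counts) + alpha * k
    drift = 0.0
    for o, n in zip(old_counts, new_counts):
        p = (n + alpha) / new_total   # current distribution
        q = (o + alpha) / old_total   # historical distribution
        drift += p * math.log(p / q)
    return drift

# Hypothetical counts for 4 cells: the "databases" cell (index 2)
# fills up with MongoDB memories after a migration.
before = [10, 8, 12, 5]
after_ = [10, 8, 40, 5]
print(kl_drift(before, before))           # 0.0 -- no drift
print(kl_drift(before, after_) > 0.1)     # drift in the databases cell
```

When a cell's divergence crosses a threshold, the memories written before the shift are the ones flagged stale.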

Quick Start

Two commands. No config files to edit.

terminal

$ pip install git+https://github.com/Arkya-AI/anchor-mcp.git

$ anchor-mcp init

Anchor MCP — Setup

Checking for MCP clients...

  ✓ Claude Desktop found

  ✓ Claude Code found

  ✓ Cursor found

Registering Anchor with detected clients...

  ✓ Claude Desktop — registered

  ✓ Claude Code — registered

  ✓ Cursor — registered

Initializing storage at ~/.anchor/...

  ✓ Created directories

Downloading embedding model (first time only)...

  ✓ Model ready

Setup complete! Restart your AI clients to activate Anchor.

Manual setup for unsupported clients — see docs.

11 Tools for Your AI

Exposed via the Model Context Protocol to any connected LLM.

Store & Recall

anchor_store

Save a named memory with importance level and tags

anchor_recall

Semantic search with temporal scoring

anchor_deep_recall

Recall + read source files for full context

anchor_learn

Auto-capture facts, preferences, decisions

Management

anchor_list

List all stored memories, filter by tag

anchor_delete

Remove a memory by ID

anchor_contradict

Mark outdated memory stale, store corrected version

Intelligence

anchor_auto

Auto-retrieve relevant context at conversation start

anchor_inspect

View Voronoi cell distribution and density

anchor_save_session

Save session summary with source linking

anchor_drift_check

Run drift detection to flag stale memories

Frequently Asked Questions

What is Anchor MCP?
Anchor MCP is a Model Context Protocol server that gives your AI long-term memory. It stores, retrieves, and manages knowledge across all your conversations and AI tools — locally on your machine with zero cloud dependencies.
Which AI tools does it work with?
Any MCP-compatible client: Claude Desktop, Claude Code, Cursor, Windsurf, and more. The anchor-mcp init command auto-detects and registers with all installed clients.
Does my data leave my machine?
No. Everything runs locally — embeddings are generated on CPU using all-MiniLM-L6-v2, and all data is stored in ~/.anchor/ on your filesystem. No API keys, no cloud services, no external calls.
What is drift detection?
Drift detection monitors how your knowledge changes over time. When you migrate technologies (e.g., PostgreSQL to MongoDB), Anchor statistically detects that the “databases” knowledge region has shifted and flags old memories as stale, preventing hallucinations from outdated information.
How much disk space and RAM does it need?
About 500MB disk (mostly the embedding model) and ~200MB RAM. It runs entirely on CPU — no GPU required.
How do I install it?
Two commands: pip install git+https://github.com/Arkya-AI/anchor-mcp.git then anchor-mcp init. The init command auto-detects clients, registers Anchor, creates storage, and downloads the embedding model.
Do I need to manually manage memories?
No. Anchor automatically captures important information from your conversations (decisions, preferences, facts) and manages staleness through drift detection. You just talk to your AI normally.
What happens when I correct outdated information?
The AI uses anchor_contradict to mark the old memory as stale and store the corrected version. Old information is still retrievable but ranked much lower.
Can I use it across multiple projects?
Yes. Anchor memory is global — it works across all projects, sessions, and MCP clients. You can tag memories by project for organization.
Is it open source?
Yes. MIT licensed. View the source at github.com/Arkya-AI/anchor-mcp.