v0.1 · open source

Shared memory for your AI agents.

Dory is a local-first memory daemon. Every agent — Claude, Codex, Hermes, anything MCP-aware — reads from the same markdown corpus on your disk. Five verbs. One folder. No cloud.

MIT  ·  Python 3.12+  ·  Native MCP  ·  Local by default

The problem

Every agent
starts from scratch.

Most developers now use more than one AI agent. One for code, one for content, one for personal automations. Each one is capable. None of them share context — so you re-introduce yourself, your project, and your decisions every time you switch tools.

Claude · content + frontend
> Hi, I'm working on a project called Atlas. Stack is Python + uv. The person I keep mentioning is Alex (not Alexei). Last Thursday we decided to ship the eval suite first. Today I want to—
Codex · writing code
> Hi, I'm working on a project called Atlas. Stack is Python + uv. The person I keep mentioning is Alex (not Alexei). Last Thursday we decided to ship the eval suite first. Today I want to—
Hermes · personal ops
> Hi, I'm working on a project called Atlas. Stack is Python + uv. The person I keep mentioning is Alex (not Alexei). Last Thursday we decided to ship the eval suite first. Today I want to—
OpenClaw · background automations
> Hi, I'm working on a project called Atlas. Stack is Python + uv. The person I keep mentioning is Alex (not Alexei). Last Thursday we decided to ship the eval suite first. Today I want to—

A real tax — minutes lost per session, repeated across every agent, every day.

How it works

One corpus.
Every agent.

Dory runs as a small daemon on your machine. Each agent reads from and writes to the same shared corpus. Markdown is the source of truth. SQLite is a disposable index that can be rebuilt at any time.

Pick up a project after two weeks and the agent already knows the active focus. Switch from Claude to Codex mid-task without re-pasting context. Tell Dory something once and every agent has it.

See it in action

Four agents, one shared brain.

Real verbs. Real markdown. The corpus is the source of truth; the agents read and write through Dory.

Four agents, one corpus

Claude · content + frontend
Codex · writing code
Hermes · personal ops
OpenClaw · background automations
# core/active.md

## Current focus
- User: Dee. Building Atlas (Python + uv).

## Last updated
_by claude · just now_

1 fact · local markdown

Each agent issues real Dory verbs against this corpus. The corpus is one folder of markdown.

The API

Five verbs. That's the whole surface.

Learn them in a sitting. The same five reach the corpus over CLI, HTTP, or MCP.

  1. wake · load bounded hot context at the start of a session
  2. search · hybrid BM25 + vector search across the corpus
  3. get · fetch exact markdown with a content hash
  4. memory-write · semantic writes auto-routed to canonical pages
  5. link · backlinks and graph relationships

Same five everywhere. Everything else is markdown you can open in any text editor.
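To make the `search` verb concrete: hybrid retrieval blends a lexical score with a vector-similarity score. The toy scorer below illustrates that blend under simple bag-of-words assumptions; it is a sketch, not Dory's actual ranking or schema.

```python
import math
from collections import Counter

# Toy hybrid scorer: blends a BM25-style lexical score with cosine
# similarity over bag-of-words vectors. Illustrative only -- the
# corpus paths and contents here are made up.
DOCS = {
    "core/active.md": "atlas is the active focus python uv",
    "people/alex.md": "alex not alexei works on atlas evals",
}

def bm25(query, doc, corpus, k1=1.5, b=0.75):
    words = doc.split()
    avgdl = sum(len(d.split()) for d in corpus.values()) / len(corpus)
    tf = Counter(words)
    score = 0.0
    for term in query.split():
        df = sum(1 for d in corpus.values() if term in d.split())
        if df == 0:
            continue
        idf = math.log(1 + (len(corpus) - df + 0.5) / (df + 0.5))
        f = tf[term]
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(words) / avgdl))
    return score

def cosine(query, doc):
    q, d = Counter(query.split()), Counter(doc.split())
    dot = sum(q[t] * d[t] for t in q)
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(sum(v * v for v in d.values()))
    return dot / norm if norm else 0.0

def hybrid_search(query, corpus=DOCS, alpha=0.5):
    # Weighted sum of the two scores; alpha trades lexical vs. vector weight.
    scored = [(alpha * bm25(query, doc, corpus) + (1 - alpha) * cosine(query, doc), path)
              for path, doc in corpus.items()]
    return [path for _, path in sorted(scored, reverse=True)]

print(hybrid_search("atlas focus"))  # core/active.md ranks first
```

In a real deployment the vector half would come from an embedding model rather than raw term counts, but the blending step looks the same.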

Surfaces

One corpus, many ways in.

Run the daemon once. Reach the same corpus from the terminal, an MCP-aware editor, an HTTP call, your browser, or an Obsidian vault.

  • HTTP daemon · POST /v1/memory-write
  • Browser wiki · 127.0.0.1:8766/wiki
  • OpenClaw + Hermes plugins · memory_mode: hybrid
  • Obsidian-friendly · data/corpus/
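A minimal client sketch for the HTTP surface. The endpoint path and local port come from the list above; the JSON field names (`text`, `kind`) are assumptions about the payload shape, not Dory's documented schema.

```python
import json
from urllib import request, error

# Assumed local daemon address (the docs bind to 127.0.0.1:8766 by default).
DORY = "http://127.0.0.1:8766"

def build_write(text, kind="note"):
    """Construct a request for POST /v1/memory-write.

    The body shape is a guess; check the daemon's API reference before
    relying on these field names.
    """
    body = json.dumps({"text": text, "kind": kind}).encode()
    return request.Request(
        f"{DORY}/v1/memory-write",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_write("Atlas is the active focus", kind="decision")
    try:
        with request.urlopen(req, timeout=2) as resp:
            print(resp.status, resp.read().decode())
    except (error.URLError, OSError):
        print("daemon not running on 127.0.0.1:8766")
```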

Local-first by design

Markdown is the truth. SQLite is disposable.

If the index ever corrupts, rebuild it from markdown. To read your memories, open them in any text editor. To delete them, delete the folder. There is no migration tool because there is nothing proprietary to migrate.

From the outside, Dory is a shared folder with good search and a thin write API. That turns out to be most of what shared agent memory actually needs.

  • No SaaS account.
  • No phone-home.
  • No vendor lock-in.
  • Diff it, back it up, version it like any other folder.

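The rebuild story can be sketched in a few lines: walk the markdown folder, load every page into a fresh SQLite table, and throw the old index away. Paths and schema here are illustrative, not Dory's actual layout.

```python
import sqlite3
import tempfile
from pathlib import Path

# Markdown is the source of truth; SQLite is a throwaway cache.
# This toy rebuild walks a corpus folder and indexes every page.
def rebuild_index(corpus_dir: Path, db_path: str = ":memory:") -> sqlite3.Connection:
    con = sqlite3.connect(db_path)
    con.execute("DROP TABLE IF EXISTS pages")
    con.execute("CREATE TABLE pages (path TEXT PRIMARY KEY, body TEXT)")
    for md in sorted(corpus_dir.rglob("*.md")):
        con.execute(
            "INSERT INTO pages VALUES (?, ?)",
            (md.relative_to(corpus_dir).as_posix(), md.read_text()),
        )
    con.commit()
    return con

# Demo against a temporary corpus (hypothetical layout).
with tempfile.TemporaryDirectory() as tmp:
    corpus = Path(tmp) / "corpus" / "core"
    corpus.mkdir(parents=True)
    (corpus / "active.md").write_text("# Current focus\n- Building Atlas\n")
    con = rebuild_index(Path(tmp) / "corpus")
    rows = con.execute("SELECT path FROM pages WHERE body LIKE '%Atlas%'").fetchall()
    print(rows)  # [('core/active.md',)]
```

Because the table is derived entirely from the folder, deleting the database loses nothing: run the rebuild again and you are back where you started.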
Why Dory exists

Most developers now run several AI agents in parallel. Different tools are good at different things, and the stack keeps growing. The cost is invisible until you notice you're explaining the same project, the same people, and the same decisions to every agent, every day.

Managed memory services exist. So do single-tool plugins. Both work — both put your context inside someone else's product.

Integrations get deprecated. Vendors pivot. Memory that lives behind an API is memory you can lose access to.

Dory takes the other path. A small daemon. A folder of markdown. Five verbs that any agent can call. Your memory stays on your machine, in a format you can read, edit, and back up like any other file.

Built and open-sourced because the shared layer underneath modern multi-agent workflows shouldn't belong to a vendor.

Dory, MIT-licensed, made in LA

Install

Get every agent on the same page.
In about two minutes.

# clone, install, init
$ git clone https://github.com/deeflect/dory
$ cd dory && uv sync --frozen
$ uv run dory init

# write, search, wake
$ uv run dory memory-write "Atlas is the active focus" --kind decision
$ uv run dory wake --profile coding

# durable always-on daemon, bind-mounted markdown corpus
$ cp .env.example .env
$ mkdir -p data/corpus
$ docker compose up -d --build

# bound to 127.0.0.1:8766 by default
$ curl http://127.0.0.1:8766/healthz
# native MCP — stdio or TCP
$ uv run dory-mcp --mode stdio

# or via the bundled installer that wires Claude / Codex / opencode
$ ./scripts/agent-policy/install.sh