# Ombre Brain
A long-term emotional memory system for Claude. It tags memories with Russell valence/arousal coordinates, stores them as Obsidian-compatible Markdown, connects via MCP, and applies a forgetting curve.
> **⚠️ Repo temporarily moved**
> While GitHub access is restricted, the code is temporarily hosted on Gitea:
> **https://git.p0lar1s.uk/P0lar1s/Ombre_Brain**
> Substitute that URL into the `git clone` commands below.
---
## Quick Start (Docker, recommended)
> This is the simplest route — no Python install, no command-line knowledge needed. Just follow along.
**Prerequisite:** [Docker Desktop](https://www.docker.com/products/docker-desktop/) is installed and running.
**Step 1: Clone the code**
```bash
git clone https://git.p0lar1s.uk/P0lar1s/Ombre_Brain.git
cd Ombre_Brain
```
**Step 2: Create a `.env` file**
In the project directory, create a file named `.env` (note the leading dot) with this content:
```
OMBRE_API_KEY=your-deepseek-or-other-api-key
```
It also works without an API key — dehydration falls back to a local mode with somewhat lower quality. In that case, write:
```
OMBRE_API_KEY=
```
**Step 3: Configure `docker-compose.yml` (point it at your Obsidian Vault)**
Open `docker-compose.yml` in a text editor and find this line:
```yaml
- ./buckets:/data
```
Change it to the path of the `Ombre Brain` folder inside your Obsidian Vault, for example:
```yaml
- /Users/your-username/Documents/Obsidian Vault/Ombre Brain:/data
```
> Not sure of the path? In Obsidian, right-click the folder → "Reveal in Finder", then copy the path from the address bar.
> You can also skip mounting Obsidian entirely: leave `./buckets:/data` unchanged and the data will live in the project's `buckets/` folder.
**Step 4: Start it**
```bash
docker compose up -d
```
Wait for it to finish; once you see `Started`, you're done.
**Verify it's running:**
```bash
docker logs ombre-brain
```
If you see `Uvicorn running on http://0.0.0.0:8000`, it's working.
---
**Connecting from Claude.ai (remote)**
Requires an additional Cloudflare Tunnel to expose the service to the public internet. See the "Connect to Claude.ai (remote)" section below.
**Connecting from Claude Desktop (local)**
No Docker needed — run it locally with Python. See the "Setup" section below.
---
[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/P0lar1zzZ/Ombre-Brain)
---
## What is this
Claude has no cross-conversation memory. Everything from a previous chat vanishes once it ends.
Ombre Brain gives it persistent memory — not cold key-value storage, but a system with emotional coordinates, natural decay, and forgetting/surfacing mechanics that loosely mimic how human memory works.
Key features:
- **Emotional tagging**: Each memory is tagged with two continuous dimensions from Russell's circumplex model: valence and arousal. Not discrete labels like "happy/sad".
- **Natural forgetting**: A modified Ebbinghaus forgetting curve. Inactive memories naturally decay and get archived; high-arousal memories decay more slowly.
- **Weight-pool surfacing**: Memories aren't just passively retrieved — they actively surface. Unresolved, emotionally intense memories carry higher weight and get pushed at conversation start.
- **Obsidian-native**: Each memory bucket is a Markdown file with YAML frontmatter for metadata. Browse, edit, and search it directly in Obsidian; `[[wikilinks]]` are auto-injected.
- **API degradation**: Dehydration and auto-tagging prefer a cheap LLM API (DeepSeek etc.). When the API is unavailable, they degrade to local keyword analysis — always functional.
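To make "Obsidian-native" concrete, a memory bucket might look something like the sketch below. The frontmatter field names are illustrative guesses assembled from the concepts in this README (valence, arousal, importance, activation count, resolved state), not the project's actual schema:

```markdown
---
valence: -0.4          # negative tone (assumed field name)
arousal: 0.8           # intense — decays more slowly
importance: 7
activation_count: 3
resolved: false
---
# Launch deadline stress

Talked through the launch slipping by two weeks. Still unresolved — see [[Project Planning]].
```

Because it's plain Markdown plus frontmatter, Obsidian's graph view, search, and backlinks all work on buckets for free.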
## Design boundaries
Claude's official memory feature already handles the identity layer — who you are, what you prefer, what your relationship is. That layer belongs there; Ombre Brain isn't trying to duplicate it.
Ombre Brain's boundary is *what happened in time*, not *who you are*. It holds what you talked about, what you went through, and which things are still hanging unresolved. The two layers together are what make it complete.
Each new conversation starts fresh — but Claude can reach back through Ombre Brain and find everything that happened between you. Not a rebuild. A continuation.
## Architecture
```
Claude ←→ MCP Protocol ←→ server.py
                  │
  ┌───────────────┼───────────────┐
  │               │               │
bucket_manager  dehydrator      decay_engine
(CRUD + search) (compress+tag)  (forgetting curve)
  │               │               │
  └───────────────┼───────────────┘
                  │
    Obsidian Vault (Markdown files)
```
5 MCP tools:
| Tool | Purpose |
|---|---|
| `breath` | Surface or search memories. No arguments = push unresolved memories; with arguments = keyword + emotion search |
| `hold` | Store a single memory, with auto-tagging and merging into similar buckets |
| `grow` | Diary digest; auto-splits long content into multiple memory buckets |
| `trace` | Modify metadata, mark resolved, delete |
| `pulse` | System status + list of all memory buckets |
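Over the HTTP transport, these tools are invoked with the standard MCP JSON-RPC `tools/call` request. A hypothetical call to `breath` might look like this — the argument names (`query`, `valence`) are assumptions for illustration; check the actual schema via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "breath",
    "arguments": { "query": "launch deadline", "valence": -0.5 }
  }
}
```

Calling `breath` with an empty `arguments` object corresponds to the no-argument mode: pushing the highest-weight unresolved memories.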
## Setup
### Requirements
- Python 3.11+
- An Obsidian vault (optional — without one, a local `buckets/` directory is created in the project)
### Steps
```bash
git clone https://github.com/P0lar1zzZ/Ombre-Brain.git
cd Ombre-Brain
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install -r requirements.txt
```
Copy the config file and edit as needed:
```bash
cp config.example.yaml config.yaml
```
If you want API-powered dehydration and auto-tagging (recommended, much better quality), set the environment variable:
```bash
export OMBRE_API_KEY="your-api-key"
```
Any OpenAI-compatible API works. Just change `base_url` and `model` in `config.yaml`.
### Connect to Claude Desktop
Add to your Claude Desktop config (macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`):
```json
{
"mcpServers": {
"ombre-brain": {
"command": "python",
"args": ["/path/to/Ombre-Brain/server.py"],
"env": {
"OMBRE_API_KEY": "your-api-key"
}
}
}
}
```
### Connect to Claude.ai (remote)
Requires HTTP transport plus a tunnel. Docker setup:
```bash
echo "OMBRE_API_KEY=your-api-key" > .env
docker compose up -d
```
The `docker-compose.yml` includes a Cloudflare Tunnel service. You'll need to put your own credentials and route config under `~/.cloudflared/`.
### Point to Obsidian
Set `buckets_dir` in `config.yaml`:
```yaml
buckets_dir: "/path/to/your/Obsidian Vault/Ombre Brain"
```
If not set, it defaults to `buckets/` in the project directory.
## Configuration
All parameters live in `config.yaml` (copy from `config.example.yaml`). The key ones:
| Parameter | Description | Default |
|---|---|---|
| `transport` | `stdio` (local) / `streamable-http` (remote) | `stdio` |
| `buckets_dir` | Bucket storage path | `./buckets/` |
| `dehydration.model` | LLM model for dehydration | `deepseek-chat` |
| `dehydration.base_url` | API endpoint | `https://api.deepseek.com/v1` |
| `decay.lambda` | Decay rate; higher = faster forgetting | `0.05` |
| `decay.threshold` | Archive threshold | `0.3` |
| `merge_threshold` | Merge similarity threshold (0-100) | `75` |
Sensitive config goes in environment variables:
- `OMBRE_API_KEY` — LLM API key
- `OMBRE_TRANSPORT` — overrides the transport
- `OMBRE_BUCKETS_DIR` — overrides the storage path
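Putting the table together, a minimal `config.yaml` might look like the sketch below. The values are the documented defaults; the nesting is assumed from the dotted parameter names:

```yaml
transport: stdio                # stdio (local) or streamable-http (remote)
buckets_dir: "./buckets/"       # or a path inside your Obsidian Vault
dehydration:
  model: deepseek-chat
  base_url: https://api.deepseek.com/v1
decay:
  lambda: 0.05                  # higher = faster forgetting
  threshold: 0.3                # archive below this score
merge_threshold: 75             # bucket-merge similarity, 0-100
```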
## Decay Formula
$$\text{final\_score} = \text{time\_weight} \times \text{base\_score}$$
$$\text{base\_score} = \text{importance} \times \text{activation\_count}^{0.3} \times e^{-\lambda \times \text{days}} \times (\text{base} + \text{arousal} \times \text{boost})$$
Time weight (a multiplier, highest priority):
| Days since last activation | Weight |
|---|---|
| 0–1 days | 1.0 |
| Day 2 | 0.9 |
| ~10% drop per day after that | `max(0.3, 0.9 × e^{-0.2197 × (days-2)})` |
| After day 7 | Stabilizes at ≈ 0.3 (never reaches zero) |
- `importance`: 1-10, memory importance
- `activation_count`: retrieval count; the more often a memory is recalled, the slower it decays
- `days`: days since last activation
- `arousal`: arousal level; intense memories are harder to forget
- Resolved memories drop to 5% weight and sink until a keyword triggers them again
- `pinned=true` buckets never decay, never merge, and have importance locked at 10
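A runnable sketch of the scoring above. The `base` and `boost` constants are not specified in this README, so 1.0 is assumed for both; pinning and archiving are omitted:

```python
import math

def time_weight(days: float) -> float:
    """Multiplier from the table: 1.0 through day 1, 0.9 on day 2,
    then ~10% decay per day with a floor of 0.3."""
    if days <= 1:
        return 1.0
    if days <= 2:
        return 0.9
    return max(0.3, 0.9 * math.exp(-0.2197 * (days - 2)))

def base_score(importance: float, activation_count: int, days: float,
               arousal: float, lam: float = 0.05,
               base: float = 1.0, boost: float = 1.0) -> float:
    # `base` and `boost` are unnamed constants in this README; 1.0 is a guess.
    return (importance
            * activation_count ** 0.3
            * math.exp(-lam * days)
            * (base + arousal * boost))

def final_score(importance: float, activation_count: int, days: float,
                arousal: float, resolved: bool = False) -> float:
    score = time_weight(days) * base_score(importance, activation_count,
                                           days, arousal)
    # Resolved memories sink to 5% weight until re-triggered.
    return score * 0.05 if resolved else score
```

Under these assumed constants, a memory with arousal 0.8 gets its base score multiplied by 1.8, which is why intense memories outlive calm ones of equal importance.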
## Usage Guide for Claude
`CLAUDE_PROMPT.md` is a usage guide written for Claude. Put it in your system prompt or custom instructions.
## Utility Scripts
| Script | Purpose |
|---|---|
| `write_memory.py` | Manually write memories, bypassing MCP |
| `migrate_to_domains.py` | Migrate flat files into domain subdirectories |
| `reclassify_domains.py` | Reclassify by keywords |
| `reclassify_api.py` | Re-tag uncategorized buckets via API |
| `test_smoke.py` | Smoke test |
## Deploy
### Render
[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/P0lar1zzZ/Ombre-Brain)
> ⚠️ **Free tier won't work**: Render's free tier has **no persistent disk** — all memory data is lost on restart — and the service sleeps on inactivity. **The Starter plan ($7/mo) or above is required.**
`render.yaml` is included in the project root. After clicking the button:
1. (Optional) Set `OMBRE_API_KEY`: any OpenAI-compatible API key; omit it to fall back to local keyword extraction
2. (Optional) Set `OMBRE_BASE_URL`: any OpenAI-compatible endpoint, e.g. `https://api.deepseek.com/v1`, `http://123.1.1.1:7689/v1`, `http://your-ollama:11434/v1`
3. Render auto-mounts a persistent disk at `/opt/render/project/src/buckets`
4. MCP URL after deploy: `https://<your-service>.onrender.com/mcp`
### Zeabur
> 💡 **Zeabur's pricing model**: Zeabur is "bring a VPS + platform management". You buy a server first (cheapest: Tencent Cloud Singapore ~$2/mo, Volcano Engine ~$3/mo); volumes mount directly on that server, so **data is always persistent, with no loss on restart**. You also need Zeabur's Developer plan ($5/mo) for management. Total: roughly $7-8/mo minimum.
**Steps**
1. **Create a project**
   - Go to [zeabur.com](https://zeabur.com) → buy a server → **New Project** → **Deploy from GitHub**
   - Fork this repo to your own GitHub account first, then select `your-username/Ombre-Brain` in Zeabur
   - Zeabur auto-detects the `Dockerfile` in the repo root and builds via Docker
2. **Set environment variables** (service page → **Variables** tab)
   - `OMBRE_API_KEY` (optional) — LLM API key; omit it to fall back to local keyword extraction
   - `OMBRE_BASE_URL` (optional) — API endpoint, e.g. `https://api.deepseek.com/v1`
   > ⚠️ You do **not** need to set `OMBRE_TRANSPORT` or `OMBRE_BUCKETS_DIR` — defaults are baked into the Dockerfile. Zeabur auto-injects dashboard env vars for single-stage Dockerfiles.
3. **Mount a persistent volume** (service page → **Volumes** tab)
   - Volume ID: `ombre-buckets` (or any name)
   - Mount path: **`/app/buckets`**
   - ⚠️ Without this, memory data is lost on every redeploy
4. **Configure the port** (service page → **Networking** tab)
   - Port Name: `web` (or any name)
   - Port: **`8000`**
   - Port Type: **`HTTP`**
   - Then click **Generate Domain** to get an `xxx.zeabur.app` domain
5. **Verify**
   - Visit `https://<your-domain>.zeabur.app/health` — it should return JSON
   - Final MCP URL: `https://<your-domain>.zeabur.app/mcp`
**Troubleshooting**
| Symptom | Cause | Fix |
|---|---|---|
| Domain unreachable | Port not configured | Add port 8000 (HTTP) in the Networking tab |
| Domain unreachable | `OMBRE_TRANSPORT` unset; the service started in stdio mode and listens on no port | Confirm `OMBRE_TRANSPORT=streamable-http` in the Variables tab, then redeploy |
| Build failed | Dockerfile not detected | Confirm `Dockerfile` (case-sensitive) exists in the repo root |
| Service exits immediately after start | `OMBRE_TRANSPORT` overridden to `stdio` | Remove any stray `OMBRE_TRANSPORT=stdio` in Variables |
| Data lost on restart | Volume not mounted | Mount a volume at `/app/buckets` in the Volumes tab |
### Connecting via Cloudflare Tunnel or ngrok
> Since v1.1, server.py automatically enables CORS middleware in HTTP mode — no extra config needed.
When connecting via a tunnel, make sure:
1. **The server runs with HTTP transport**
```bash
OMBRE_TRANSPORT=streamable-http python server.py
```
or with Docker:
```bash
docker compose up -d
```
2. **Add the MCP server in the Claude.ai web app**
- URL format: `https://<tunnel-subdomain>.trycloudflare.com/mcp`
- or with ngrok: `https://<xxxx>.ngrok-free.app/mcp`
- Verify first: `https://<your-tunnel>/health` should return `{"status":"ok",...}`
3. **Known limitations**
- The free Cloudflare Tunnel has an idle timeout (~10 min); the built-in keepalive pings mitigate it but can't fully prevent it
- The ngrok free tier has request rate limits
- If the connection still fails, verify the tunnel is running and the server started in `streamable-http` mode
| Symptom | Cause | Fix |
|---|---|---|
| Claude.ai web can't connect to the tunnel URL | Server running in stdio mode | Set `OMBRE_TRANSPORT=streamable-http` and restart |
| Claude.ai web can't connect to the tunnel URL | Older server.py missing CORS headers | Pull the latest code — CORS is now built in |
| `/health` returns 200 but MCP fails | Wrong path | The MCP URL must end in `/mcp`, not `/` |
| Tunnel disconnects intermittently | Cloudflare Tunnel idle timeout | Keepalive pings are built in; if it still drops, shorten the tunnel's timeout config |
---
### Session Start Hook (auto `breath`)
If you use Claude Code, you can activate the auto-surfacing hook inside the project after deployment.
`.claude/settings.json` ships with a `SessionStart` hook that auto-calls `breath` on every new or resumed session, pushing the highest-weight unresolved memories into context.
**Only active in remote HTTP mode** (`OMBRE_TRANSPORT=streamable-http`). In local stdio mode the hook exits quietly and doesn't interfere with normal use.
Use the `OMBRE_HOOK_URL` environment variable to point at your server (default `http://localhost:8000`), or set `OMBRE_HOOK_SKIP=1` to disable it temporarily.
## License
MIT