refactor: doc/code consistency, OMBRE_PORT, webhook push, host-vault dashboard
Doc-code consistency (per BEHAVIOR_SPEC.md ground truth):
- INTERNALS.md, dehydrator.py, README.md, config.example.yaml: drop the
outdated "API 不可用自动降级到本地关键词提取" claims; align with the
"RuntimeError on API outage, no silent fallback" design decision
- INTERNALS.md & BEHAVIOR_SPEC.md narrative: activation_count=1 → 0 (B-04)
- server.py header: 5 MCP tools → 6 (add dream)
OMBRE_PORT (T5/T6):
- Replace hardcoded 8000 in FastMCP / uvicorn / keepalive URL
with int(os.environ.get("OMBRE_PORT", "8000"))
OMBRE_HOOK_URL / OMBRE_HOOK_SKIP webhook (T7):
- Implement _fire_webhook() helper: fire-and-forget POST with 5s timeout,
failures logged at WARNING but never propagated
- Wired into breath / dream MCP tools and /breath-hook + /dream-hook routes
- Push payload: {event, timestamp, payload:{...}}; documented in ENV_VARS.md
Dashboard host-vault input (T12, per user request):
- New /api/host-vault GET/POST endpoints persist OMBRE_HOST_VAULT_DIR
to project-root .env (idempotent upsert, preserves other entries,
rejects quotes/newlines)
- Settings tab gains a "宿主机记忆桶目录 (Docker)" panel with
load/save buttons and a clear "需要 docker compose down/up 生效" notice
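The T7 webhook contract described above (fire-and-forget POST, 5 s timeout, failures logged at WARNING and never propagated) can be sketched standalone. This is an illustrative sketch, not the shipped code: the server uses `httpx.AsyncClient`, while this version sticks to the stdlib `urllib` so the never-raise guarantee is easy to exercise; `fire_webhook`, `HOOK_URL`, and `HOOK_SKIP` are local stand-ins for the real `_fire_webhook` / `OMBRE_HOOK_URL` / `OMBRE_HOOK_SKIP`.

```python
import json
import logging
import os
import time
import urllib.request

logger = logging.getLogger("ombre_brain")

HOOK_URL = os.environ.get("OMBRE_HOOK_URL", "").strip()
HOOK_SKIP = os.environ.get("OMBRE_HOOK_SKIP", "").strip().lower() in ("1", "true", "yes", "on")


def fire_webhook(event: str, payload: dict) -> None:
    """POST {event, timestamp, payload} to HOOK_URL; never raise to the caller."""
    if HOOK_SKIP or not HOOK_URL:
        return
    body = json.dumps(
        {"event": event, "timestamp": time.time(), "payload": payload}
    ).encode("utf-8")
    req = urllib.request.Request(
        HOOK_URL, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        urllib.request.urlopen(req, timeout=5.0)
    except Exception as e:  # failures are logged at WARNING, never propagated
        logger.warning(f"Webhook push failed ({event} -> {HOOK_URL}): {e}")
```

Whatever happens on the wire, the caller only ever sees `None`, which is the property the MCP tools rely on.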
@@ -508,7 +508,7 @@ Claude 决策: hold / grow / 自动
 YAML frontmatter:
 id, name, tags, domain, valence, arousal,
 importance, type="dynamic", created, last_active,
-activation_count=1
+activation_count=0   # B-04: starts at 0; touch() bumps to 1+
 │
 ▼
 ┌─────── 记忆桶存活期 ──────────────────────────────────────┐
28
ENV_VARS.md
@@ -5,9 +5,10 @@
 | `OMBRE_API_KEY` | 是 | — | Gemini / OpenAI-compatible API Key,用于脱水(dehydration)和向量嵌入 |
 | `OMBRE_BASE_URL` | 否 | `https://generativelanguage.googleapis.com/v1beta/openai/` | API Base URL(可替换为代理或兼容接口) |
 | `OMBRE_TRANSPORT` | 否 | `stdio` | MCP 传输模式:`stdio` / `sse` / `streamable-http` |
+| `OMBRE_PORT` | 否 | `8000` | HTTP/SSE 模式监听端口(仅 `sse` / `streamable-http` 生效) |
 | `OMBRE_BUCKETS_DIR` | 否 | `./buckets` | 记忆桶文件存放目录(绑定 Docker Volume 时务必设置) |
-| `OMBRE_HOOK_URL` | 否 | — | Breath/Dream Webhook 回调地址,留空则不推送 |
-| `OMBRE_HOOK_SKIP` | 否 | `false` | 设为 `true` 跳过 Webhook 推送 |
+| `OMBRE_HOOK_URL` | 否 | — | Breath/Dream Webhook 推送地址(POST JSON),留空则不推送 |
+| `OMBRE_HOOK_SKIP` | 否 | `false` | 设为 `true`/`1`/`yes` 跳过 Webhook 推送(即使 `OMBRE_HOOK_URL` 已设置) |
 | `OMBRE_DASHBOARD_PASSWORD` | 否 | — | 预设 Dashboard 访问密码;设置后覆盖文件存储的密码,首次访问不弹设置向导 |
 | `OMBRE_DEHYDRATION_MODEL` | 否 | `deepseek-chat` | 脱水/打标/合并/拆分用的 LLM 模型名(覆盖 `dehydration.model`) |
 | `OMBRE_DEHYDRATION_BASE_URL` | 否 | `https://api.deepseek.com/v1` | 脱水模型的 API Base URL(覆盖 `dehydration.base_url`) |
@@ -19,3 +20,26 @@
 
 - `OMBRE_API_KEY` 也可在 `config.yaml` 的 `dehydration.api_key` / `embedding.api_key` 中设置,但**强烈建议**通过环境变量传入,避免密钥写入文件。
 - `OMBRE_DASHBOARD_PASSWORD` 设置后,Dashboard 的"修改密码"功能将被禁用(显示提示,建议直接修改环境变量)。未设置则密码存储在 `{buckets_dir}/.dashboard_auth.json`(SHA-256 + salt)。
+
+## Webhook 推送格式 (`OMBRE_HOOK_URL`)
+
+设置 `OMBRE_HOOK_URL` 后,Ombre Brain 会在以下事件发生时**异步**(fire-and-forget,5 秒超时)`POST` JSON 到该 URL:
+
+| 事件名 (`event`) | 触发时机 | `payload` 字段 |
+|------------------|----------|----------------|
+| `breath` | MCP 工具 `breath()` 返回时 | `mode` (`ok`/`empty`), `matches`, `chars` |
+| `dream` | MCP 工具 `dream()` 返回时 | `recent`, `chars` |
+| `breath_hook` | HTTP `GET /breath-hook` 命中(SessionStart 钩子) | `surfaced`, `chars` |
+| `dream_hook` | HTTP `GET /dream-hook` 命中 | `surfaced`, `chars` |
+
+请求体结构(JSON):
+
+```json
+{
+  "event": "breath",
+  "timestamp": 1730000000.123,
+  "payload": { "...": "..." }
+}
+```
+
+Webhook 推送失败仅在服务日志中以 WARNING 级别记录,**不会影响 MCP 工具的正常返回**。
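A minimal receiver for the payload documented above can be written with nothing but the Python stdlib. This is an illustrative sketch, not part of the project; the port (`9000`) and the `HookHandler` / `serve` names are arbitrary choices introduced here.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class HookHandler(BaseHTTPRequestHandler):
    """Accepts the {event, timestamp, payload} JSON documented above."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        evt = json.loads(self.rfile.read(length))
        # evt["event"] is one of: breath / dream / breath_hook / dream_hook
        print(evt["event"], evt["timestamp"], evt["payload"])
        self.send_response(204)  # the sender fires and forgets; any 2xx is fine
        self.end_headers()


def serve(port: int = 9000) -> None:
    """Blocking loop; point OMBRE_HOOK_URL at http://<host>:9000/ to receive events."""
    HTTPServer(("0.0.0.0", port), HookHandler).serve_forever()
```

Because pushes are fire-and-forget with a 5 s timeout, a slow or crashing receiver only produces WARNING lines on the Ombre Brain side.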
10
INTERNALS.md
@@ -65,7 +65,7 @@
 **自动化处理**
 - 存入时 LLM 自动分析 domain/valence/arousal/tags/name
 - 大段日记 LLM 拆分为 2~6 条独立记忆
-- 浮现时自动脱水压缩(LLM 压缩保语义,API 不可用降级到本地关键词提取)
+- 浮现时自动脱水压缩(LLM 压缩保语义,API 不可用时直接报错,无静默降级)
 - Wikilink `[[]]` 由 LLM 在内容中标记
 
 ---
@@ -168,7 +168,7 @@
 **迁移/批处理工具**:`migrate_to_domains.py`、`reclassify_domains.py`、`reclassify_api.py`、`backfill_embeddings.py`、`write_memory.py`、`check_buckets.py`、`import_memory.py`(历史对话导入引擎)
 
 **降级策略**
-- 脱水 API 不可用 → 本地关键词提取 + 句子评分
+- 脱水 API 不可用 → 直接抛 RuntimeError(设计决策,详见 BEHAVIOR_SPEC.md 三、降级行为表)
 - 向量搜索不可用 → 纯 fuzzy match
 - 逐条错误隔离(grow 中单条失败不影响其他)
 
@@ -216,7 +216,7 @@
 | `server.py` | MCP 服务器主入口,注册工具 + Dashboard API + 钩子端点 | `bucket_manager`, `dehydrator`, `decay_engine`, `embedding_engine`, `utils` | `test_tools.py` |
 | `bucket_manager.py` | 记忆桶 CRUD、多维索引搜索、wikilink 注入、激活更新 | `utils` | `server.py`, `check_buckets.py`, `backfill_embeddings.py` |
 | `decay_engine.py` | 衰减引擎:遗忘曲线计算、自动归档、自动结案 | 无(接收 `bucket_mgr` 实例) | `server.py` |
-| `dehydrator.py` | 数据脱水压缩 + 合并 + 自动打标(LLM API + 本地降级) | `utils` | `server.py` |
+| `dehydrator.py` | 数据脱水压缩 + 合并 + 自动打标(仅 LLM API,不可用时报 RuntimeError) | `utils` | `server.py` |
 | `embedding_engine.py` | 向量化引擎:Gemini embedding API + SQLite + 余弦搜索 | `utils` | `server.py`, `backfill_embeddings.py` |
 | `utils.py` | 配置加载、日志、路径安全、ID 生成、token 估算 | 无 | 所有模块 |
 | `write_memory.py` | 手动写入记忆 CLI(绕过 MCP) | 无(独立脚本) | 无 |
@@ -389,12 +389,12 @@
 
 ### 5.4 为什么有 dehydration(脱水)这一层?
 
-**决策**:存入前先用 LLM 压缩内容(保留信息密度,去除冗余表达),API 不可用时降级到本地关键词提取。
+**决策**:存入前先用 LLM 压缩内容(保留信息密度,去除冗余表达)。API 不可用时直接抛出 `RuntimeError`,不静默降级。
 
 **理由**:
 - MCP 上下文有 token 限制,原始对话冗长,需要压缩
 - LLM 压缩能保留语义和情感色彩,纯截断会丢信息
-- 降级到本地确保离线可用——关键词提取 + 句子排序 + 截断
+- 本地关键词提取质量不足以替代语义打标与合并,静默降级会产生错误分类记忆,比报错更危险。详见 BEHAVIOR_SPEC.md 三、降级行为表。
 
 **放弃方案**:只做截断。信息损失太大。
 
@@ -591,14 +591,14 @@ Dashboard:浏览器打开 `http://localhost:8000/dashboard`
 > **Free tier won't work**: Render free tier has **no persistent disk** — all memory data is lost on restart. It also sleeps on inactivity. **Starter plan ($7/mo) or above is required.**
 
 项目根目录已包含 `render.yaml`,点击按钮后:
-1. (可选)设置 `OMBRE_API_KEY`:任何 OpenAI 兼容 API 的 key,不填则自动降级为本地关键词提取
+1. 设置 `OMBRE_API_KEY`:任何 OpenAI 兼容 API 的 key(**必需**,未设置时 hold/grow 会报错、仅检索类工具可用)
 2. (可选)设置 `OMBRE_BASE_URL`:API 地址,支持任意 OpenAI 化地址,如 `https://api.deepseek.com/v1` / `http://123.1.1.1:7689/v1` / `http://your-ollama:11434/v1`
 3. Render 自动挂载持久化磁盘到 `/opt/render/project/src/buckets`
 4. Dashboard:`https://<你的服务名>.onrender.com/dashboard`
 5. 部署后 MCP URL:`https://<你的服务名>.onrender.com/mcp`
 
 `render.yaml` is included. After clicking the button:
-1. (Optional) `OMBRE_API_KEY`: any OpenAI-compatible key; omit to fall back to local keyword extraction
+1. `OMBRE_API_KEY`: any OpenAI-compatible key (**required** for hold/grow; without it those tools raise an error)
 2. (Optional) `OMBRE_BASE_URL`: any OpenAI-compatible endpoint, e.g. `https://api.deepseek.com/v1`, `http://123.1.1.1:7689/v1`, `http://your-ollama:11434/v1`
 3. Persistent disk auto-mounts at `/opt/render/project/src/buckets`
 4. Dashboard: `https://<your-service>.onrender.com/dashboard`
@@ -620,7 +620,7 @@ Dashboard:浏览器打开 `http://localhost:8000/dashboard`
    - Zeabur auto-detects the `Dockerfile` in root and builds via Docker
 
 2. **设置环境变量 / Set environment variables**(服务页面 → **Variables** 标签页)
-   - `OMBRE_API_KEY`(可选)— LLM API 密钥,不填则自动降级为本地关键词提取
+   - `OMBRE_API_KEY`(**必需**)— LLM API 密钥;未设置时 hold/grow/dream 会报错
    - `OMBRE_BASE_URL`(可选)— API 地址,如 `https://api.deepseek.com/v1`
 
 > ⚠️ **不需要**手动设置 `OMBRE_TRANSPORT` 和 `OMBRE_BUCKETS_DIR`,Dockerfile 里已经设好了默认值。Zeabur 对单阶段 Dockerfile 会自动注入控制台设置的环境变量。
 
@@ -28,9 +28,11 @@ log_level: "INFO"
 merge_threshold: 75
 
 # --- Dehydration API / 脱水压缩 API 配置 ---
-# Uses a cheap LLM for intelligent compression; auto-degrades to local
-# keyword extraction if API is unavailable
-# 用廉价 LLM 做智能压缩,API 不可用时自动降级到本地关键词提取
+# Uses a cheap LLM for intelligent compression. API is required; if the
+# configured key/endpoint is unavailable, hold/grow will raise an explicit
+# error instead of silently degrading (see BEHAVIOR_SPEC.md 三、降级行为表).
+# 用廉价 LLM 做智能压缩。API 为必需;如 key/endpoint 不可用,
+# hold/grow 会直接报错而非静默降级(详见 BEHAVIOR_SPEC.md 三、降级行为表)。
 dehydration:
   # Supports any OpenAI-compatible API: DeepSeek / Ollama / LM Studio / vLLM / Gemini etc.
   # 支持所有 OpenAI 兼容 API:DeepSeek / Ollama / LM Studio / vLLM / Gemini 等
@@ -813,6 +813,24 @@
 </div>
 </div>
 
+<div class="config-section">
+<h3>宿主机记忆桶目录 (Docker)</h3>
+<div style="font-size:12px;color:var(--text-dim);margin-bottom:10px;line-height:1.6;">
+设置 docker-compose 中 <code>${OMBRE_HOST_VAULT_DIR:-./buckets}:/data</code> 的宿主机路径。
+留空则使用项目内 <code>./buckets</code>。
+<span style="color:var(--warning);">⚠ 修改后需在宿主机执行 <code>docker compose down && docker compose up -d</code> 才会生效。</span>
+</div>
+<div class="config-row">
+<label>路径</label>
+<input type="text" id="settings-host-vault" placeholder="例如 /Users/you/Obsidian/Ombre Brain" style="flex:1;" />
+</div>
+<div style="display:flex;gap:8px;align-items:center;margin-top:6px;">
+<button class="btn-primary" onclick="saveHostVault()">保存到 .env</button>
+<button onclick="loadHostVault()" style="font-size:12px;padding:4px 12px;">重新加载</button>
+<span id="settings-host-vault-msg" style="font-size:12px;"></span>
+</div>
+</div>
+
 <div class="config-section">
 <h3>账号操作</h3>
 <button onclick="doLogout()" style="color:var(--negative);border-color:var(--negative);">退出登录</button>
@@ -946,6 +964,61 @@ async function loadSettingsStatus() {
   } catch(e) {
     el.textContent = '加载失败: ' + e;
   }
+  // Also refresh the host-vault input whenever the settings tab is loaded.
+  loadHostVault();
 }
 
+async function loadHostVault() {
+  const input = document.getElementById('settings-host-vault');
+  const msg = document.getElementById('settings-host-vault-msg');
+  if (!input) return;
+  msg.textContent = '';
+  msg.style.color = 'var(--text-dim)';
+  try {
+    const resp = await authFetch('/api/host-vault');
+    if (!resp) return;
+    const d = await resp.json();
+    input.value = d.value || '';
+    if (d.source === 'env') {
+      msg.textContent = '当前由进程环境变量提供(修改 .env 不会立即覆盖)';
+      msg.style.color = 'var(--warning)';
+    } else if (d.source === 'file') {
+      msg.textContent = '当前来自 ' + (d.env_file || '.env');
+    } else {
+      msg.textContent = '尚未设置(默认使用 ./buckets)';
+    }
+  } catch(e) {
+    msg.style.color = 'var(--negative)';
+    msg.textContent = '加载失败: ' + e;
+  }
+}
+
+async function saveHostVault() {
+  const input = document.getElementById('settings-host-vault');
+  const msg = document.getElementById('settings-host-vault-msg');
+  if (!input) return;
+  const value = input.value.trim();
+  msg.textContent = '保存中…';
+  msg.style.color = 'var(--text-dim)';
+  try {
+    const resp = await authFetch('/api/host-vault', {
+      method: 'POST',
+      headers: {'Content-Type': 'application/json'},
+      body: JSON.stringify({value})
+    });
+    if (!resp) return;
+    const d = await resp.json();
+    if (resp.ok) {
+      msg.style.color = 'var(--accent)';
+      msg.textContent = '已保存 → ' + (d.env_file || '.env') + '(需重启容器生效)';
+    } else {
+      msg.style.color = 'var(--negative)';
+      msg.textContent = d.error || '保存失败';
+    }
+  } catch(e) {
+    msg.style.color = 'var(--negative)';
+    msg.textContent = '保存失败: ' + e;
+  }
+}
+
 // authFetch: wraps fetch, shows auth overlay on 401
@@ -152,10 +152,13 @@ class Dehydrator:
     """
     Data dehydrator + content analyzer.
    Three capabilities: dehydration / merge / auto-tagging (domain + emotion).
-    Prefers API (better quality); auto-degrades to local (guaranteed availability).
+    API-only: every public method requires a working LLM API.
+    If the API is unavailable, methods raise RuntimeError so callers can
+    surface the failure to the user instead of silently producing low-quality results.
     数据脱水器 + 内容分析器。
     三大能力:脱水压缩 / 新旧合并 / 自动打标。
-    优先走 API,API 挂了自动降级到本地。
+    仅走 API:API 不可用时直接抛出 RuntimeError,调用方明确感知。
+    (根据 BEHAVIOR_SPEC.md 三、降级行为表决策:无本地降级)
     """
 
     def __init__(self, config: dict):
183
server.py
@@ -10,18 +10,20 @@
 # 核心职责:
 # - Initialize config, bucket manager, dehydrator, decay engine
 #   初始化配置、记忆桶管理器、脱水器、衰减引擎
-# - Expose 5 MCP tools:
-#   暴露 5 个 MCP 工具:
+# - Expose 6 MCP tools:
+#   暴露 6 个 MCP 工具:
 #   - breath — Surface unresolved memories or search by keyword
 #     浮现未解决记忆 或 按关键词检索
-#   - hold — Store a single memory
-#     存储单条记忆
+#   - hold — Store a single memory (or write a `feel` reflection)
+#     存储单条记忆(或写 feel 反思)
 #   - grow — Diary digest, auto-split into multiple buckets
 #     日记归档,自动拆分多桶
 #   - trace — Modify metadata / resolved / delete
 #     修改元数据 / resolved 标记 / 删除
 #   - pulse — System status + bucket listing
 #     系统状态 + 所有桶列表
+#   - dream — Surface recent dynamic buckets for self-digestion
+#     返回最近桶 供模型自省/写 feel
 #
 # Startup:
 # 启动方式:
@@ -61,6 +63,39 @@ config = load_config()
 setup_logging(config.get("log_level", "INFO"))
 logger = logging.getLogger("ombre_brain")
 
+# --- Runtime env vars (port + webhook) / 运行时环境变量 ---
+# OMBRE_PORT: HTTP/SSE 监听端口,默认 8000
+try:
+    OMBRE_PORT = int(os.environ.get("OMBRE_PORT", "8000") or "8000")
+except ValueError:
+    logger.warning("OMBRE_PORT 不是合法整数,回退到 8000")
+    OMBRE_PORT = 8000
+
+# OMBRE_HOOK_URL: 在 breath/dream 被调用后推送事件到该 URL(POST JSON)。
+# OMBRE_HOOK_SKIP: 设为 true/1/yes 跳过推送。
+# 详见 ENV_VARS.md。
+OMBRE_HOOK_URL = os.environ.get("OMBRE_HOOK_URL", "").strip()
+OMBRE_HOOK_SKIP = os.environ.get("OMBRE_HOOK_SKIP", "").strip().lower() in ("1", "true", "yes", "on")
+
+
+async def _fire_webhook(event: str, payload: dict) -> None:
+    """
+    Fire-and-forget POST to OMBRE_HOOK_URL with the given event payload.
+    Failures are logged at WARNING level only — never propagated to the caller.
+    """
+    if OMBRE_HOOK_SKIP or not OMBRE_HOOK_URL:
+        return
+    try:
+        body = {
+            "event": event,
+            "timestamp": time.time(),
+            "payload": payload,
+        }
+        async with httpx.AsyncClient(timeout=5.0) as client:
+            await client.post(OMBRE_HOOK_URL, json=body)
+    except Exception as e:
+        logger.warning(f"Webhook push failed ({event} → {OMBRE_HOOK_URL}): {e}")
+
 # --- Initialize core components / 初始化核心组件 ---
 embedding_engine = EmbeddingEngine(config)  # Embedding engine first (BucketManager depends on it)
 bucket_mgr = BucketManager(config, embedding_engine=embedding_engine)  # Bucket manager / 记忆桶管理器
@@ -74,7 +109,7 @@ import_engine = ImportEngine(config, bucket_mgr, dehydrator, embedding_engine)
 mcp = FastMCP(
     "Ombre Brain",
     host="0.0.0.0",
-    port=8000,
+    port=OMBRE_PORT,
 )
 
@@ -322,8 +357,11 @@ async def breath_hook(request):
             token_budget -= summary_tokens
 
         if not parts:
+            await _fire_webhook("breath_hook", {"surfaced": 0})
             return PlainTextResponse("")
-        return PlainTextResponse("[Ombre Brain - 记忆浮现]\n" + "\n---\n".join(parts))
+        body_text = "[Ombre Brain - 记忆浮现]\n" + "\n---\n".join(parts)
+        await _fire_webhook("breath_hook", {"surfaced": len(parts), "chars": len(body_text)})
+        return PlainTextResponse(body_text)
     except Exception as e:
         logger.warning(f"Breath hook failed: {e}")
         return PlainTextResponse("")
@@ -360,7 +398,9 @@ async def dream_hook(request):
                 f"{strip_wikilinks(b['content'][:200])}"
             )
 
-        return PlainTextResponse("[Ombre Brain - Dreaming]\n" + "\n---\n".join(parts))
+        body_text = "[Ombre Brain - Dreaming]\n" + "\n---\n".join(parts)
+        await _fire_webhook("dream_hook", {"surfaced": len(parts), "chars": len(body_text)})
+        return PlainTextResponse(body_text)
     except Exception as e:
         logger.warning(f"Dream hook failed: {e}")
         return PlainTextResponse("")
@@ -718,9 +758,12 @@ async def breath(
             logger.warning(f"Random surfacing failed / 随机浮现失败: {e}")
 
     if not results:
+        await _fire_webhook("breath", {"mode": "empty", "matches": 0})
         return "未找到相关记忆。"
 
-    return "\n---\n".join(results)
+    final_text = "\n---\n".join(results)
+    await _fire_webhook("breath", {"mode": "ok", "matches": len(matches), "chars": len(final_text)})
+    return final_text
 
 
 # =============================================================
@@ -1205,7 +1248,9 @@ async def dream() -> str:
     except Exception as e:
         logger.warning(f"Dream crystallization hint failed: {e}")
 
-    return header + "\n---\n".join(parts) + connection_hint + crystal_hint
+    final_text = header + "\n---\n".join(parts) + connection_hint + crystal_hint
+    await _fire_webhook("dream", {"recent": len(recent), "chars": len(final_text)})
+    return final_text
 
 
 # =============================================================
@@ -1549,6 +1594,122 @@ async def api_config_update(request):
     return JSONResponse({"updated": updated, "ok": True})
 
 
+# =============================================================
+# /api/host-vault — read/write the host-side OMBRE_HOST_VAULT_DIR
+# 用于在 Dashboard 设置 docker-compose 挂载的宿主机记忆桶目录。
+# 写入项目根目录的 .env 文件,需 docker compose down/up 才能生效。
+# =============================================================
+
+def _project_env_path() -> str:
+    return os.path.join(os.path.dirname(os.path.abspath(__file__)), ".env")
+
+
+def _read_env_var(name: str) -> str:
+    """Return current value of `name` from process env first, then .env file (best-effort)."""
+    val = os.environ.get(name, "").strip()
+    if val:
+        return val
+    env_path = _project_env_path()
+    if not os.path.exists(env_path):
+        return ""
+    try:
+        with open(env_path, "r", encoding="utf-8") as f:
+            for line in f:
+                line = line.strip()
+                if not line or line.startswith("#") or "=" not in line:
+                    continue
+                k, _, v = line.partition("=")
+                if k.strip() == name:
+                    return v.strip().strip('"').strip("'")
+    except Exception:
+        pass
+    return ""
+
+
+def _write_env_var(name: str, value: str) -> None:
+    """
+    Idempotent upsert of `NAME=value` in project .env. Creates the file if missing.
+    Preserves other entries verbatim. Quotes values containing spaces.
+    """
+    env_path = _project_env_path()
+    quoted = f'"{value}"' if value and (" " in value or "#" in value) else value
+    new_line = f"{name}={quoted}\n"
+
+    lines: list[str] = []
+    if os.path.exists(env_path):
+        with open(env_path, "r", encoding="utf-8") as f:
+            lines = f.readlines()
+
+    replaced = False
+    for i, raw in enumerate(lines):
+        stripped = raw.strip()
+        if not stripped or stripped.startswith("#") or "=" not in stripped:
+            continue
+        k, _, _v = stripped.partition("=")
+        if k.strip() == name:
+            lines[i] = new_line
+            replaced = True
+            break
+    if not replaced:
+        if lines and not lines[-1].endswith("\n"):
+            lines[-1] += "\n"
+        lines.append(new_line)
+
+    with open(env_path, "w", encoding="utf-8") as f:
+        f.writelines(lines)
+
+
+@mcp.custom_route("/api/host-vault", methods=["GET"])
+async def api_host_vault_get(request):
+    """Read the current OMBRE_HOST_VAULT_DIR (process env > project .env)."""
+    from starlette.responses import JSONResponse
+    err = _require_auth(request)
+    if err: return err
+    value = _read_env_var("OMBRE_HOST_VAULT_DIR")
+    return JSONResponse({
+        "value": value,
+        "source": "env" if os.environ.get("OMBRE_HOST_VAULT_DIR", "").strip() else ("file" if value else ""),
+        "env_file": _project_env_path(),
+    })
+
+
+@mcp.custom_route("/api/host-vault", methods=["POST"])
+async def api_host_vault_set(request):
+    """
+    Persist OMBRE_HOST_VAULT_DIR to the project .env file.
+    Body: {"value": "/path/to/vault"} (empty string clears the entry)
+    Note: container restart is required for docker-compose to pick up the new mount.
+    """
+    from starlette.responses import JSONResponse
+    err = _require_auth(request)
+    if err: return err
+    try:
+        body = await request.json()
+    except Exception:
+        return JSONResponse({"error": "invalid JSON"}, status_code=400)
+
+    raw = body.get("value", "")
+    if not isinstance(raw, str):
+        return JSONResponse({"error": "value must be a string"}, status_code=400)
+    value = raw.strip()
+
+    # Reject characters that would break .env / shell parsing
+    if "\n" in value or "\r" in value or '"' in value or "'" in value:
+        return JSONResponse({"error": "value must not contain quotes or newlines"}, status_code=400)
+
+    try:
+        _write_env_var("OMBRE_HOST_VAULT_DIR", value)
+    except Exception as e:
+        return JSONResponse({"error": f"failed to write .env: {e}"}, status_code=500)
+
+    return JSONResponse({
+        "ok": True,
+        "value": value,
+        "env_file": _project_env_path(),
+        "note": "已写入 .env;需在宿主机执行 `docker compose down && docker compose up -d` 让新挂载生效。",
+    })
+
+
 # =============================================================
 # Import API — conversation history import
 # 导入 API — 对话历史导入
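The idempotent upsert performed by `_write_env_var` can be tried in isolation. This standalone sketch (the name `upsert_env_var` is introduced here, and it takes an explicit path instead of the project root) follows the same rules: replace an existing `NAME=` line in place, append otherwise, preserve every other entry verbatim, and quote values containing spaces or `#`.

```python
import os


def upsert_env_var(env_path: str, name: str, value: str) -> None:
    """Idempotent upsert of NAME=value into a .env file; creates the file if missing."""
    quoted = f'"{value}"' if value and (" " in value or "#" in value) else value
    new_line = f"{name}={quoted}\n"

    lines: list[str] = []
    if os.path.exists(env_path):
        with open(env_path, "r", encoding="utf-8") as f:
            lines = f.readlines()

    replaced = False
    for i, raw in enumerate(lines):
        stripped = raw.strip()
        # skip blanks, comments, and malformed lines — they are preserved verbatim
        if not stripped or stripped.startswith("#") or "=" not in stripped:
            continue
        if stripped.partition("=")[0].strip() == name:
            lines[i] = new_line
            replaced = True
            break
    if not replaced:
        if lines and not lines[-1].endswith("\n"):
            lines[-1] += "\n"
        lines.append(new_line)

    with open(env_path, "w", encoding="utf-8") as f:
        f.writelines(lines)
```

Running the upsert twice for the same key leaves exactly one `OMBRE_HOST_VAULT_DIR=` line, which is what lets the Dashboard save button be pressed repeatedly without growing the file.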
@@ -1755,7 +1916,7 @@ if __name__ == "__main__":
         async with httpx.AsyncClient() as client:
             while True:
                 try:
-                    await client.get("http://localhost:8000/health", timeout=5)
+                    await client.get(f"http://localhost:{OMBRE_PORT}/health", timeout=5)
                     logger.debug("Keepalive ping OK / 保活 ping 成功")
                 except Exception as e:
                     logger.warning(f"Keepalive ping failed / 保活 ping 失败: {e}")
@@ -1782,6 +1943,6 @@ if __name__ == "__main__":
             expose_headers=["*"],
         )
         logger.info("CORS middleware enabled for remote transport / 已启用 CORS 中间件")
-        uvicorn.run(_app, host="0.0.0.0", port=8000)
+        uvicorn.run(_app, host="0.0.0.0", port=OMBRE_PORT)
     else:
         mcp.run(transport=transport)