feat(dashboard): unified workspace hub — cookie auth, 9-state projects, planning chat
Merges workspace.html + ralph.html into a single unified project hub with:

- Cookie-based auth (DASHBOARD_TOKEN, HttpOnly, SameSite=Strict)
- 9-state project badge system (running-ralph/manual, planning, approved, pending, blocked, failed, complete, idle) with BUTTONS_FOR_STATE matrix
- SSE realtime + polling fallback, version-based optimistic concurrency (If-Match)
- Planning chat modal (phase stepper, markdown bubbles, 50s+ wait state, auto-resume)
- Propose modal (Variant B: inline Plan-with-Echo checkbox)
- 5-type toast taxonomy (success/info/warning/busy/error, 3px colored left-bar)
- Inter font self-hosted + shared tokens.css design system + DESIGN.md
- src/jsonlock.py (flock helper, sidecar .lock for stable inode)
- src/approved_tasks_cli.py (shell-safe wrapper for cron/ralph.sh)
- 55 new tests (T#1–T#30) + real jsonlock bug fix caught by T#16/T#28
- No emoji anywhere (enforced by test_dashboard_no_emoji.py)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
CLAUDE.md (30 lines changed)
@@ -137,7 +137,19 @@ source .venv/bin/activate && pip install -r requirements.txt
**Memory** (`src/memory_search.py`): Ollama all-minilm embeddings (384-dim) + cosine similarity over SQLite. Lives at `memory/` in this repo; single source of truth. *Historical note:* it used to be a symlink to the legacy Clawdbot repo; consolidated into echo-core during the OpenClaw migration (2026-04).

-**Dashboard** (`dashboard/`): Echo Task Board, an HTTP API + static UI served by `dashboard/api.py` on port 8088, usually behind a reverse proxy at `/echo/`. Endpoint logic is split into `dashboard/handlers/*.py` mixins; paths are centralized in `dashboard/constants.py`. systemd user unit template at `dashboard/echo-taskboard.service`.
+**Dashboard** (`dashboard/`): Echo Task Board, an HTTP API + static UI served by `dashboard/api.py` on port 8088, usually behind a reverse proxy at `/echo/`. Endpoint logic is split into `dashboard/handlers/*.py` mixins; paths are centralized in `dashboard/constants.py`. systemd user unit template at `dashboard/echo-taskboard.service`. `workspace.html` is the unified project hub (formerly ralph.html + workspace.html); `/echo/ralph.html` → 302 redirect to `/echo/workspace.html`. Authentication via httpOnly cookie `dashboard=<token>`; `DASHBOARD_TOKEN` set in `dashboard/.env`.

## Dashboard: Architectural Notes

**Cookie auth:** the dashboard uses an httpOnly cookie `dashboard=...`; SameSite=Strict; Path=/echo/. EventSource SSE sends the cookie automatically. `DASHBOARD_TOKEN` comes from `dashboard/.env`; set it once and restart the service. To rotate: change the value in .env and restart.
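A minimal sketch of the Set-Cookie value this note describes (the helper name is illustrative, not the actual handler code; only the attribute set comes from the note):

```python
def auth_cookie(token: str) -> str:
    """Build the dashboard auth cookie described above.

    HttpOnly: JS cannot read it; SameSite=Strict: never sent cross-site;
    Path=/echo/: scoped to the dashboard mount point.
    """
    return f"dashboard={token}; HttpOnly; SameSite=Strict; Path=/echo/"
```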
**jsonlock helper (`src/jsonlock.py`):** use `read_locked(path)` / `write_locked(path, mutator)` for any write to `approved-tasks.json` or `sessions/*.json`. Locks a sidecar `<path>.lock` (stable inode even after os.replace). Canonical lock ordering: alphabetical by filename. Re-entrant (threading.local refcount).
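The pattern can be sketched like this; a minimal stand-in for `src/jsonlock.py` without the refcount and ordering logic (the real module is the one to use):

```python
import fcntl
import json
import os

def write_locked(path: str, mutator) -> dict:
    """Minimal sketch of the jsonlock pattern: flock a sidecar <path>.lock
    (its inode survives os.replace on the data file), mutate the parsed
    JSON in place, then write atomically via tmp-file + os.replace."""
    with open(path + ".lock", "w") as lock_fh:
        fcntl.flock(lock_fh, fcntl.LOCK_EX)  # blocks until exclusive
        try:
            with open(path) as fh:
                data = json.load(fh)
        except FileNotFoundError:
            data = {}
        mutator(data)                        # caller edits the dict in place
        tmp = path + ".tmp"
        with open(tmp, "w") as fh:
            json.dump(data, fh, indent=2)
        os.replace(tmp, path)                # atomic rename on POSIX
        return data                          # flock released when lock_fh closes
```

Locking the sidecar rather than the data file is the key point: `os.replace` swaps the data file's inode, so a lock held on the old inode would no longer exclude the next writer.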
**Slug convention:** project slugs validate against the regex `^[a-z0-9][a-z0-9\-_]{1,38}[a-z0-9]$`; hyphens AND underscores are allowed. Validation is centralized in `dashboard/handlers/_validators.py`.
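The regex in practice (the function name here is illustrative, not necessarily the helper's actual name in `_validators.py`):

```python
import re

# Slug rule from the convention above: lowercase alphanumeric edges,
# hyphens/underscores allowed in the middle, 3-40 chars total.
SLUG_RE = re.compile(r"^[a-z0-9][a-z0-9\-_]{1,38}[a-z0-9]$")

def is_valid_slug(slug: str) -> bool:
    return bool(SLUG_RE.fullmatch(slug))
```

Note the minimum length is 3: one edge character, at least one middle character, and the other edge.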
**Proxy timeout:** for nginx/caddy, set `proxy_read_timeout >= 60s` and `proxy_buffering off` for `/echo/api/projects/stream` and `/echo/api/projects/<slug>/plan/*` (SSE + planning produce long-lived responses).
**Planning fragmentation (known limit):** planning sessions started from Discord/Telegram do not merge with those started from the dashboard. The dashboard shows the most recent session per slug regardless of adapter. P3 follow-up.
## Ralph: Autonomous Project Execution

@@ -151,7 +163,7 @@ Marius → /a <slug> (Discord/Telegram/WhatsApp → router.py → statu
23:00 night-execute → reads approved, clones the repo if missing, generates the PRD from final-plan.md,
                      launches ralph.sh; updates approved-tasks.json (running, pid: PID)
08:30 morning-report → reads approved-tasks.json + prd.json per project, reports stories done/total
-Live dashboard → /echo/ralph.html (polling 5s): cards per project with status, iter, ETA, log, stop
+Live dashboard → /echo/workspace.html: cards per project with status, iter, ETA, log, stop; realtime SSE
```

**Two approval paths**:

@@ -203,8 +215,9 @@ Pe **WhatsApp**: text-only — meniu redirect la Discord/Telegram. **Text-keywor
| `~/workspace/<name>/scripts/ralph/prd.json` | Per-project PRD with the extended schema |
| `~/workspace/<name>/scripts/ralph/logs/` | ralph.sh logs per run |
| `dashboard/handlers/ralph.py` | Endpoints `/api/ralph/status`, `/<slug>/log`, `/<slug>/prd`, `/<slug>/stop`, `/<slug>/rollback`, `/usage[?days=N]`, `/stream` (SSE) |
-| `dashboard/ralph.html` | Live card UI, status badges, ETA, log/prd/stop/rollback buttons. Realtime via EventSource with fallback to 5s polling; badge 🟢 Live / ⏱ Polling |
+| `dashboard/handlers/projects.py` | Unified project endpoints: `/api/projects`, `/propose`, `/approve`, `/unapprove`, `/cancel`, `/<slug>/plan/*`, `/stream` (SSE), `/signature` |
-| `dashboard/.env` | `GITEA_TOKEN` for HTTPS clones at `gitea.romfast.ro` |
+| `dashboard/workspace.html` | Unified project hub: status/iter/ETA cards, log, prd, stop/rollback. Realtime SSE with 5s polling fallback. Replaces ralph.html (which 302-redirects here) |
+| `dashboard/.env` | `GITEA_TOKEN` for HTTPS clones at `gitea.romfast.ro`; `DASHBOARD_TOKEN` for cookie auth |
**Status flow:** `pending` → (`planning` →) `approved` → `running` → `complete` / `failed` / `stopped` / `blocked` (DAG)
**Story status (in prd.json):** `passes:false` + `retries:N` → `passes:true` OR `failed:rate_limited|max_retries`

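As a sketch, the status flow reads as this transition map (the edge set is inferred from the arrow diagram; the authoritative logic lives in the dashboard handlers):

```python
# Hypothetical transition map derived from the status-flow line above.
TRANSITIONS = {
    "pending":  {"planning", "approved"},   # the planning step is optional
    "planning": {"approved"},
    "approved": {"running"},
    "running":  {"complete", "failed", "stopped", "blocked"},
}

def can_transition(src: str, dst: str) -> bool:
    """True if the DAG above allows moving from src to dst."""
    return dst in TRANSITIONS.get(src, set())
```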
@@ -234,9 +247,16 @@ Import-uri absolute via `sys.path.insert(0, PROJECT_ROOT)`: `from src.config imp
| `personality/*.md` | System prompt: who you are |
| `memory/` | Knowledge base: embeddings + SQLite (in the repo, not a symlink) |
| `dashboard/api.py` | Task Board HTTP API (port 8088) |
-| `dashboard/handlers/` | Endpoint mixins (git, cron, habits, eco, files, pdf, workspace, youtube) |
+| `dashboard/handlers/` | Endpoint mixins (git, cron, habits, eco, files, pdf, workspace, youtube, projects, ralph, auth) |
+| `dashboard/handlers/projects.py` | Unified project endpoints: `/api/projects`, `/propose`, `/approve`, `/unapprove`, `/cancel`, `/<slug>/plan/*`, `/stream` (SSE) |
+| `dashboard/handlers/auth.py` | Login/logout with the httpOnly cookie `dashboard=<token>`; `DASHBOARD_TOKEN` from `.env` |
+| `dashboard/handlers/_validators.py` | Shared slug/description validators. Slug regex: `^[a-z0-9][a-z0-9\-_]{1,38}[a-z0-9]$` (allows hyphens AND underscores) |
+| `dashboard/static/tokens.css` | CSS design tokens (`--color-*`, `--space-*`, etc.): shared variables for all pages |
+| `dashboard/DESIGN.md` | Design-system source of truth: tokens, components, the no-emoji rule |
| `dashboard/constants.py` | Centralized paths + Gitea config for the dashboard |
| `dashboard/echo-taskboard.service` | systemd user unit template |
+| `src/jsonlock.py` | Flock helper for concurrent writes: `read_locked(path)`, `write_locked(path, mutator)`, `LockTimeoutError`. Sidecar `<path>.lock` (stable inode). Re-entrant per thread. Canonical order: alphabetical |
+| `src/approved_tasks_cli.py` | CLI wrapper for shell scripts: writes to `approved-tasks.json` through jsonlock. Usage: `python3 -m src.approved_tasks_cli set-status --slug X --status Y` |
| `cron/jobs.json` | APScheduler jobs (flat schema, Europe/Bucharest) |
| `approved-tasks.json` | Ralph coordination file: status of autonomous projects (extended with `planning_session_id`, `final_plan_path`) |
| `tasks/lessons.md` | Lessons captured from Marius's corrections (read at session start) |

@@ -195,7 +195,7 @@
"channel": "echo-work",
"model": "sonnet",
"enabled": true,
"prompt": "RAPORT SEARĂ - trimite pe EMAIL (Gmail: mmarius28@gmail.com)\n\n## CALENDAR\nVerifică ce ai mâine și săptămâna:\n```bash\ncd ~/echo-core && source venv/bin/activate && python3 tools/calendar_check.py today\npython3 tools/calendar_check.py week\n```\n\n## CITEȘTE CONTEXT\n- USER.md pentru programul lui Marius (luni-joi 15-16 liber, vineri-dum NLP)\n- memory/kb/insights/YYYY-MM-DD.md pentru propuneri insights\n- memory/kb/youtube/ și memory/kb/articole/ pentru inspirație proiecte\n- /home/moltbot/echo-core/approved-tasks.json pentru status proiecte existente (câmpurile: name, status, proposed_at)\n\n## FORMAT EMAIL HTML\n- Font: 16px text, 18px titluri\n- Culori: albastru (#dbeafe) DONE, gri (#f3f4f6) PROGRAMAT, verde (#d1fae5) PROJECTS\n- Link-uri vizibile\n\n## STRUCTURA RAPORT\n\n### 1. MÂINE\n- 📅 Evenimente calendar\n- 🚂 Travel reminders\n\n### 2. STATUS\n- Ce s-a făcut azi\n- Git status\n\n### 3. PROPUNERI CU ZI ȘI ORĂ!\n\n**OBLIGATORIU:** Fiecare propunere TU+EU sau FAC TU trebuie să aibă ZI și ORĂ concrete!\n\nReguli programare:\n- Luni-Joi 15:00-16:00 = slot liber\n- Vineri-Duminică = NLP, evită\n- Verifică calendar să nu fie ocupat\n- Sesiuni scurte: 15-30 min\n\n### 4. 
PROGRAME/PROIECTE PRACTICE 💻\n\n**CONTEXT OBLIGATORIU - citește înainte de a propune:**\n\n**Proiecte existente (PRIORITARE pentru features):**\n- **roa2web** (gitea.romfast.ro/romfast/roa2web) - FastAPI+Vue.js+Telegram bot\n - Are deja: balanță, facturi, trezorerie\n - Lipsesc: validări declarații ANAF, facturare valută/taxare inversă, notificări\n - Rapoarte ROA noi → FEATURE în roa2web, NU proiect separat!\n- **Chatbot Maria** (Flowise pe LXC 104, ngrok → romfast.ro/chatbot_maria.html)\n - Document store: XML, MD | Groq gratuit + Ollama embeddings + FAISS\n - Problema: răspunsuri nu sunt suficient de bune\n - Angajatul nou poate menține documentația (scrie TXT, trebuie converter)\n - Clientii îl accesează din programele ROA direct\n\n**Întrebări frecvente clienți (surse de proiecte):**\n- Erori validare declarații ANAF (D406, D394, D100 etc.)\n- Cum facturez în valută cu taxare inversă?\n- Probleme la instalări, inițializări firme noi, configurări\n\n**Reguli propuneri (80/20 STRICT):**\n- Impact mare pentru Marius → apoi pentru clienți ERP ROA\n- Inspirat din discovery (YouTube, articole, insights procesate)\n- Features roa2web > proiecte noi (integrare în existent)\n- Proiecte independente doar dacă NU se potrivesc în roa2web/Flowise\n\n**A. FEATURES PROIECTE EXISTENTE (2-3, PRIORITAR):**\n\nFormat:\n```\n### ⚡ F1 - Feature pentru [roa2web/chatbot]\n**Ce face:** Descriere scurtă\n**De ce:** Ce problemă rezolvă (ex: \"clienții întreabă X de 5 ori/săptămână\")\n**Complexitate:** S/M/L\n**Proiect:** roa2web / chatbot-maria\n```\n\n**B. 
PROIECTE NOI (max 1, doar dacă nu se integrează în existente):**\n\nFormat:\n```\n### 💻 P1 - Nume Proiect\n**De ce:** Cum se leagă de nevoile lui Marius/clienți\n**Impact:** Pentru Marius + pentru clienți\n**Efort:** Ore/zile realist\n**Stack:** Simplu (80/20)\n**Sursă:** [Link nota KB]\n```\n\n**NU propune:**\n- Proiecte complexe fără beneficiu clar\n- Proiecte duplicat cu ce există deja\n- Rapoarte ROA ca proiect separat (→ feature roa2web)\n\n### 5. INSIGHTS DISPONIBILE\nListează insights-uri [ ] nepropuse încă (format scurt).\n\n### 6. CUM RĂSPUNZI\n- DA = aprob toate (cu zilele/orele propuse)\n- 1 pentru A1,A2 = execut ACUM\n- 2 pentru A3 = programez noapte\n- 3 pentru A5 = skip\n- **F pentru F1,F3** = implementează features (joburi noapte)\n- **P pentru P1** = creează proiect nou (job noapte)\n- Alt orar = \"A1 miercuri nu marți\"\n\n## IMPLEMENTARE PROIECTE APROBATE\n\nCând propui features (F) sau proiecte (P), adaugă-le automat în /home/moltbot/echo-core/approved-tasks.json cu status 'pending':\n```bash\npython3 -c \"\nimport json, datetime\nf = open('/home/moltbot/echo-core/approved-tasks.json')\ndata = json.load(f); f.close()\ndata['projects'].append({'name': 'SLUG-PROIECT', 'description': 'DESCRIERE', 'status': 'pending', 'proposed_at': datetime.datetime.utcnow().isoformat(), 'approved_at': None, 'started_at': None, 'pid': None})\ndata['last_updated'] = datetime.datetime.utcnow().isoformat()\nopen('/home/moltbot/echo-core/approved-tasks.json', 'w').write(json.dumps(data, indent=2))\n\"\n```\n\nÎn email, arată lui Marius comanda de aprobare:\n`!approve SLUG-PROIECT` (trimite pe Discord/Telegram la Echo)\n\nNight-execute (23:00) va:\n - genera PRD cu ralph_prd_generator.py dacă nu există prd.json\n - lansa ralph.sh 15 iterații pentru fiecare proiect aprobat\n\n## TRIMITERE\npython3 /home/moltbot/echo-core/tools/email_send.py \"mmarius28@gmail.com\" \"Raport Seara DATA\" \"HTML_CONTENT\"\n\nNU trimite pe Discord - doar email.",
|
"prompt": "RAPORT SEARĂ - trimite pe EMAIL (Gmail: mmarius28@gmail.com)\n\n## CALENDAR\nVerifică ce ai mâine și săptămâna:\n```bash\ncd ~/echo-core && source venv/bin/activate && python3 tools/calendar_check.py today\npython3 tools/calendar_check.py week\n```\n\n## CITEȘTE CONTEXT\n- USER.md pentru programul lui Marius (luni-joi 15-16 liber, vineri-dum NLP)\n- memory/kb/insights/YYYY-MM-DD.md pentru propuneri insights\n- memory/kb/youtube/ și memory/kb/articole/ pentru inspirație proiecte\n- /home/moltbot/echo-core/approved-tasks.json pentru status proiecte existente (câmpurile: name, status, proposed_at)\n\n## FORMAT EMAIL HTML\n- Font: 16px text, 18px titluri\n- Culori: albastru (#dbeafe) DONE, gri (#f3f4f6) PROGRAMAT, verde (#d1fae5) PROJECTS\n- Link-uri vizibile\n\n## STRUCTURA RAPORT\n\n### 1. MÂINE\n- 📅 Evenimente calendar\n- 🚂 Travel reminders\n\n### 2. STATUS\n- Ce s-a făcut azi\n- Git status\n\n### 3. PROPUNERI CU ZI ȘI ORĂ!\n\n**OBLIGATORIU:** Fiecare propunere TU+EU sau FAC TU trebuie să aibă ZI și ORĂ concrete!\n\nReguli programare:\n- Luni-Joi 15:00-16:00 = slot liber\n- Vineri-Duminică = NLP, evită\n- Verifică calendar să nu fie ocupat\n- Sesiuni scurte: 15-30 min\n\n### 4. 
PROGRAME/PROIECTE PRACTICE 💻\n\n**CONTEXT OBLIGATORIU - citește înainte de a propune:**\n\n**Proiecte existente (PRIORITARE pentru features):**\n- **roa2web** (gitea.romfast.ro/romfast/roa2web) - FastAPI+Vue.js+Telegram bot\n - Are deja: balanță, facturi, trezorerie\n - Lipsesc: validări declarații ANAF, facturare valută/taxare inversă, notificări\n - Rapoarte ROA noi → FEATURE în roa2web, NU proiect separat!\n- **Chatbot Maria** (Flowise pe LXC 104, ngrok → romfast.ro/chatbot_maria.html)\n - Document store: XML, MD | Groq gratuit + Ollama embeddings + FAISS\n - Problema: răspunsuri nu sunt suficient de bune\n - Angajatul nou poate menține documentația (scrie TXT, trebuie converter)\n - Clientii îl accesează din programele ROA direct\n\n**Întrebări frecvente clienți (surse de proiecte):**\n- Erori validare declarații ANAF (D406, D394, D100 etc.)\n- Cum facturez în valută cu taxare inversă?\n- Probleme la instalări, inițializări firme noi, configurări\n\n**Reguli propuneri (80/20 STRICT):**\n- Impact mare pentru Marius → apoi pentru clienți ERP ROA\n- Inspirat din discovery (YouTube, articole, insights procesate)\n- Features roa2web > proiecte noi (integrare în existent)\n- Proiecte independente doar dacă NU se potrivesc în roa2web/Flowise\n\n**A. FEATURES PROIECTE EXISTENTE (2-3, PRIORITAR):**\n\nFormat:\n```\n### ⚡ F1 - Feature pentru [roa2web/chatbot]\n**Ce face:** Descriere scurtă\n**De ce:** Ce problemă rezolvă (ex: \"clienții întreabă X de 5 ori/săptămână\")\n**Complexitate:** S/M/L\n**Proiect:** roa2web / chatbot-maria\n```\n\n**B. 
PROIECTE NOI (max 1, doar dacă nu se integrează în existente):**\n\nFormat:\n```\n### 💻 P1 - Nume Proiect\n**De ce:** Cum se leagă de nevoile lui Marius/clienți\n**Impact:** Pentru Marius + pentru clienți\n**Efort:** Ore/zile realist\n**Stack:** Simplu (80/20)\n**Sursă:** [Link nota KB]\n```\n\n**NU propune:**\n- Proiecte complexe fără beneficiu clar\n- Proiecte duplicat cu ce există deja\n- Rapoarte ROA ca proiect separat (→ feature roa2web)\n\n### 5. INSIGHTS DISPONIBILE\nListează insights-uri [ ] nepropuse încă (format scurt).\n\n### 6. CUM RĂSPUNZI\n- DA = aprob toate (cu zilele/orele propuse)\n- 1 pentru A1,A2 = execut ACUM\n- 2 pentru A3 = programez noapte\n- 3 pentru A5 = skip\n- **F pentru F1,F3** = implementează features (joburi noapte)\n- **P pentru P1** = creează proiect nou (job noapte)\n- Alt orar = \"A1 miercuri nu marți\"\n\n## IMPLEMENTARE PROIECTE APROBATE\n\nCând propui features (F) sau proiecte (P), adaugă-le automat în /home/moltbot/echo-core/approved-tasks.json cu status 'pending'. Folosește CLI wrapper-ul (atomic, flock-safe — NU edita fișierul direct):\n```bash\ncd /home/moltbot/echo-core && source .venv/bin/activate && \\\n python3 -m src.approved_tasks_cli add-project \\\n --slug SLUG-PROIECT \\\n --description \"DESCRIERE COMPLETĂ\"\n```\n\nÎn email, arată lui Marius comanda de aprobare:\n`!approve SLUG-PROIECT` (trimite pe Discord/Telegram la Echo)\n\nNight-execute (23:00) va:\n - genera PRD cu ralph_prd_generator.py dacă nu există prd.json\n - lansa ralph.sh 15 iterații pentru fiecare proiect aprobat\n\n## TRIMITERE\npython3 /home/moltbot/echo-core/tools/email_send.py \"mmarius28@gmail.com\" \"Raport Seara DATA\" \"HTML_CONTENT\"\n\nNU trimite pe Discord - doar email.",
"allowed_tools": [],
"last_run": "2026-04-27T21:00:00.003134+00:00",
"last_status": "ok",
@@ -269,9 +269,9 @@
"prompt": "Heartbeat check. Rulează src/heartbeat.py printr-un scurt raport de status.\nDacă nu e nimic de raportat (email=0, calendar nu are evenimente <2h, kb ok), răspunde doar cu HEARTBEAT_OK și oprește-te — nu trimite mesaj.\nDacă e ceva: raport scurt pe Discord #echo-work.",
|
"prompt": "Heartbeat check. Rulează src/heartbeat.py printr-un scurt raport de status.\nDacă nu e nimic de raportat (email=0, calendar nu are evenimente <2h, kb ok), răspunde doar cu HEARTBEAT_OK și oprește-te — nu trimite mesaj.\nDacă e ceva: raport scurt pe Discord #echo-work.",
|
||||||
"allowed_tools": [],
|
"allowed_tools": [],
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
"last_run": "2026-04-27T18:00:00.002242+00:00",
|
"last_run": "2026-04-28T06:00:00.001979+00:00",
|
||||||
"last_status": "ok",
|
"last_status": "ok",
|
||||||
"next_run": "2026-04-28T06:00:00+00:00"
|
"next_run": "2026-04-28T08:00:00+00:00"
|
||||||
},
{
"name": "night-execute",
@@ -279,7 +279,7 @@
"channel": "echo-work",
"model": "opus",
"enabled": true,
"prompt": "NIGHT-EXECUTE - Implementare autonoma proiecte aprobate\n\n## PASUL 1: Citeste proiectele aprobate\n\nCiteste /home/moltbot/echo-core/approved-tasks.json\nSelecteaza proiectele cu status='approved'\nDaca nu sunt proiecte aprobate: raporteaza pe Discord si opreste-te.\n\n## PASUL 2: Pentru fiecare proiect aprobat\n\n1. Verifica daca workspace-ul exista: /home/moltbot/workspace/{name}\n - Daca nu: TOKEN=$(grep GITEA_TOKEN /home/moltbot/echo-core/dashboard/.env | cut -d= -f2) && git clone https://moltbot:${TOKEN}@gitea.romfast.ro/romfast/{name}.git /home/moltbot/workspace/{name}\n\n2. Verifica daca prd.json exista: /home/moltbot/workspace/{name}/scripts/ralph/prd.json\n - Daca nu: ruleaza generatorul PRD:\n source .venv/bin/activate\n python3 tools/ralph_prd_generator.py \"{name}\" \"{description}\" /home/moltbot/workspace\n\n3. Lanseaza Ralph loop:\n cd /home/moltbot/workspace/{name}\n chmod +x scripts/ralph/ralph.sh\n mkdir -p scripts/ralph/logs\n nohup ./scripts/ralph/ralph.sh 15 > scripts/ralph/logs/ralph-$(date +%Y%m%d).log 2>&1 &\n echo $! > scripts/ralph/.ralph.pid\n\n4. Actualizeaza approved-tasks.json:\n - status: 'running'\n - started_at: timestamp curent\n - pid: PID din .ralph.pid\n\n## PASUL 3: Raport Discord\n\nTrimite pe echo-work:\n- Cate proiecte au pornit\n- PID-urile lor\n- 'morning-report va raporta progresul la 08:30'\n\n## REGULI IMPORTANTE\n\n- Nu modifica niciodata src/router.py, src/claude_session.py sau alte fisiere core echo-core prin Ralph\n- echo-core self-improvement NUMAI pe branch ralph/echo-improve, nu pe master\n- Daca ralph.sh esueaza: log in approved-tasks.json (status: failed, error: mesaj)\n- Delay 5 secunde intre proiecte pentru a evita rate limiting\n",
|
"prompt": "NIGHT-EXECUTE - Implementare autonoma proiecte aprobate\n\n## PASUL 1: Citeste proiectele aprobate\n\nCiteste /home/moltbot/echo-core/approved-tasks.json\nSelecteaza proiectele cu status='approved'\nDaca nu sunt proiecte aprobate: raporteaza pe Discord si opreste-te.\n\n## PASUL 2: Pentru fiecare proiect aprobat\n\n1. Verifica daca workspace-ul exista: /home/moltbot/workspace/{name}\n - Daca nu: TOKEN=$(grep GITEA_TOKEN /home/moltbot/echo-core/dashboard/.env | cut -d= -f2) && git clone https://moltbot:${TOKEN}@gitea.romfast.ro/romfast/{name}.git /home/moltbot/workspace/{name}\n\n2. Verifica daca prd.json exista: /home/moltbot/workspace/{name}/scripts/ralph/prd.json\n - Daca nu: ruleaza generatorul PRD:\n source .venv/bin/activate\n python3 tools/ralph_prd_generator.py \"{name}\" \"{description}\" /home/moltbot/workspace\n\n3. Lanseaza Ralph loop:\n cd /home/moltbot/workspace/{name}\n chmod +x scripts/ralph/ralph.sh\n mkdir -p scripts/ralph/logs\n nohup ./scripts/ralph/ralph.sh 15 > scripts/ralph/logs/ralph-$(date +%Y%m%d).log 2>&1 &\n echo $! > scripts/ralph/.ralph.pid\n\n4. 
Actualizeaza approved-tasks.json prin CLI wrapper-ul atomic (NU edita fisierul direct — foloseste flock):\n ```bash\n PID=$(cat /home/moltbot/workspace/{name}/scripts/ralph/.ralph.pid)\n cd /home/moltbot/echo-core && source .venv/bin/activate && \\\n python3 -m src.approved_tasks_cli mark-running --slug {name} --pid \"$PID\"\n ```\n (echivalent cu: status='running', started_at=now, pid=PID — toate intr-un singur write_locked)\n\n## PASUL 3: Raport Discord\n\nTrimite pe echo-work:\n- Cate proiecte au pornit\n- PID-urile lor\n- 'morning-report va raporta progresul la 08:30'\n\n## REGULI IMPORTANTE\n\n- Nu modifica niciodata src/router.py, src/claude_session.py sau alte fisiere core echo-core prin Ralph\n- echo-core self-improvement NUMAI pe branch ralph/echo-improve, nu pe master\n- Daca ralph.sh esueaza, marcheaza proiectul prin CLI wrapper:\n ```bash\n cd /home/moltbot/echo-core && source .venv/bin/activate && \\\n python3 -m src.approved_tasks_cli mark-failed --slug {name} --error \"MESAJ\"\n ```\n- Delay 5 secunde intre proiecte pentru a evita rate limiting\n",
|
||||||
"allowed_tools": [
|
"allowed_tools": [
|
||||||
"Bash",
|
"Bash",
|
||||||
"Read",
|
"Read",
|
||||||

dashboard/DESIGN.md (new file, 317 lines)
@@ -0,0 +1,317 @@

# Echo Dashboard — Design System

This document is the source of truth for visual decisions across the Echo
Dashboard (port 8088, served at `/echo/`). Tokens live in
`dashboard/static/tokens.css`. Page-level CSS is in `common.css` and per-page
`<style>` blocks. **Pages must include `tokens.css` before `common.css`.**

---

## Theme

- **Default**: dark. Background `--bg-base: #13131a` (near-black neutral).
- **Light theme**: opt-in via `<html data-theme="light">`. Light tokens override
  the dark palette in the same `:root`-equivalent block.
- **Toggle**: header `.theme-toggle` button — persisted in `localStorage`.

Surfaces are translucent overlays on `--bg-base`, never solid greys, so
elevation reads consistently against future backgrounds.

---
## Color tokens

### Surfaces (dark)

| Token | Value | Use |
|---|---|---|
| `--bg-base` | `#13131a` | App background |
| `--bg-surface` | `rgba(255,255,255,0.12)` | Cards, panels, inputs |
| `--bg-surface-hover` | `rgba(255,255,255,0.16)` | Hover state on surfaces |
| `--bg-surface-active` | `rgba(255,255,255,0.20)` | Pressed / active surfaces |
| `--bg-elevated` | `rgba(255,255,255,0.14)` | Selects, popovers |
| `--header-bg` | `rgba(19,19,26,0.95)` | Sticky header backdrop |

### Text

| Token | Value | Use |
|---|---|---|
| `--text-primary` | `#ffffff` | Headings, key labels |
| `--text-secondary` | `#f5f5f5` | Body copy |
| `--text-muted` | `#e5e5e5` | Meta, timestamps, captions |

### Accent + borders

| Token | Value | Use |
|---|---|---|
| `--accent` | `#3b82f6` | Primary buttons, focus, links |
| `--accent-hover` | `#2563eb` | Hover on `--accent` |
| `--accent-subtle` | `rgba(59,130,246,0.2)` | Active nav background |
| `--border` | `rgba(255,255,255,0.3)` | Card / input outline |
| `--border-focus` | `rgba(59,130,246,0.7)` | Card hover, input focus |

### Semantic state

| Token | Value | Meaning |
|---|---|---|
| `--success` | `#22c55e` | OK, saved, healthy |
| `--warning` | `#eab308` | Caution, soft fail |
| `--error` | `#ef4444` | Hard fail, destructive |

### Status palette (workflow states)

These drive the `.status-pill[data-status]` system on workspace cards.

| Token | Value | State name |
|---|---|---|
| `--status-running` | `rgb(34, 197, 94)` | `running-ralph`, `running-manual` |
| `--status-blocked` | `rgb(245, 158, 11)` | `blocked` |
| `--status-failed` | `rgb(239, 68, 68)` | `failed` |
| `--status-complete` | `rgb(156, 163, 175)` | `complete` |
| `--status-idle` | `var(--text-muted)` | `idle` |
| `--status-planning` | `rgb(167, 139, 250)` | `planning` *(new)* |
| `--status-pending` | `rgb(96, 165, 250)` | `pending` *(new)* |
| `--status-approved` | `rgb(234, 179, 8)` | `approved` *(new)* |

---
## Typography

- **Sans**: `'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif`
  — self-hosted woff2 at `/echo/static/fonts/inter-{400,500,600,700}.woff2`.
- **Mono**: `'JetBrains Mono', 'Fira Code', ui-monospace, monospace` — for
  logs, code blocks, slugs, IDs. Loaded by browser if present (not bundled).

### Size scale

| Token | rem | px @ 16px |
|---|---|---|
| `--text-xs` | 0.75 | 12 |
| `--text-sm` | 0.875 | 14 |
| `--text-base` | 1 | 16 |
| `--text-lg` | 1.125 | 18 |
| `--text-xl` | 1.25 | 20 |

### Weights

400 (body), 500 (medium emphasis), 600 (headings, button labels),
700 (rare — page titles only). No 800/900.

---

## Spacing — 8px grid

All padding, margin, and gap values use these tokens. No hard-coded pixels.

| Token | px |
|---|---|
| `--space-1` | 4 |
| `--space-2` | 8 |
| `--space-3` | 12 |
| `--space-4` | 16 |
| `--space-5` | 20 |
| `--space-6` | 24 |
| `--space-8` | 32 |
| `--space-10` | 40 |

---

## Border radius

| Token | px | Use |
|---|---|---|
| `--radius-sm` | 4 | Tags, micro-pills |
| `--radius-md` | 8 | Buttons, inputs |
| `--radius-lg` | 12 | Cards, modals, panels |
| `--radius-full` | 9999 | Status pills, badges, avatars |

---

## Buttons

All buttons share `.btn` (8px radius, 14px font, 8/16 padding,
`--transition-fast`).

| Variant | Class | Surface | Text | Use |
|---|---|---|---|---|
| Primary | `.btn-primary` | `--accent` | white | The one CTA per row |
| Secondary | `.btn-secondary` | `--bg-surface` + `--border` | `--text-secondary` | Side actions |
| Ghost | `.btn-ghost` | transparent | `--text-secondary` | Tertiary, destructive-soft |
| Danger | `.btn-danger` | `--error` | white | Stop, delete, irreversible |

Disabled state: `opacity: 0.5; cursor: not-allowed;`. Never grey out by
swapping colors — keep variant identity.

---
## Card component (`.project-card`)

- `border-radius: var(--radius-lg)` (12px)
- `background: var(--bg-surface)`
- `border: 1px solid var(--border)`
- `padding: var(--space-5)`
- `transition: border-color var(--transition-base)`
- **Hover**: `border-color: var(--border-focus)` (blue glow). No surface
  brightening — border-only hover keeps the grid calm.

---

## Status pill system

A `.status-pill` is a `--radius-full` chip placed on every project card. It
encodes the current workflow state via `data-status="<state>"`.

### Visual recipe

- **Background**: state color at **18% alpha** (`color-mix(in srgb, var(--status-X) 18%, transparent)` or precomputed `rgba(...)`).
- **Text**: solid state color (full alpha).
- **Border**: 1px state color at 30% alpha.
- **Padding**: `var(--space-1) var(--space-3)` — slim.
- **Font**: `var(--text-xs)`, weight 500.
### Pulse-dot
|
||||||
|
|
||||||
|
Active states render a 6px CSS-shape circle that pulses (no SVG, no emoji).
|
||||||
|
|
||||||
|
```css
|
||||||
|
.status-pill::before {
|
||||||
|
content: ""; width: 6px; height: 6px; border-radius: 50%;
|
||||||
|
background: currentColor; margin-right: var(--space-2);
|
||||||
|
}
|
||||||
|
.status-pill[data-status="running-ralph"]::before,
|
||||||
|
.status-pill[data-status="running-manual"]::before,
|
||||||
|
.status-pill[data-status="planning"]::before {
|
||||||
|
animation: pulse-dot 1.6s ease-in-out infinite;
|
||||||
|
}
|
||||||
|
@keyframes pulse-dot { 0%,100% { opacity: 1; } 50% { opacity: 0.35; } }
|
||||||
|
```
|
||||||
|
|
||||||
|
### State matrix
|
||||||
|
|
||||||
|
| `data-status` | Color token | Pulse | Label |
|
||||||
|
|---|---|---|---|
|
||||||
|
| `running-ralph` | `--status-running` | yes | Ralph running |
|
||||||
|
| `running-manual` | `--status-running` | yes | Manual run |
|
||||||
|
| `planning` | `--status-planning` | yes | Planning |
|
||||||
|
| `approved` | `--status-approved` | no | Approved |
|
||||||
|
| `pending` | `--status-pending` | no | Pending |
|
||||||
|
| `blocked` | `--status-blocked` | no | Blocked |
|
||||||
|
| `failed` | `--status-failed` | no | Failed |
|
||||||
|
| `complete` | `--status-complete` | no | Complete |
|
||||||
|
| `idle` | `--status-idle` | no | Idle |
|
||||||
|
|
||||||
|
---

## BUTTONS_FOR_STATE matrix

Each project card surfaces ≤3 actions, ordered Primary / Secondary / Ghost. The renderer picks the row matching `data-status`.

| State | Primary | Secondary | Ghost |
|---|---|---|---|
| `running-ralph` | Stop Ralph (danger) | Logs | PRD |
| `running-manual` | Stop (danger) | Open server | Logs |
| `planning` | Continue chat | — | Cancel |
| `approved` | — | Unapprove | Plan |
| `pending` | Approve | Plan with Echo | Cancel |
| `blocked` | View logs | Resume | — |
| `failed` | View logs | Retry | Rollback |
| `complete` | View plan | — | Run again |
| `idle` | Run Ralph | — | Delete |

Rules:

- **Stop / Delete** are always `.btn-danger`, never primary blue.
- A dash (`—`) means render nothing (no placeholder, no greyed-out slot).
- The Primary slot is the default action when the card is keyboard-focused and Enter is pressed.

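The matrix above can be sketched as a plain lookup table. The object shape and the helper name `buttonsFor` are assumptions for illustration; the dashboard's actual `BUTTONS_FOR_STATE` structure may differ.

```javascript
// Sketch of the state -> actions lookup described above. A null slot is the
// dash in the matrix: render nothing, no greyed-out placeholder.
const BUTTONS_FOR_STATE = {
  'running-ralph':  [{ label: 'Stop Ralph', cls: 'btn-danger' }, { label: 'Logs', cls: 'btn-secondary' }, { label: 'PRD', cls: 'btn-ghost' }],
  'running-manual': [{ label: 'Stop', cls: 'btn-danger' }, { label: 'Open server', cls: 'btn-secondary' }, { label: 'Logs', cls: 'btn-ghost' }],
  'planning':       [{ label: 'Continue chat', cls: 'btn-primary' }, null, { label: 'Cancel', cls: 'btn-ghost' }],
  'approved':       [null, { label: 'Unapprove', cls: 'btn-secondary' }, { label: 'Plan', cls: 'btn-ghost' }],
  'pending':        [{ label: 'Approve', cls: 'btn-primary' }, { label: 'Plan with Echo', cls: 'btn-secondary' }, { label: 'Cancel', cls: 'btn-ghost' }],
  'blocked':        [{ label: 'View logs', cls: 'btn-primary' }, { label: 'Resume', cls: 'btn-secondary' }, null],
  'failed':         [{ label: 'View logs', cls: 'btn-primary' }, { label: 'Retry', cls: 'btn-secondary' }, { label: 'Rollback', cls: 'btn-ghost' }],
  'complete':       [{ label: 'View plan', cls: 'btn-primary' }, null, { label: 'Run again', cls: 'btn-ghost' }],
  'idle':           [{ label: 'Run Ralph', cls: 'btn-primary' }, null, { label: 'Delete', cls: 'btn-ghost' }],
};

// Resolve the renderable buttons for a card's data-status.
function buttonsFor(status) {
  return (BUTTONS_FOR_STATE[status] || []).filter(Boolean);
}
```

Keeping the table as data (rather than per-state `if` branches) makes the ≤3-actions rule and the slot ordering auditable at a glance.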
---

## Toast taxonomy

Toasts appear top-right, stack vertically, and dismiss after 4s (errors: 8s). **Five types**, distinguished by a 3px colored left bar — no emoji, no icon fill. Body uses `--text-primary` on `--bg-surface`.

| Type | Bar color | Use |
|---|---|---|
| `success` | `--success` | Saved, approved, deployed |
| `info` | `--accent` | Neutral confirmation |
| `warning` | `--warning` | Soft fail, retried |
| `busy` | `--status-planning` | Long-running op started |
| `error` | `--error` | Hard fail, action required |

The toast renderer is shared across pages and reads from a single global `window.showToast(type, msg)` helper.

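A minimal sketch of that shared helper, assuming a `.toast-stack` container element and per-type classes; the actual implementation in the dashboard's common JS may differ in those details:

```javascript
// The five toast types from the table above; anything else falls back to info.
const TOAST_TYPES = new Set(['success', 'info', 'warning', 'busy', 'error']);

// Errors stay up 8s, everything else 4s.
function toastDuration(type) {
  return type === 'error' ? 8000 : 4000;
}

function showToast(type, msg) {
  if (!TOAST_TYPES.has(type)) type = 'info';
  const el = document.createElement('div');
  el.className = `toast toast-${type}`;   // 3px left bar colored via CSS
  el.textContent = msg;                   // text only: no emoji, no icon fill
  document.querySelector('.toast-stack').appendChild(el);
  setTimeout(() => el.remove(), toastDuration(type));
}
```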
---

## SSE indicator

Top-right of pages with a live stream (workspace, ralph). Three states, indicated via a CSS-shape pulse-dot — never an emoji.

| State | Dot color | Label | Pulse |
|---|---|---|---|
| Live | `--success` | "Live" | yes |
| Polling | `--warning` | "Polling" | no |
| Offline | `--error` | "Offline" | no |

Uses the same `.pulse-dot` 6px CSS shape as `.status-pill::before`. The dot sits before the label, both inside a tiny `.sse-indicator` chip on `--bg-surface`.

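The table maps directly to a small pure function; the mode strings and function name here are illustrative, not the dashboard's actual API. Wiring would be: `EventSource` `onopen` selects live, `onerror` drops to the polling fallback, and repeated poll failures select offline.

```javascript
// Map the current connection mode to the indicator recipe from the table.
function sseIndicatorState(mode) {
  switch (mode) {
    case 'live':    return { color: '--success', label: 'Live',    pulse: true };
    case 'polling': return { color: '--warning', label: 'Polling', pulse: false };
    default:        return { color: '--error',   label: 'Offline', pulse: false };
  }
}
```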
---

## Modal pattern

Used for the planning chat, PRD viewer, log tail, and propose-feature form.

- **Overlay**: full viewport, `background: rgba(0,0,0,0.6)`, `backdrop-filter: blur(4px)`, `display: flex` centered.
- **Container** (`.modal`): `--radius-lg`, `--bg-base`, `--border`, max-width 720px, max-height 80vh, scroll on overflow.
- **Header / Footer**: 1px border separators using `--border`.
- **Focus trap**: first focusable element gets focus on open; Tab cycles inside the modal.
- **ESC**: closes — but if the modal has unsaved input, prompt "Discard changes?" before closing. Click on overlay = same behavior.
- **Mobile (≤640px)**: full-screen takeover. Header / footer stick; body scrolls. Implemented in `tokens.css` via the shared `@media (max-width:640px)` block.

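The Tab-cycling arithmetic behind the focus trap can be isolated as a pure helper; this is an illustrative sketch, not the dashboard's implementation:

```javascript
// Given the index of the focused element and the count of focusables inside
// the modal, compute where Tab (forward) or Shift+Tab (backward) lands,
// wrapping at both ends so focus never escapes the modal.
function nextFocusIndex(current, count, shiftKey) {
  if (count <= 0) return -1;                 // nothing focusable
  const step = shiftKey ? -1 : 1;
  return (current + step + count) % count;   // wrap in both directions
}
```

A keydown handler would call this for `event.key === 'Tab'`, then `preventDefault()` and focus the element at the returned index.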
---

## No-emoji rule

**No emoji anywhere in the dashboard.** This is a hard rule, not a preference.

- Buttons are **text-only**. No leading/trailing emoji decoration.
- Status indicators use **CSS-shape colored dots** (`.pulse-dot`, `.status-pill::before`) — never `🟢 ⏱ 🛑 ✅` etc.
- The login monogram is the **letter `E`** rendered in Inter 700 inside a square with `--accent` background. Not an emoji, not an SVG logo.
- Where icons are needed (nav, action buttons), use **Lucide-style stroke SVGs inlined** — `stroke: currentColor`, `fill: none`, `stroke-width: 2`, `stroke-linecap: round`, `stroke-linejoin: round`. Never use emoji as a substitute for an icon.

This rule keeps the UI legible across themes, scales correctly at all sizes, and avoids OS-dependent rendering (Apple, Twemoji, and Noto all draw the same emoji differently).

---

## Pages that include this system

Every dashboard page (`index.html`, `workspace.html`, `ralph.html`, `notes.html`, `habits.html`, `files.html`, `login.html`) **must** include in `<head>`:

```html
<link rel="stylesheet" href="/echo/static/tokens.css">
<link rel="stylesheet" href="/echo/common.css">
```

In that order — tokens first so `common.css` and per-page styles can resolve the variables.
163
dashboard/api.py
@@ -6,6 +6,7 @@ responsible only for URL dispatch, CORS, JSON response helpers, and
 server bootstrap.
 """
 import json
+import os
 import sys
 from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
 from pathlib import Path
@@ -34,12 +35,14 @@ from constants import (  # noqa: E402 re-exported for tests
     VENV_PYTHON,
     WORKSPACE_DIR,
 )
+from handlers.auth import AuthHandlers  # noqa: E402
 from handlers.cron import CronHandlers  # noqa: E402
 from handlers.eco import EcoHandlers  # noqa: E402
 from handlers.files import FilesHandlers  # noqa: E402
 from handlers.git import GitHandlers  # noqa: E402
 from handlers.habits import HabitsHandlers  # noqa: E402
 from handlers.pdf import PDFHandlers  # noqa: E402
+from handlers.projects import ProjectsHandlers  # noqa: E402
 from handlers.ralph import RalphHandlers  # noqa: E402
 from handlers.workspace import WorkspaceHandlers  # noqa: E402
 from handlers.youtube import YoutubeHandlers  # noqa: E402
@@ -59,10 +62,6 @@ NAV_HTML = '''<header class="header">
     <i data-lucide="code"></i>
     <span>Workspace</span>
   </a>
-  <a href="/echo/ralph.html" class="nav-item" data-page="ralph">
-    <i data-lucide="bot"></i>
-    <span>Ralph</span>
-  </a>
   <a href="/echo/notes.html" class="nav-item" data-page="notes">
     <i data-lucide="file-text"></i>
     <span>KB</span>
@@ -93,6 +92,8 @@ NAV_HTML = '''<header class="header">


 class TaskBoardHandler(
+    AuthHandlers,
+    ProjectsHandlers,
     GitHandlers,
     HabitsHandlers,
     EcoHandlers,
@@ -133,6 +134,36 @@ class TaskBoardHandler(
     # ── dispatch ────────────────────────────────────────────────
     def do_GET(self):
         from datetime import datetime as _dt
+        import os
+        # Static assets — served directly from dashboard/static/. Handles the
+        # case where the URL is hit with the /echo/ prefix intact (e.g. direct
+        # localhost curl); when behind the reverse proxy that strips /echo/,
+        # the request falls through to SimpleHTTPRequestHandler which serves
+        # cwd/static/ naturally (cwd is set to KANBAN_DIR/dashboard).
+        if self.path.startswith('/echo/static/'):
+            rel = self.path[len('/echo/static/'):].split('?', 1)[0]
+            file_path = os.path.join(os.path.dirname(__file__), 'static', rel)
+            if os.path.isfile(file_path):
+                ext = os.path.splitext(rel)[1].lstrip('.').lower()
+                ctype = {
+                    'css': 'text/css',
+                    'woff2': 'font/woff2',
+                    'woff': 'font/woff',
+                    'js': 'application/javascript',
+                    'svg': 'image/svg+xml',
+                    'png': 'image/png',
+                }.get(ext, 'application/octet-stream')
+                with open(file_path, 'rb') as f:
+                    data = f.read()
+                self.send_response(200)
+                self.send_header('Content-Type', ctype)
+                self.send_header('Content-Length', str(len(data)))
+                self.send_header('Cache-Control', 'public, max-age=86400')
+                self.end_headers()
+                self.wfile.write(data)
+            else:
+                self.send_error(404)
+            return
         if self.path == '/api/status':
             self.send_json({'status': 'ok', 'time': _dt.now().isoformat()})
         elif self.path == '/api/git' or self.path.startswith('/api/git?'):
@@ -182,6 +213,55 @@ class TaskBoardHandler(
                 self.send_error(404)
             else:
                 self.send_error(404)
+        elif self.path == '/api/projects' or self.path.startswith('/api/projects?'):
+            self.handle_unified_status()
+        elif self.path == '/api/projects/signature' or self.path.startswith('/api/projects/signature?'):
+            self.handle_unified_signature()
+        elif self.path == '/api/projects/stream' or self.path.startswith('/api/projects/stream?'):
+            self.handle_projects_stream()
+        elif self.path.startswith('/api/projects/'):
+            # /api/projects/<slug>/plan/(state|transcript)
+            parts = self.path.split('?', 1)[0].split('/')
+            # parts: ['', 'api', 'projects', '<slug>', 'plan', '<action>']
+            if len(parts) >= 6 and parts[4] == 'plan':
+                slug = parts[3]
+                action = parts[5]
+                if action == 'state':
+                    self.handle_plan_state(slug)
+                elif action == 'transcript':
+                    self.handle_plan_transcript(slug)
+                else:
+                    self.send_error(404)
+            else:
+                self.send_error(404)
+        elif self.path == '/echo/login' or self.path.startswith('/echo/login?'):
+            # If already logged in, redirect to workspace; otherwise serve
+            # login.html (created in Lane B2).
+            if self._check_dashboard_cookie():
+                self.send_response(302)
+                self.send_header('Location', '/echo/workspace.html')
+                self.send_header('Content-Length', '0')
+                self.end_headers()
+                return
+            login_html = KANBAN_DIR / 'login.html'
+            if login_html.is_file():
+                body = login_html.read_text('utf-8').replace('<!--NAV-->', NAV_HTML).encode('utf-8')
+                self.send_response(200)
+                self.send_header('Content-Type', 'text/html; charset=utf-8')
+                self.send_header('Content-Length', str(len(body)))
+                self.send_header('Cache-Control', 'no-cache')
+                self.end_headers()
+                self.wfile.write(body)
+            else:
+                # Lane B2 hasn't shipped yet — return 503 with a hint.
+                self.send_error(503, 'login.html not yet available')
+        elif self.path == '/ralph.html' or self.path.startswith('/ralph.html?'):
+            # Legacy redirect — Ralph dashboard merged into workspace.html (Lane D1).
+            self.send_response(302)
+            self.send_header('Location', '/echo/workspace.html')
+            self.send_header('Content-Length', '0')
+            self.end_headers()
+            return
         elif self.path.startswith('/api/'):
             self.send_error(404)
         else:
@@ -206,7 +286,49 @@ class TaskBoardHandler(
             return
         super().do_GET()

+    # POSTs that bypass the auth middleware (login itself can't require a cookie).
+    UNPROTECTED_POSTS = frozenset({'/api/auth/login'})
+
     def do_POST(self):
+        # ── Auth middleware ────────────────────────────────────────
+        # Only protect /api/* POSTs for now — older endpoints predate auth and
+        # we want a single, well-defined gate. Static asset POSTs (none today)
+        # would also fall through.
+        path_only = self.path.split('?', 1)[0]
+        if path_only.startswith('/api/') and path_only not in self.UNPROTECTED_POSTS:
+            if not self._check_dashboard_cookie():
+                body = b'{"error":"Unauthorized"}'
+                self.send_response(401)
+                self.send_header('Content-Type', 'application/json')
+                self.send_header('Content-Length', str(len(body)))
+                self.send_header('Cache-Control', 'no-store')
+                self.end_headers()
+                try:
+                    self.wfile.write(body)
+                except (BrokenPipeError, ConnectionResetError):
+                    pass
+                return
+            # CSRF: require Origin (or Referer) to be on the allowlist.
+            origin = self.headers.get('Origin', '') or ''
+            referer = self.headers.get('Referer', '') or ''
+            allowed = ['http://127.0.0.1:8088', 'http://localhost:8088']
+            dh = os.environ.get('DASHBOARD_HOST', '').strip()
+            if dh:
+                allowed.append(dh)
+            check = origin or referer
+            if check and not any(check.startswith(a) for a in allowed):
+                body = b'{"error":"CSRF"}'
+                self.send_response(403)
+                self.send_header('Content-Type', 'application/json')
+                self.send_header('Content-Length', str(len(body)))
+                self.send_header('Cache-Control', 'no-store')
+                self.end_headers()
+                try:
+                    self.wfile.write(body)
+                except (BrokenPipeError, ConnectionResetError):
+                    pass
+                return
+
         if self.path == '/api/youtube':
             self.handle_youtube()
         elif self.path == '/api/files':
@@ -255,6 +377,39 @@ class TaskBoardHandler(
                 self.send_error(404)
             else:
                 self.send_error(404)
+        elif self.path == '/api/auth/login':
+            self.handle_login()
+        elif self.path == '/api/auth/logout':
+            self.handle_logout()
+        elif self.path == '/api/projects/propose':
+            self.handle_propose()
+        elif self.path == '/api/projects/approve':
+            self.handle_approve()
+        elif self.path == '/api/projects/unapprove':
+            self.handle_unapprove()
+        elif self.path == '/api/projects/cancel':
+            self.handle_cancel()
+        elif self.path.startswith('/api/projects/'):
+            # /api/projects/<slug>/plan/(start|respond|finalize|cancel|advance)
+            parts = self.path.split('?', 1)[0].split('/')
+            # parts: ['', 'api', 'projects', '<slug>', 'plan', '<action>']
+            if len(parts) >= 6 and parts[4] == 'plan':
+                slug = parts[3]
+                action = parts[5]
+                if action == 'start':
+                    self.handle_plan_start(slug)
+                elif action == 'respond':
+                    self.handle_plan_respond(slug)
+                elif action == 'finalize':
+                    self.handle_plan_finalize(slug)
+                elif action == 'cancel':
+                    self.handle_plan_cancel_planning(slug)
+                elif action == 'advance':
+                    self.handle_plan_advance(slug)
+                else:
+                    self.send_error(404)
+            else:
+                self.send_error(404)
         else:
             self.send_error(404)

54
dashboard/handlers/_validators.py
Normal file
@@ -0,0 +1,54 @@
+"""Shared validation helpers for dashboard handlers."""
+import json
+import re
+from http.server import BaseHTTPRequestHandler
+
+_SLUG_RE = re.compile(r'^[a-z0-9][a-z0-9\-_]{1,38}[a-z0-9]$')
+
+
+def validate_slug(slug: str) -> str | None:
+    """Returns error message or None if valid."""
+    if not slug:
+        return "slug required"
+    if not _SLUG_RE.match(slug):
+        return "slug must be 3-40 chars, lowercase alphanumeric + hyphens/underscores"
+    return None
+
+
+def validate_description(desc: str) -> str | None:
+    """Returns error message or None if valid. Min 10 chars, max 500."""
+    if not desc or len(desc.strip()) < 10:
+        return "description must be at least 10 characters"
+    if len(desc) > 500:
+        return "description must be at most 500 characters"
+    return None
+
+
+def parse_json_body(handler: BaseHTTPRequestHandler) -> dict | None:
+    """Parse JSON body from request. Returns None on failure (sends 400)."""
+    try:
+        length = int(handler.headers.get('Content-Length', '0') or '0')
+    except (TypeError, ValueError):
+        length = 0
+
+    def _send_error(msg: str) -> None:
+        sender = getattr(handler, 'send_json', None)
+        if callable(sender):
+            sender({'error': msg}, 400)
+            return
+        body = json.dumps({'error': msg}).encode()
+        handler.send_response(400)
+        handler.send_header('Content-Type', 'application/json')
+        handler.send_header('Content-Length', str(len(body)))
+        handler.end_headers()
+        handler.wfile.write(body)
+
+    if length <= 0:
+        _send_error('empty body')
+        return None
+    try:
+        raw = handler.rfile.read(length)
+        return json.loads(raw.decode('utf-8'))
+    except (ValueError, json.JSONDecodeError, UnicodeDecodeError):
+        _send_error('invalid JSON body')
+        return None
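For the propose form, the client can pre-validate with a mirror of the server-side `_SLUG_RE` above. This mirror is a sketch, not code from the commit; the server regex remains the source of truth and the two must be kept in sync by hand.

```javascript
// Client-side mirror of _SLUG_RE in dashboard/handlers/_validators.py:
// lowercase alphanumeric ends, hyphens/underscores inside, 3-40 chars total.
const SLUG_RE = /^[a-z0-9][a-z0-9\-_]{1,38}[a-z0-9]$/;

// Same contract as the server helper: error message, or null if valid.
function slugError(slug) {
  if (!slug) return 'slug required';
  if (!SLUG_RE.test(slug)) {
    return 'slug must be 3-40 chars, lowercase alphanumeric + hyphens/underscores';
  }
  return null;
}
```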
174
dashboard/handlers/auth.py
Normal file
@@ -0,0 +1,174 @@
+"""Cookie-based authentication for the unified dashboard.
+
+This mixin provides:
+
+- POST /api/auth/login — exchanges a token (form body) for a cookie.
+- POST /api/auth/logout — clears the cookie.
+- _check_dashboard_cookie — used by the global POST middleware (and the
+  SSE GET endpoint) to gate access.
+
+`DASHBOARD_TOKEN` is read once from `dashboard/.env` (loaded into
+`os.environ` by `dashboard/constants.py` at import time). When the token is
+not configured we generate a random one at startup, stash it in-process,
+and warn loudly to stderr — this means the dashboard is reachable from
+localhost only with a freshly-printed token (printed once at boot).
+"""
+from __future__ import annotations
+
+import json
+import logging
+import os
+import secrets
+import sys
+from urllib.parse import parse_qs
+
+log = logging.getLogger(__name__)
+
+# 30 days
+_COOKIE_MAX_AGE = 60 * 60 * 24 * 30
+_COOKIE_NAME = "dashboard"
+_COOKIE_PATH = "/echo/"
+
+# Module-level cache for the resolved token. Set lazily on first call so
+# importing this module doesn't have a side effect at process boot.
+_DASHBOARD_TOKEN: str | None = None
+
+
+def _get_dashboard_token() -> str:
+    """Return the dashboard token (cached). Generates a random one if absent.
+
+    `dashboard/constants.py` already loads `dashboard/.env` into os.environ at
+    import time, so by the time this is called the value (if present) is in
+    `os.environ['DASHBOARD_TOKEN']`. If missing, we mint a 32-byte URL-safe
+    token and warn — operators must read it from the log to log in.
+    """
+    global _DASHBOARD_TOKEN
+    if _DASHBOARD_TOKEN is not None:
+        return _DASHBOARD_TOKEN
+    token = os.environ.get("DASHBOARD_TOKEN", "").strip()
+    if not token:
+        token = secrets.token_urlsafe(32)
+        msg = (
+            "[auth] DASHBOARD_TOKEN not set in dashboard/.env — generated a "
+            f"random token for this process: {token}\n"
+            "       Add `DASHBOARD_TOKEN=<value>` to dashboard/.env to make it "
+            "stable across restarts.\n"
+        )
+        print(msg, file=sys.stderr, flush=True)
+        log.warning("DASHBOARD_TOKEN not configured — using ephemeral token")
+    _DASHBOARD_TOKEN = token
+    return token
+
+
+def _parse_cookie_header(raw: str) -> dict[str, str]:
+    """Tiny RFC 6265 cookie-pair parser. Last-write-wins on duplicates."""
+    out: dict[str, str] = {}
+    if not raw:
+        return out
+    for chunk in raw.split(";"):
+        chunk = chunk.strip()
+        if not chunk or "=" not in chunk:
+            continue
+        k, v = chunk.split("=", 1)
+        out[k.strip()] = v.strip()
+    return out
+
+
+class AuthHandlers:
+    """Mixin: /api/auth/login, /api/auth/logout, plus _check_dashboard_cookie."""
+
+    # ── helpers ────────────────────────────────────────────────────────
+    def _check_dashboard_cookie(self) -> bool:
+        """Return True if the request carries a valid `dashboard` cookie."""
+        raw = self.headers.get("Cookie", "") or ""
+        cookies = _parse_cookie_header(raw)
+        provided = cookies.get(_COOKIE_NAME, "")
+        if not provided:
+            return False
+        expected = _get_dashboard_token()
+        # Constant-time compare — token guess attacks aren't realistic here
+        # (cookie path is /echo/, HttpOnly), but cheap defense in depth.
+        return secrets.compare_digest(provided, expected)
+
+    def _read_form_body(self) -> dict[str, str]:
+        """Parse `application/x-www-form-urlencoded` POST body."""
+        try:
+            length = int(self.headers.get("Content-Length", "0") or "0")
+        except (TypeError, ValueError):
+            length = 0
+        if length <= 0:
+            return {}
+        try:
+            raw = self.rfile.read(length).decode("utf-8")
+        except (UnicodeDecodeError, OSError):
+            return {}
+        parsed = parse_qs(raw, keep_blank_values=True)
+        # Flatten — single-value form fields only
+        return {k: v[0] for k, v in parsed.items() if v}
+
+    # ── POST /api/auth/login ───────────────────────────────────────────
+    def handle_login(self):
+        """Validate token from form body; on success, set cookie + 302 to workspace.
+
+        On failure, return 401 JSON. The cookie is set with HttpOnly +
+        SameSite=Strict; Path=/echo/ so it scopes to the dashboard reverse
+        proxy mount.
+        """
+        # Accept JSON body too (login.html might POST JSON in Lane B2)
+        ctype = (self.headers.get("Content-Type", "") or "").lower()
+        if "application/json" in ctype:
+            try:
+                length = int(self.headers.get("Content-Length", "0") or "0")
+                raw = self.rfile.read(length).decode("utf-8") if length > 0 else ""
+                form = json.loads(raw) if raw else {}
+                if not isinstance(form, dict):
+                    form = {}
+            except (ValueError, json.JSONDecodeError, UnicodeDecodeError, OSError):
+                form = {}
+        else:
+            form = self._read_form_body()
+
+        provided = (form.get("token") or "").strip()
+        expected = _get_dashboard_token()
+        if not provided or not secrets.compare_digest(provided, expected):
+            body = json.dumps({"error": "Invalid token"}).encode("utf-8")
+            self.send_response(401)
+            self.send_header("Content-Type", "application/json")
+            self.send_header("Content-Length", str(len(body)))
+            self.send_header("Cache-Control", "no-store")
+            self.end_headers()
+            try:
+                self.wfile.write(body)
+            except (BrokenPipeError, ConnectionResetError):
+                pass
+            return
+
+        cookie = (
+            f"{_COOKIE_NAME}={expected}; HttpOnly; SameSite=Strict; "
+            f"Path={_COOKIE_PATH}; Max-Age={_COOKIE_MAX_AGE}"
+        )
+        self.send_response(302)
+        self.send_header("Set-Cookie", cookie)
+        self.send_header("Location", "/echo/workspace.html")
+        self.send_header("Content-Length", "0")
+        self.send_header("Cache-Control", "no-store")
+        self.end_headers()
+
+    # ── POST /api/auth/logout ──────────────────────────────────────────
+    def handle_logout(self):
+        """Clear the dashboard cookie. Returns 200 JSON `{"ok": true}`."""
+        cookie = (
+            f"{_COOKIE_NAME}=; HttpOnly; SameSite=Strict; "
+            f"Path={_COOKIE_PATH}; Max-Age=0"
+        )
+        body = json.dumps({"ok": True}).encode("utf-8")
+        self.send_response(200)
+        self.send_header("Set-Cookie", cookie)
+        self.send_header("Content-Type", "application/json")
+        self.send_header("Content-Length", str(len(body)))
+        self.send_header("Cache-Control", "no-store")
+        self.end_headers()
+        try:
+            self.wfile.write(body)
+        except (BrokenPipeError, ConnectionResetError):
+            pass
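On the client side, a page consuming these endpoints has to react to the middleware's responses: a 401 means the cookie is missing or stale, so bounce to the login route; a 403 means the Origin allowlist rejected the request. A minimal sketch, with the helper names as assumptions consistent with the routes in this diff:

```javascript
// Where to send the browser for a given auth-middleware status, if anywhere.
function authRedirectFor(status) {
  if (status === 401) return '/echo/login';
  return null;  // 403 (CSRF) and other statuses are surfaced to the user
}

// Thin POST wrapper: sends the HttpOnly dashboard cookie along, and
// redirects to the login page when the middleware answers 401.
async function postJSON(url, payload) {
  const res = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
    credentials: 'same-origin',
  });
  const dest = authRedirectFor(res.status);
  if (dest) { location.href = dest; return null; }
  return res.json();
}
```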
1014
dashboard/handlers/projects.py
Normal file
File diff suppressed because it is too large
@@ -24,7 +24,6 @@ Reuse path constants din `dashboard/constants.py` (WORKSPACE_DIR).
|
|||||||
"""
|
"""
|
||||||
import json
|
import json
|
||||||
import os
|
import os
|
||||||
import re
|
|
||||||
import signal
|
import signal
|
||||||
import subprocess
|
import subprocess
|
||||||
import sys
|
import sys
|
||||||
@@ -34,6 +33,8 @@ from pathlib import Path
|
|||||||
|
|
||||||
import constants
|
import constants
|
||||||
|
|
||||||
|
from handlers._validators import _SLUG_RE, validate_slug
|
||||||
|
|
||||||
# Best-effort import of pure functions for /api/ralph/usage (instrumentation MVP).
|
# Best-effort import of pure functions for /api/ralph/usage (instrumentation MVP).
|
||||||
# Helper lives at <repo>/tools/ralph_usage.py — sibling of `dashboard/`.
|
# Helper lives at <repo>/tools/ralph_usage.py — sibling of `dashboard/`.
|
||||||
_TOOLS_DIR = Path(__file__).resolve().parents[2] / "tools"
|
_TOOLS_DIR = Path(__file__).resolve().parents[2] / "tools"
|
||||||
@@ -45,10 +46,6 @@ except ImportError: # pragma: no cover — diagnostic only
|
|||||||
ralph_usage = None # type: ignore
|
ralph_usage = None # type: ignore
|
||||||
|
|
||||||
|
|
||||||
# Slug strict: alphanum + dash + underscore, max 64 chars. Reject path traversal explicit.
|
|
||||||
_SLUG_RE = re.compile(r"^[A-Za-z0-9_-]{1,64}$")
|
|
||||||
|
|
||||||
|
|
||||||
# Path Ralph per proiect (mereu în scripts/ralph/)
|
# Path Ralph per proiect (mereu în scripts/ralph/)
|
||||||
def _ralph_dir(project_dir: Path) -> Path:
|
def _ralph_dir(project_dir: Path) -> Path:
|
||||||
return project_dir / "scripts" / "ralph"
|
return project_dir / "scripts" / "ralph"
|
||||||
@@ -65,17 +62,11 @@ class RalphHandlers:
|
|||||||
def _ralph_validate_slug(self, slug: str):
|
def _ralph_validate_slug(self, slug: str):
|
||||||
"""Validează slug-ul + returnează project_dir sau None.
|
"""Validează slug-ul + returnează project_dir sau None.
|
||||||
|
|
||||||
Strict: alphanum + dash + underscore, ≤64 chars. Path traversal sequences
|
Delegates the slug-shape check to the shared `validate_slug` helper
|
||||||
(`..`, `/`, `\\`) sau caractere ne-alfanumerice sunt respinse înainte de
|
in `dashboard/handlers/_validators.py`; only filesystem checks remain
|
||||||
orice atingere a filesystem-ului.
|
here (existence + path-confinement under WORKSPACE_DIR).
|
||||||
"""
|
"""
|
||||||
if not slug:
|
if validate_slug(slug) is not None:
|
||||||
return None
|
|
||||||
# Defense-in-depth: explicit path-traversal/separator reject (regex îl
|
|
||||||
# acoperă, dar îl ţinem explicit ca safety net dacă regex-ul se relaxează).
|
|
||||||
if ".." in slug or "/" in slug or "\\" in slug:
|
|
||||||
return None
|
|
||||||
if not _SLUG_RE.match(slug):
|
|
||||||
return None
|
return None
|
||||||
project_dir = constants.WORKSPACE_DIR / slug
|
project_dir = constants.WORKSPACE_DIR / slug
|
||||||
try:
|
try:
|
||||||
@@ -11,13 +11,15 @@ from urllib.parse import parse_qs, urlparse
 
 import constants
 
+from handlers._validators import validate_slug
+
 
 class WorkspaceHandlers:
     """Mixin for /api/workspace and /api/workspace/*."""
 
     def _validate_project(self, name):
         """Validate project name and return its path, or None."""
-        if not name or '/' in name or '..' in name:
+        if validate_slug(name) is not None:
             return None
         project_dir = constants.WORKSPACE_DIR / name
         if not project_dir.exists() or not project_dir.is_dir():
@@ -5,6 +5,7 @@
     <meta name="viewport" content="width=device-width, initial-scale=1.0">
     <link rel="icon" type="image/svg+xml" href="/echo/favicon.svg">
     <title>Echo · Dashboard</title>
+    <link rel="stylesheet" href="/echo/static/tokens.css">
     <link rel="stylesheet" href="/echo/common.css">
     <script src="https://unpkg.com/lucide@latest/dist/umd/lucide.min.js"></script>
     <script src="/echo/swipe-nav.js"></script>
dashboard/login.html (new file, 281 lines)
@@ -0,0 +1,281 @@
+<!DOCTYPE html>
+<html lang="ro">
+<head>
+  <meta charset="UTF-8">
+  <meta name="viewport" content="width=device-width, initial-scale=1.0">
+  <link rel="icon" type="image/svg+xml" href="/echo/favicon.svg">
+  <title>Echo — Autentificare</title>
+  <link rel="stylesheet" href="/echo/static/tokens.css">
+  <style>
+    *, *::before, *::after { box-sizing: border-box; }
+
+    html, body {
+      margin: 0;
+      padding: 0;
+      height: 100%;
+    }
+
+    body {
+      background: var(--bg-base, #13131a);
+      color: var(--text-primary);
+      font-family: var(--font-sans);
+      font-size: var(--text-base);
+      line-height: 1.5;
+      display: flex;
+      align-items: center;
+      justify-content: center;
+      min-height: 100vh;
+      padding: var(--space-6) var(--space-4);
+      -webkit-font-smoothing: antialiased;
+      -moz-osx-font-smoothing: grayscale;
+    }
+
+    .login-shell {
+      display: flex;
+      flex-direction: column;
+      align-items: center;
+      gap: var(--space-5);
+      width: min(380px, 100% - 48px);
+    }
+
+    .monogram {
+      font-family: var(--font-sans);
+      font-weight: 700;
+      font-size: 56px;
+      line-height: 1;
+      letter-spacing: -0.02em;
+      color: var(--accent);
+      user-select: none;
+    }
+
+    .login-card {
+      width: 100%;
+      background: var(--bg-surface);
+      border: 1px solid var(--border);
+      border-radius: var(--radius-md);
+      padding: var(--space-6);
+      box-shadow: var(--shadow-md);
+    }
+
+    .login-title {
+      margin: 0 0 var(--space-1) 0;
+      font-size: var(--text-lg);
+      font-weight: 600;
+      color: var(--text-primary);
+      text-align: center;
+    }
+
+    .login-subtitle {
+      margin: 0 0 var(--space-5) 0;
+      font-size: var(--text-sm);
+      color: var(--text-muted);
+      text-align: center;
+    }
+
+    .form-field {
+      display: flex;
+      flex-direction: column;
+      gap: var(--space-2);
+    }
+
+    .form-label {
+      font-size: var(--text-sm);
+      font-weight: 500;
+      color: var(--text-secondary);
+    }
+
+    .form-input {
+      width: 100%;
+      padding: var(--space-3) var(--space-4);
+      background: var(--bg-elevated);
+      border: 1px solid var(--border);
+      border-radius: var(--radius-sm);
+      color: var(--text-primary);
+      font-family: var(--font-sans);
+      font-size: var(--text-base);
+      transition: border-color var(--transition-fast), box-shadow var(--transition-fast);
+      -webkit-appearance: none;
+      appearance: none;
+    }
+
+    .form-input::placeholder {
+      color: var(--text-muted);
+      opacity: 0.6;
+    }
+
+    .form-input:focus {
+      outline: none;
+      border-color: var(--border-focus);
+      box-shadow: 0 0 0 3px var(--accent-subtle);
+    }
+
+    .form-input.is-invalid {
+      border-color: var(--error);
+      box-shadow: 0 0 0 3px rgba(239, 68, 68, 0.18);
+    }
+
+    .form-error {
+      min-height: 1.25em;
+      margin-top: var(--space-2);
+      font-size: var(--text-sm);
+      color: var(--error);
+      visibility: hidden;
+    }
+
+    .form-error.is-visible {
+      visibility: visible;
+    }
+
+    .submit-btn {
+      width: 100%;
+      margin-top: var(--space-5);
+      padding: var(--space-3) var(--space-4);
+      background: var(--accent);
+      color: #ffffff;
+      border: 1px solid var(--accent);
+      border-radius: var(--radius-sm);
+      font-family: var(--font-sans);
+      font-size: var(--text-base);
+      font-weight: 600;
+      cursor: pointer;
+      transition: background-color var(--transition-fast), border-color var(--transition-fast), opacity var(--transition-fast);
+    }
+
+    .submit-btn:hover:not(:disabled) {
+      background: var(--accent-hover);
+      border-color: var(--accent-hover);
+    }
+
+    .submit-btn:focus-visible {
+      outline: none;
+      box-shadow: 0 0 0 3px var(--accent-subtle);
+    }
+
+    .submit-btn:disabled {
+      opacity: 0.65;
+      cursor: not-allowed;
+    }
+
+    @media (max-width: 480px) {
+      .login-card { padding: var(--space-5); }
+      .monogram { font-size: 48px; }
+    }
+  </style>
+</head>
+<body>
+  <main class="login-shell">
+    <div class="monogram" aria-hidden="true">E</div>
+    <section class="login-card">
+      <h1 class="login-title">Echo Dashboard</h1>
+      <p class="login-subtitle">Autentificare</p>
+      <form id="login-form" method="post" action="/api/auth/login" novalidate>
+        <div class="form-field">
+          <label class="form-label" for="token-input">Token de acces</label>
+          <input
+            id="token-input"
+            name="token"
+            type="password"
+            autocomplete="current-password"
+            autocapitalize="off"
+            autocorrect="off"
+            spellcheck="false"
+            aria-label="Token de acces"
+            aria-describedby="form-error"
+            required>
+          <div id="form-error" class="form-error" role="alert" aria-live="polite"></div>
+        </div>
+        <button id="submit-btn" type="submit" class="submit-btn">Intră</button>
+      </form>
+    </section>
+  </main>
+
+  <script>
+    (function () {
+      'use strict';
+
+      var form = document.getElementById('login-form');
+      var input = document.getElementById('token-input');
+      var btn = document.getElementById('submit-btn');
+      var errorEl = document.getElementById('form-error');
+
+      var DEFAULT_LABEL = 'Intră';
+      var SUBMITTING_LABEL = 'Se autentifică...';
+      var RETRY_LABEL = 'Reîncearcă';
+
+      // Auto-focus input on load (skip on touch devices to avoid keyboard pop)
+      window.addEventListener('DOMContentLoaded', function () {
+        if (!('ontouchstart' in window)) {
+          try { input.focus(); } catch (e) { /* ignore */ }
+        }
+      });
+
+      // Clear error styling as soon as the user edits the field
+      input.addEventListener('input', function () {
+        if (input.classList.contains('is-invalid')) {
+          input.classList.remove('is-invalid');
+          errorEl.textContent = '';
+          errorEl.classList.remove('is-visible');
+        }
+      });
+
+      form.addEventListener('submit', function (ev) {
+        ev.preventDefault();
+
+        var token = input.value.trim();
+        if (!token) {
+          input.classList.add('is-invalid');
+          errorEl.textContent = 'Token invalid';
+          errorEl.classList.add('is-visible');
+          input.focus();
+          return;
+        }
+
+        // Submitting state
+        btn.disabled = true;
+        btn.textContent = SUBMITTING_LABEL;
+        input.classList.remove('is-invalid');
+        errorEl.textContent = '';
+        errorEl.classList.remove('is-visible');
+
+        var body = 'token=' + encodeURIComponent(token);
+
+        fetch('/api/auth/login', {
+          method: 'POST',
+          headers: {
+            'Content-Type': 'application/x-www-form-urlencoded',
+            'Accept': 'application/json, text/html'
+          },
+          body: body,
+          credentials: 'same-origin',
+          redirect: 'follow'
+        }).then(function (res) {
+          // Browsers auto-follow 302, so a successful login surfaces
+          // here as a 2xx (workspace.html) or an opaqueredirect.
+          if (res.ok || res.type === 'opaqueredirect' || res.redirected) {
+            var dest = res.url && res.redirected ? res.url : '/echo/workspace.html';
+            window.location.assign(dest);
+            return;
+          }
+          if (res.status === 401) {
+            showInvalid();
+            return;
+          }
+          // Any other status — treat as a generic failure
+          showInvalid();
+        }).catch(function () {
+          showInvalid();
+        });
+      });
+
+      function showInvalid() {
+        input.classList.add('is-invalid');
+        errorEl.textContent = 'Token invalid';
+        errorEl.classList.add('is-visible');
+        btn.disabled = false;
+        btn.textContent = RETRY_LABEL;
+        try { input.focus(); input.select(); } catch (e) { /* ignore */ }
+      }
+    })();
+  </script>
+</body>
+</html>
dashboard/ralph.html (deleted file, 743 lines)
@@ -1,743 +0,0 @@
-<!DOCTYPE html>
-<html lang="ro">
-<head>
-  <meta charset="UTF-8">
-  <meta name="viewport" content="width=device-width, initial-scale=1.0">
-  <link rel="icon" type="image/svg+xml" href="/echo/favicon.svg">
-  <title>Echo · Ralph</title>
-  <link rel="stylesheet" href="/echo/common.css">
-  <script src="https://unpkg.com/lucide@latest/dist/umd/lucide.min.js"></script>
-  <script src="/echo/swipe-nav.js"></script>
-  <style>
-    /* ==========================================
-       Ralph status extension tokens
-       (existing common.css NU declară --status-*)
-       ========================================== */
-    :root {
-      --status-running: rgb(34, 197, 94);    /* green */
-      --status-blocked: rgb(245, 158, 11);   /* amber */
-      --status-failed: rgb(239, 68, 68);     /* red */
-      --status-complete: rgb(156, 163, 175); /* slate (done = neutral) */
-      --status-idle: var(--text-muted);
-    }
-
-    /* ==========================================
-       Layout
-       ========================================== */
-    .main {
-      max-width: 1400px;
-      margin: 0 auto;
-      padding: var(--space-5);
-    }
-
-    .page-header {
-      display: flex;
-      align-items: center;
-      justify-content: space-between;
-      gap: var(--space-3);
-      margin-bottom: var(--space-5);
-    }
-
-    .page-title {
-      font-size: var(--text-xl);
-      font-weight: 600;
-      color: var(--text-primary);
-      display: flex;
-      align-items: center;
-      gap: var(--space-2);
-    }
-
-    .page-subtitle {
-      font-size: var(--text-sm);
-      color: var(--text-muted);
-    }
-
-    /* Live indicator pulse */
-    .live-indicator {
-      display: inline-flex;
-      align-items: center;
-      gap: var(--space-2);
-      font-size: var(--text-sm);
-      color: var(--text-muted);
-      padding: var(--space-1) var(--space-3);
-      background: var(--bg-surface);
-      border-radius: var(--radius-full);
-    }
-
-    .live-dot {
-      width: 8px;
-      height: 8px;
-      border-radius: 50%;
-      background: var(--status-running);
-      animation: pulse 2s ease-in-out infinite;
-    }
-
-    /* Indicator state: live (SSE) vs polling (fallback) vs offline */
-    .live-indicator[data-mode="polling"] .live-dot {
-      background: var(--status-blocked);
-      animation: none;
-    }
-    .live-indicator[data-mode="offline"] .live-dot {
-      background: var(--status-failed);
-      animation: none;
-    }
-    .live-indicator[data-mode="connecting"] .live-dot {
-      background: var(--text-muted);
-    }
-
-    @keyframes pulse {
-      0%, 100% { opacity: 1; transform: scale(1); }
-      50% { opacity: 0.5; transform: scale(1.2); }
-    }
-
-    .last-fetch {
-      font-size: var(--text-xs);
-      color: var(--text-muted);
-    }
-
-    /* ==========================================
-       Cards grid
-       ========================================== */
-    .ralph-grid {
-      display: grid;
-      grid-template-columns: repeat(3, minmax(0, 1fr));
-      gap: var(--space-4);
-    }
-
-    @media (max-width: 1024px) {
-      .ralph-grid { grid-template-columns: repeat(2, minmax(0, 1fr)); }
-    }
-
-    @media (max-width: 640px) {
-      .ralph-grid { grid-template-columns: 1fr; }
-    }
-
-    .ralph-card {
-      background: var(--bg-surface);
-      border: 1px solid var(--border);
-      border-radius: var(--radius-lg);
-      padding: var(--space-4);
-      min-height: 180px;
-      display: flex;
-      flex-direction: column;
-      gap: var(--space-3);
-      transition: border-color var(--transition-fast);
-    }
-
-    .ralph-card:hover {
-      border-color: var(--border-focus);
-    }
-
-    .ralph-card-head {
-      display: flex;
-      align-items: center;
-      justify-content: space-between;
-      gap: var(--space-2);
-    }
-
-    .ralph-slug {
-      font-size: var(--text-base);
-      font-weight: 600;
-      color: var(--text-primary);
-      font-family: var(--font-mono);
-      white-space: nowrap;
-      overflow: hidden;
-      text-overflow: ellipsis;
-    }
-
-    .ralph-status {
-      display: inline-flex;
-      align-items: center;
-      gap: var(--space-1);
-      padding: 2px 10px;
-      font-size: var(--text-xs);
-      font-weight: 600;
-      border-radius: var(--radius-full);
-      text-transform: uppercase;
-      letter-spacing: 0.04em;
-    }
-
-    .ralph-status[data-status="running"]  { background: rgba(34, 197, 94, 0.18);   color: var(--status-running); }
-    .ralph-status[data-status="blocked"]  { background: rgba(245, 158, 11, 0.18);  color: var(--status-blocked); }
-    .ralph-status[data-status="failed"]   { background: rgba(239, 68, 68, 0.18);   color: var(--status-failed); }
-    .ralph-status[data-status="complete"] { background: rgba(156, 163, 175, 0.18); color: var(--status-complete); }
-    .ralph-status[data-status="idle"]     { background: var(--bg-surface-active);  color: var(--status-idle); }
-    .ralph-status[data-status="error"]    { background: rgba(239, 68, 68, 0.18);   color: var(--status-failed); }
-
-    .ralph-status-dot {
-      width: 6px;
-      height: 6px;
-      border-radius: 50%;
-      background: currentColor;
-    }
-
-    .ralph-card-body {
-      flex: 1;
-      display: flex;
-      flex-direction: column;
-      gap: var(--space-2);
-    }
-
-    .ralph-current {
-      font-size: var(--text-sm);
-      color: var(--text-secondary);
-      line-height: 1.4;
-    }
-
-    .ralph-current-id {
-      font-family: var(--font-mono);
-      color: var(--text-primary);
-      font-weight: 600;
-    }
-
-    .ralph-tags {
-      display: flex;
-      flex-wrap: wrap;
-      gap: var(--space-1);
-    }
-
-    .ralph-tag {
-      font-size: var(--text-xs);
-      padding: 1px 8px;
-      background: var(--accent-subtle);
-      color: var(--accent);
-      border-radius: var(--radius-sm);
-      font-family: var(--font-mono);
-    }
-
-    /* Progress bar */
-    .ralph-progress {
-      display: flex;
-      flex-direction: column;
-      gap: var(--space-1);
-    }
-
-    .ralph-progress-meta {
-      display: flex;
-      justify-content: space-between;
-      font-size: var(--text-xs);
-      color: var(--text-muted);
-    }
-
-    .ralph-progress-bar {
-      height: 6px;
-      background: var(--bg-surface-active);
-      border-radius: var(--radius-full);
-      overflow: hidden;
-      position: relative;
-    }
-
-    .ralph-progress-fill {
-      height: 100%;
-      background: var(--status-complete);
-      transition: width var(--transition-base);
-    }
-
-    .ralph-card[data-status="running"] .ralph-progress-fill { background: var(--status-running); }
-    .ralph-card[data-status="failed"]  .ralph-progress-fill { background: var(--status-failed); }
-    .ralph-card[data-status="blocked"] .ralph-progress-fill { background: var(--status-blocked); }
-
-    .ralph-card-foot {
-      display: flex;
-      align-items: center;
-      justify-content: space-between;
-      gap: var(--space-2);
-      font-size: var(--text-xs);
-      color: var(--text-muted);
-    }
-
-    .ralph-actions {
-      display: flex;
-      gap: var(--space-1);
-    }
-
-    .ralph-icon-btn {
-      background: transparent;
-      border: 1px solid var(--border);
-      color: var(--text-muted);
-      border-radius: var(--radius-sm);
-      cursor: pointer;
-      padding: 6px;
-      display: inline-flex;
-      align-items: center;
-      justify-content: center;
-      min-width: 32px;
-      min-height: 32px;
-      transition: all var(--transition-fast);
-    }
-
-    .ralph-icon-btn:hover {
-      color: var(--text-primary);
-      background: var(--bg-surface-hover);
-    }
-
-    .ralph-icon-btn.danger {
-      color: var(--status-failed);
-      border-color: rgba(239, 68, 68, 0.4);
-    }
-
-    .ralph-icon-btn.danger:hover {
-      background: rgba(239, 68, 68, 0.12);
-    }
-
-    .ralph-icon-btn svg {
-      width: 14px;
-      height: 14px;
-    }
-
-    @media (max-width: 640px) {
-      .ralph-icon-btn {
-        min-width: 44px;
-        min-height: 44px;
-      }
-      .ralph-icon-btn svg {
-        width: 18px;
-        height: 18px;
-      }
-    }
-
-    /* Empty / loading / error states */
-    .ralph-empty,
-    .ralph-loading,
-    .ralph-error {
-      text-align: center;
-      padding: var(--space-10) var(--space-5);
-      color: var(--text-muted);
-    }
-
-    .ralph-empty svg,
-    .ralph-loading svg,
-    .ralph-error svg {
-      width: 32px;
-      height: 32px;
-      margin-bottom: var(--space-3);
-      opacity: 0.6;
-    }
-
-    .ralph-empty-title {
-      font-size: var(--text-base);
-      color: var(--text-secondary);
-      margin-bottom: var(--space-1);
-    }
-
-    /* ==========================================
-       Drawer (log + PRD viewer)
-       ========================================== */
-    .ralph-drawer {
-      position: fixed;
-      inset: 0;
-      background: rgba(0, 0, 0, 0.55);
-      display: none;
-      align-items: center;
-      justify-content: center;
-      z-index: 200;
-      padding: var(--space-4);
-    }
-
-    .ralph-drawer[data-open="true"] {
-      display: flex;
-    }
-
-    .ralph-drawer-content {
-      background: var(--bg-base);
-      border: 1px solid var(--border);
-      border-radius: var(--radius-lg);
-      max-width: 900px;
-      width: 100%;
-      max-height: 85vh;
-      display: flex;
-      flex-direction: column;
-      overflow: hidden;
-    }
-
-    .ralph-drawer-head {
-      padding: var(--space-3) var(--space-4);
-      border-bottom: 1px solid var(--border);
-      display: flex;
-      align-items: center;
-      justify-content: space-between;
-      gap: var(--space-3);
-    }
-
-    .ralph-drawer-title {
-      font-size: var(--text-base);
-      font-weight: 600;
-      color: var(--text-primary);
-      font-family: var(--font-mono);
-    }
-
-    .ralph-drawer-body {
-      flex: 1;
-      overflow: auto;
-      padding: var(--space-4);
-    }
-
-    .ralph-drawer-pre {
-      font-family: var(--font-mono);
-      font-size: var(--text-xs);
-      color: var(--text-secondary);
-      white-space: pre-wrap;
-      word-break: break-word;
-    }
-  </style>
-</head>
-<body>
-  <!--NAV-->
-
-  <main class="main">
-    <header class="page-header">
-      <div>
-        <div class="page-title">
-          <i data-lucide="bot" aria-hidden="true"></i>
-          Echo · Ralph
-        </div>
-        <div class="page-subtitle">Live status pe proiectele autonome</div>
-      </div>
-      <div class="live-indicator" aria-live="polite" id="liveIndicator" data-mode="connecting">
-        <span class="live-dot" aria-hidden="true"></span>
-        <span id="liveLabel">Conectare…</span>
-        <span class="last-fetch" id="lastFetch"></span>
-      </div>
-    </header>
-
-    <section id="ralphContent" aria-live="polite">
-      <div class="ralph-loading">
-        <i data-lucide="loader" aria-hidden="true"></i>
-        <div>Se încarcă proiectele Ralph...</div>
-      </div>
-    </section>
-  </main>
-
-  <!-- Drawer pentru log / PRD viewer -->
-  <div class="ralph-drawer" id="ralphDrawer" data-open="false" role="dialog" aria-modal="true" aria-labelledby="drawerTitle">
-    <div class="ralph-drawer-content">
-      <div class="ralph-drawer-head">
-        <div class="ralph-drawer-title" id="drawerTitle">—</div>
-        <button type="button" class="ralph-icon-btn" id="drawerClose" aria-label="Închide drawer">
-          <i data-lucide="x" aria-hidden="true"></i>
-        </button>
-      </div>
-      <div class="ralph-drawer-body">
-        <pre class="ralph-drawer-pre" id="drawerBody"></pre>
-      </div>
-    </div>
-  </div>
-
-  <script>
-    (function () {
-      const POLL_MS = 5000;
-      const contentEl = document.getElementById('ralphContent');
-      const lastFetchEl = document.getElementById('lastFetch');
-      const liveLabel = document.getElementById('liveLabel');
-      const liveIndicator = document.getElementById('liveIndicator');
-      const drawer = document.getElementById('ralphDrawer');
-      const drawerTitle = document.getElementById('drawerTitle');
-      const drawerBody = document.getElementById('drawerBody');
-      const drawerClose = document.getElementById('drawerClose');
-
-      // Connection mode: 'connecting' → 'live' (SSE) | 'polling' (fallback) | 'offline'
-      function setMode(mode) {
-        liveIndicator.dataset.mode = mode;
-        const labels = {
-          connecting: 'Conectare…',
-          live: '🟢 Live',
-          polling: '⏱ Polling',
-          offline: 'Offline',
-        };
-        liveLabel.textContent = labels[mode] || mode;
-      }
-
-      function fmtAgo(iso) {
-        if (!iso) return '—';
-        const t = new Date(iso).getTime();
-        if (isNaN(t)) return '—';
-        const diff = Math.max(0, Date.now() - t);
-        const sec = Math.floor(diff / 1000);
-        if (sec < 60) return `acum ${sec}s`;
-        const min = Math.floor(sec / 60);
-        if (min < 60) return `acum ${min}m`;
-        const hr = Math.floor(min / 60);
-        if (hr < 24) return `acum ${hr}h`;
-        const day = Math.floor(hr / 24);
-        return `acum ${day}z`;
-      }
-
-      function escapeHtml(s) {
-        return String(s == null ? '' : s)
-          .replace(/&/g, '&amp;').replace(/</g, '&lt;')
-          .replace(/>/g, '&gt;').replace(/"/g, '&quot;');
-      }
-
-      function renderCard(p) {
-        const total = p.storiesTotal || 0;
-        const done = p.storiesComplete || 0;
-        const failed = p.storiesFailed || 0;
-        const blocked = p.storiesBlocked || 0;
-        const pct = total > 0 ? Math.round(((done + failed + blocked) / total) * 100) : 0;
-
-        const current = p.currentStory
-          ? `<div class="ralph-current"><span class="ralph-current-id">${escapeHtml(p.currentStory.id)}</span> · ${escapeHtml(p.currentStory.title || '')} ` +
-            (p.currentStory.retries ? `<span title="retries">(${p.currentStory.retries}/3)</span>` : '') + `</div>` +
-            (p.currentStory.tags && p.currentStory.tags.length
-              ? `<div class="ralph-tags">${p.currentStory.tags.map(t => `<span class="ralph-tag">${escapeHtml(t)}</span>`).join('')}</div>`
-              : '')
-          : (p.status === 'complete'
-              ? `<div class="ralph-current">Toate stories complete (${done}/${total}).</div>`
-              : `<div class="ralph-current" style="color:var(--text-muted)">Nu rulează acum.</div>`);
-
-        const eta = (p.etaMinutes != null && p.status === 'running')
-          ? `~${p.etaMinutes}min`
-          : '';
-
-        const stopBtn = p.running
-          ? `<button type="button" class="ralph-icon-btn danger" data-action="stop" data-slug="${escapeHtml(p.slug)}" aria-label="Oprește Ralph">
-               <i data-lucide="square" aria-hidden="true"></i>
-             </button>`
-          : '';
-
-        // Rollback: vizibil pe card-uri running (corectează ultima iteraţie
-        // dacă Ralph a marcat passes prematur). Confirm dialog la click.
-        const rollbackBtn = p.running
-          ? `<button type="button" class="ralph-icon-btn" data-action="rollback" data-slug="${escapeHtml(p.slug)}" aria-label="Rollback ultima iteraţie" title="Rollback ultima iteraţie (git revert HEAD)">
-               <i data-lucide="undo-2" aria-hidden="true"></i>
-             </button>`
-          : '';
-
-        return `
-          <article class="ralph-card" data-status="${escapeHtml(p.status)}">
-            <header class="ralph-card-head">
-              <div class="ralph-slug" title="${escapeHtml(p.slug)}">${escapeHtml(p.slug)}</div>
-              <span class="ralph-status" data-status="${escapeHtml(p.status)}" aria-label="Status: ${escapeHtml(p.status)}">
-                <span class="ralph-status-dot" aria-hidden="true"></span>${escapeHtml(p.status)}
-              </span>
-            </header>
-            <div class="ralph-card-body">
-              ${current}
-              <div class="ralph-progress">
-                <div class="ralph-progress-meta">
-                  <span>${done}/${total} done${failed ? ` · ${failed} failed` : ''}${blocked ? ` · ${blocked} blocked` : ''}</span>
-                  <span>${eta}</span>
-                </div>
-                <div class="ralph-progress-bar" role="progressbar" aria-valuenow="${pct}" aria-valuemin="0" aria-valuemax="100">
-                  <div class="ralph-progress-fill" style="width:${pct}%"></div>
-                </div>
-              </div>
-            </div>
-            <footer class="ralph-card-foot">
-              <span title="Ultima iterație">${fmtAgo(p.lastIterAt)}</span>
-              <div class="ralph-actions">
-                <button type="button" class="ralph-icon-btn" data-action="log" data-slug="${escapeHtml(p.slug)}" aria-label="Vezi log">
-                  <i data-lucide="terminal" aria-hidden="true"></i>
-                </button>
-                <button type="button" class="ralph-icon-btn" data-action="prd" data-slug="${escapeHtml(p.slug)}" aria-label="Vezi PRD">
-                  <i data-lucide="file-text" aria-hidden="true"></i>
-                </button>
-                ${rollbackBtn}
-                ${stopBtn}
-              </div>
-            </footer>
-          </article>`;
-      }
-
-      function renderEmpty() {
-        return `
-          <div class="ralph-empty">
-            <i data-lucide="inbox" aria-hidden="true"></i>
-            <div class="ralph-empty-title">Niciun proiect aprobat.</div>
-            <div>Aprobă ceva pe Discord/Telegram cu <code>/a &lt;slug&gt;</code>.</div>
-          </div>`;
-      }
-
-      function renderError(msg) {
-        return `
-          <div class="ralph-error">
-            <i data-lucide="alert-triangle" aria-hidden="true"></i>
-            <div>Cannot reach Echo Core: ${escapeHtml(msg)}</div>
-          </div>`;
-      }
-
-      function renderSnapshot(data) {
-        const projects = data.projects || [];
-        if (projects.length === 0) {
-          contentEl.innerHTML = renderEmpty();
-        } else {
-          contentEl.innerHTML = `<div class="ralph-grid">${projects.map(renderCard).join('')}</div>`;
-        }
-        lastFetchEl.textContent = '· ' + fmtAgo(data.fetchedAt);
-        if (window.lucide) lucide.createIcons();
-      }
-
-      async function fetchStatus() {
-        try {
-          const res = await fetch('/echo/api/ralph/status', { cache: 'no-store' });
-          if (!res.ok) throw new Error('HTTP ' + res.status);
-          const data = await res.json();
-          renderSnapshot(data);
-        } catch (err) {
-          contentEl.innerHTML = renderError(err.message || String(err));
-          setMode('offline');
-          if (window.lucide) lucide.createIcons();
-        }
-      }
-
-      async function openLog(slug) {
-        drawerTitle.textContent = `${slug} · progress.txt`;
-        drawerBody.textContent = 'Se încarcă...';
-        drawer.dataset.open = 'true';
-        try {
-          const res = await fetch(`/echo/api/ralph/${encodeURIComponent(slug)}/log?lines=200`);
-          const data = await res.json();
-          drawerBody.textContent = (data.lines || []).join('\n');
-        } catch (err) {
-          drawerBody.textContent = `Error: ${err.message || err}`;
-        }
-      }
-
-      async function openPrd(slug) {
-        drawerTitle.textContent = `${slug} · prd.json`;
-        drawerBody.textContent = 'Se încarcă...';
-        drawer.dataset.open = 'true';
-        try {
-          const res = await fetch(`/echo/api/ralph/${encodeURIComponent(slug)}/prd`);
-          const data = await res.json();
-          drawerBody.textContent = JSON.stringify(data, null, 2);
-        } catch (err) {
-          drawerBody.textContent = `Error: ${err.message || err}`;
-        }
-      }
-
-      async function stopRalph(slug) {
-        if (!confirm(`Oprești Ralph pe ${slug}?`)) return;
-        try {
-          const res = await fetch(`/echo/api/ralph/${encodeURIComponent(slug)}/stop`, { method: 'POST' });
-          const data = await res.json();
|
|
||||||
if (!data.success) {
|
|
||||||
alert('Eșec: ' + (data.error || 'unknown'));
|
|
||||||
} else {
|
|
||||||
fetchStatus();
|
|
||||||
}
|
|
||||||
} catch (err) {
|
|
||||||
alert('Eroare: ' + (err.message || err));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
async function rollbackRalph(slug) {
|
|
||||||
if (!confirm(`Asta va da git revert HEAD pe ${slug} și va decrementa ultima story trecută. Continui?`)) return;
|
|
||||||
try {
|
|
||||||
const res = await fetch(`/echo/api/ralph/${encodeURIComponent(slug)}/rollback`, { method: 'POST' });
|
|
||||||
const data = await res.json();
|
|
||||||
if (!data.success) {
|
|
||||||
alert('Rollback eşuat: ' + (data.message || 'unknown'));
|
|
||||||
} else {
|
|
||||||
alert('✓ ' + (data.message || 'Rollback OK'));
|
|
||||||
fetchStatus();
|
|
||||||
}
|
|
||||||
} catch (err) {
|
|
||||||
alert('Eroare rollback: ' + (err.message || err));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
contentEl.addEventListener('click', (e) => {
|
|
||||||
const btn = e.target.closest('[data-action]');
|
|
||||||
if (!btn) return;
|
|
||||||
const slug = btn.dataset.slug;
|
|
||||||
const action = btn.dataset.action;
|
|
||||||
if (action === 'log') openLog(slug);
|
|
||||||
else if (action === 'prd') openPrd(slug);
|
|
||||||
else if (action === 'stop') stopRalph(slug);
|
|
||||||
else if (action === 'rollback') rollbackRalph(slug);
|
|
||||||
});
|
|
||||||
|
|
||||||
drawerClose.addEventListener('click', () => {
|
|
||||||
drawer.dataset.open = 'false';
|
|
||||||
});
|
|
||||||
|
|
||||||
drawer.addEventListener('click', (e) => {
|
|
||||||
if (e.target === drawer) drawer.dataset.open = 'false';
|
|
||||||
});
|
|
||||||
|
|
||||||
document.addEventListener('keydown', (e) => {
|
|
||||||
if (e.key === 'Escape') drawer.dataset.open = 'false';
|
|
||||||
});
|
|
||||||
|
|
||||||
// ────────────────────────────────────────────────────────
|
|
||||||
// Connection: try SSE first; fallback to polling on error.
|
|
||||||
// ────────────────────────────────────────────────────────
|
|
||||||
let eventSource = null;
|
|
||||||
let pollHandle = null;
|
|
||||||
|
|
||||||
function startPolling() {
|
|
||||||
if (pollHandle) return;
|
|
||||||
setMode('polling');
|
|
||||||
fetchStatus();
|
|
||||||
pollHandle = setInterval(fetchStatus, POLL_MS);
|
|
||||||
}
|
|
||||||
|
|
||||||
function stopPolling() {
|
|
||||||
if (pollHandle) {
|
|
||||||
clearInterval(pollHandle);
|
|
||||||
pollHandle = null;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
function startSSE() {
|
|
||||||
if (typeof EventSource === 'undefined') {
|
|
||||||
startPolling();
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
try {
|
|
||||||
eventSource = new EventSource('/echo/api/ralph/stream');
|
|
||||||
} catch (err) {
|
|
||||||
startPolling();
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Server-confirmed open — switch to live mode
|
|
||||||
eventSource.addEventListener('open', () => {
|
|
||||||
stopPolling();
|
|
||||||
setMode('live');
|
|
||||||
});
|
|
||||||
|
|
||||||
eventSource.addEventListener('status', (ev) => {
|
|
||||||
stopPolling();
|
|
||||||
setMode('live');
|
|
||||||
try {
|
|
||||||
const data = JSON.parse(ev.data);
|
|
||||||
renderSnapshot(data);
|
|
||||||
} catch (err) {
|
|
||||||
// malformed payload — ignore, next event will reconcile
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
eventSource.addEventListener('heartbeat', () => {
|
|
||||||
// Keep-alive; nothing to render but it confirms the link.
|
|
||||||
if (liveIndicator.dataset.mode !== 'live') setMode('live');
|
|
||||||
});
|
|
||||||
|
|
||||||
eventSource.addEventListener('error', () => {
|
|
||||||
// EventSource auto-reconnect kicks in by default. If the
|
|
||||||
// endpoint never responds (404/500/CORS), readyState=CLOSED
|
|
||||||
// and we fall back permanently to polling.
|
|
||||||
if (eventSource && eventSource.readyState === EventSource.CLOSED) {
|
|
||||||
eventSource = null;
|
|
||||||
startPolling();
|
|
||||||
} else {
|
|
||||||
// Transient — show polling state until reconnect succeeds
|
|
||||||
setMode('polling');
|
|
||||||
if (!pollHandle) {
|
|
||||||
// Don't double-fetch; SSE reconnect should resume soon
|
|
||||||
fetchStatus();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
// Initial paint via fetch (so first frame renders even if SSE handshake
|
|
||||||
// takes a beat); SSE will then take over for live updates.
|
|
||||||
fetchStatus();
|
|
||||||
startSSE();
|
|
||||||
if (window.lucide) lucide.createIcons();
|
|
||||||
})();
|
|
||||||
</script>
|
|
||||||
</body>
|
|
||||||
</html>
|
|
||||||
BIN  dashboard/static/fonts/inter-400.woff2 (new file, binary not shown)
BIN  dashboard/static/fonts/inter-500.woff2 (new file, binary not shown)
BIN  dashboard/static/fonts/inter-600.woff2 (new file, binary not shown)
BIN  dashboard/static/fonts/inter-700.woff2 (new file, binary not shown)

159  dashboard/static/tokens.css (new file)
@@ -0,0 +1,159 @@
/*
 * Echo Dashboard — Design Tokens
 * Single source of truth for all CSS variables, fonts, and shared
 * mobile-modal behavior. Loaded via /echo/static/tokens.css on every
 * dashboard page (in addition to common.css for now).
 *
 * Token coverage:
 * - Colors (dark default + light theme override)
 * - Status palette (running, blocked, failed, complete, idle,
 *   planning, pending, approved)
 * - Typography (Inter sans + JetBrains Mono mono, size scale)
 * - Spacing (8px grid)
 * - Radius scale
 * - Shadows / transitions
 */

/* ==========================================================
   @font-face — Inter (self-hosted, woff2 only)
   ========================================================== */
@font-face {
  font-family: 'Inter';
  src: url('/echo/static/fonts/inter-400.woff2') format('woff2');
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: 'Inter';
  src: url('/echo/static/fonts/inter-500.woff2') format('woff2');
  font-weight: 500;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: 'Inter';
  src: url('/echo/static/fonts/inter-600.woff2') format('woff2');
  font-weight: 600;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: 'Inter';
  src: url('/echo/static/fonts/inter-700.woff2') format('woff2');
  font-weight: 700;
  font-style: normal;
  font-display: swap;
}

/* ==========================================================
   Tokens — dark theme (default)
   ========================================================== */
:root {
  /* Colors — dark surface */
  --bg-base: #13131a;
  --bg-surface: rgba(255, 255, 255, 0.12);
  --bg-surface-hover: rgba(255, 255, 255, 0.16);
  --bg-surface-active: rgba(255, 255, 255, 0.20);
  --bg-elevated: rgba(255, 255, 255, 0.14);

  --text-primary: #ffffff;
  --text-secondary: #f5f5f5;
  --text-muted: #e5e5e5;

  --accent: #3b82f6;
  --accent-hover: #2563eb;
  --accent-subtle: rgba(59, 130, 246, 0.2);

  --border: rgba(255, 255, 255, 0.3);
  --border-focus: rgba(59, 130, 246, 0.7);

  --header-bg: rgba(19, 19, 26, 0.95);

  --success: #22c55e;
  --warning: #eab308;
  --error: #ef4444;

  /* Status palette — used by .status-pill[data-status] */
  --status-running: rgb(34, 197, 94);    /* green — running-ralph / running-manual */
  --status-blocked: rgb(245, 158, 11);   /* amber */
  --status-failed: rgb(239, 68, 68);     /* red */
  --status-complete: rgb(156, 163, 175); /* slate (done = neutral) */
  --status-idle: var(--text-muted);

  /* Status palette — extended (workflow states) */
  --status-planning: rgb(167, 139, 250); /* violet — Echo is planning */
  --status-pending: rgb(96, 165, 250);   /* sky — awaiting approval */
  --status-approved: rgb(234, 179, 8);   /* gold — approved tonight */

  /* Spacing — 8px grid */
  --space-1: 4px;
  --space-2: 8px;
  --space-3: 12px;
  --space-4: 16px;
  --space-5: 20px;
  --space-6: 24px;
  --space-8: 32px;
  --space-10: 40px;

  /* Typography */
  --font-sans: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
  --font-mono: 'JetBrains Mono', 'Fira Code', ui-monospace, monospace;

  --text-xs: 0.75rem;
  --text-sm: 0.875rem;
  --text-base: 1rem;
  --text-lg: 1.125rem;
  --text-xl: 1.25rem;

  /* Radius */
  --radius-sm: 4px;
  --radius-md: 8px;
  --radius-lg: 12px;
  --radius-full: 9999px;

  /* Shadows */
  --shadow-sm: 0 1px 2px rgba(0, 0, 0, 0.3);
  --shadow-md: 0 4px 12px rgba(0, 0, 0, 0.4);
  --shadow-lg: 0 8px 24px rgba(0, 0, 0, 0.5);

  /* Motion */
  --transition-fast: 0.15s ease;
  --transition-base: 0.2s ease;
}

/* ==========================================================
   Light theme override
   ========================================================== */
[data-theme="light"] {
  --bg-base: #f8f9fa;
  --bg-surface: rgba(0, 0, 0, 0.04);
  --bg-surface-hover: rgba(0, 0, 0, 0.08);
  --bg-surface-active: rgba(0, 0, 0, 0.12);
  --bg-elevated: rgba(0, 0, 0, 0.06);

  --text-primary: #1a1a1a;
  --text-secondary: #444444;
  --text-muted: #666666;

  --border: rgba(0, 0, 0, 0.12);
  --border-focus: rgba(59, 130, 246, 0.5);

  --accent-subtle: rgba(59, 130, 246, 0.12);

  --header-bg: rgba(255, 255, 255, 0.95);
}

/* ==========================================================
   Mobile modal — shared across all pages with .modal-overlay
   ========================================================== */
@media (max-width: 640px) {
  .modal-overlay { padding: 0; align-items: stretch; }
  .modal { max-width: 100vw !important; max-height: 100vh !important; border-radius: 0; height: 100vh; }
  .modal-header { position: sticky; top: 0; background: var(--bg-base); }
  .modal-footer { position: sticky; bottom: 0; padding-bottom: max(var(--space-4), env(safe-area-inset-bottom)); }
  .phase-stepper .phase-step:not(.active) span:not(.step-num) { display: none; }
}

File diff suppressed because it is too large.
280  src/approved_tasks_cli.py (new file)

@@ -0,0 +1,280 @@
"""CLI wrapper for atomic mutations of `approved-tasks.json`.

Shell scripts (ralph.sh) and cron-job prompts cannot import
`src.jsonlock` directly. This module is the bridge: every subcommand
serialises its mutation through ``write_locked`` so external writers
honour the same flock invariant as the in-process code in
``src/router.py`` / ``src/planning_session.py``.

Run via:

    python3 -m src.approved_tasks_cli <subcommand> [args]

Subcommands:
    set-status   --slug SLUG --status STATUS
    set-field    --slug SLUG --key KEY --value VALUE [--int|--null|--now|--json-value]
    add-project  --slug SLUG --description DESC [--status STATUS]
    mark-running --slug SLUG --pid PID
    mark-failed  --slug SLUG [--error MSG]
    show         [--slug SLUG]   # read-only inspection

All mutators bump ``last_updated`` to the current UTC ISO timestamp.

Exit codes:
    0  success
    1  bad usage / invalid argument
    2  slug not found
    3  lock timeout / IO error
"""
from __future__ import annotations

import argparse
import json
import sys
from datetime import datetime, timezone
from pathlib import Path

# Make `from src.jsonlock import ...` work whether invoked as
# `python -m src.approved_tasks_cli` (sys.path already correct) or as
# `python3 src/approved_tasks_cli.py` (need to prepend project root).
_PROJECT_ROOT = Path(__file__).resolve().parent.parent
if str(_PROJECT_ROOT) not in sys.path:
    sys.path.insert(0, str(_PROJECT_ROOT))

from src.jsonlock import read_locked, write_locked, LockTimeoutError  # noqa: E402

APPROVED_TASKS_FILE = _PROJECT_ROOT / "approved-tasks.json"


def _now_iso() -> str:
    return datetime.now(timezone.utc).isoformat()


def _bump_timestamp(data: dict) -> None:
    data["last_updated"] = _now_iso()


def _find_project(data: dict, slug: str) -> dict | None:
    for p in data.get("projects", []):
        if p.get("name", "").lower() == slug.lower():
            return p
    return None


def _coerce_value(value: str, *, as_int: bool, as_null: bool, as_now: bool, as_json: bool):
    if as_null:
        return None
    if as_now:
        return _now_iso()
    if as_int:
        try:
            return int(value)
        except ValueError as exc:
            raise SystemExit(f"value '{value}' is not a valid int: {exc}") from exc
    if as_json:
        try:
            return json.loads(value)
        except json.JSONDecodeError as exc:
            raise SystemExit(f"value is not valid JSON: {exc}") from exc
    return value


# ---- subcommand implementations -------------------------------------------


def cmd_set_status(args) -> int:
    def mutator(data: dict) -> dict:
        proj = _find_project(data, args.slug)
        if proj is None:
            raise KeyError(args.slug)
        proj["status"] = args.status
        _bump_timestamp(data)
        return data

    try:
        write_locked(str(APPROVED_TASKS_FILE), mutator)
    except KeyError:
        print(f"slug '{args.slug}' not found", file=sys.stderr)
        return 2
    print(f"set status of '{args.slug}' = '{args.status}'")
    return 0


def cmd_set_field(args) -> int:
    value = _coerce_value(
        args.value or "",
        as_int=args.int,
        as_null=args.null,
        as_now=args.now,
        as_json=args.json_value,
    )

    def mutator(data: dict) -> dict:
        proj = _find_project(data, args.slug)
        if proj is None:
            raise KeyError(args.slug)
        proj[args.key] = value
        _bump_timestamp(data)
        return data

    try:
        write_locked(str(APPROVED_TASKS_FILE), mutator)
    except KeyError:
        print(f"slug '{args.slug}' not found", file=sys.stderr)
        return 2
    print(f"set {args.slug}.{args.key} = {value!r}")
    return 0


def cmd_add_project(args) -> int:
    def mutator(data: dict) -> dict:
        data.setdefault("projects", [])
        if _find_project(data, args.slug) is not None:
            raise FileExistsError(args.slug)
        entry = {
            "name": args.slug,
            "description": args.description,
            "status": args.status,
            "planning_session_id": None,
            "final_plan_path": None,
            "proposed_at": _now_iso(),
            "approved_at": None,
            "started_at": None,
            "pid": None,
        }
        data["projects"].append(entry)
        _bump_timestamp(data)
        return data

    try:
        write_locked(str(APPROVED_TASKS_FILE), mutator)
    except FileExistsError:
        print(f"slug '{args.slug}' already exists — refusing to overwrite", file=sys.stderr)
        return 1
    print(f"added project '{args.slug}' (status={args.status})")
    return 0


def cmd_mark_running(args) -> int:
    """Convenience: status=running, started_at=now, pid=<PID> in ONE locked write."""
    def mutator(data: dict) -> dict:
        proj = _find_project(data, args.slug)
        if proj is None:
            raise KeyError(args.slug)
        proj["status"] = "running"
        proj["started_at"] = _now_iso()
        proj["pid"] = args.pid
        _bump_timestamp(data)
        return data

    try:
        write_locked(str(APPROVED_TASKS_FILE), mutator)
    except KeyError:
        print(f"slug '{args.slug}' not found", file=sys.stderr)
        return 2
    print(f"marked '{args.slug}' running (pid={args.pid})")
    return 0


def cmd_mark_failed(args) -> int:
    def mutator(data: dict) -> dict:
        proj = _find_project(data, args.slug)
        if proj is None:
            raise KeyError(args.slug)
        proj["status"] = "failed"
        if args.error:
            proj["error"] = args.error
        _bump_timestamp(data)
        return data

    try:
        write_locked(str(APPROVED_TASKS_FILE), mutator)
    except KeyError:
        print(f"slug '{args.slug}' not found", file=sys.stderr)
        return 2
    print(f"marked '{args.slug}' failed")
    return 0


def cmd_show(args) -> int:
    try:
        data = read_locked(str(APPROVED_TASKS_FILE))
    except FileNotFoundError:
        print(f"file not found: {APPROVED_TASKS_FILE}", file=sys.stderr)
        return 3
    if args.slug:
        proj = _find_project(data, args.slug)
        if proj is None:
            print(f"slug '{args.slug}' not found", file=sys.stderr)
            return 2
        print(json.dumps(proj, indent=2, ensure_ascii=False))
    else:
        print(json.dumps(data, indent=2, ensure_ascii=False))
    return 0


# ---- argparse setup -------------------------------------------------------


def _build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser(
        prog="approved-tasks",
        description="Atomic CLI for approved-tasks.json (uses src.jsonlock.write_locked).",
    )
    sub = p.add_subparsers(dest="command", required=True)

    sp = sub.add_parser("set-status", help="Set the status field of a project.")
    sp.add_argument("--slug", required=True)
    sp.add_argument("--status", required=True)
    sp.set_defaults(func=cmd_set_status)

    sp = sub.add_parser("set-field", help="Set an arbitrary field on a project.")
    sp.add_argument("--slug", required=True)
    sp.add_argument("--key", required=True)
    sp.add_argument("--value", default="", help="String value (or use --null/--now).")
    g = sp.add_mutually_exclusive_group()
    g.add_argument("--int", action="store_true", help="Coerce --value to int.")
    g.add_argument("--null", action="store_true", help="Set value to JSON null.")
    g.add_argument("--now", action="store_true", help="Set value to current UTC ISO timestamp.")
    g.add_argument("--json-value", action="store_true", dest="json_value",
                   help="Parse --value as JSON.")
    sp.set_defaults(func=cmd_set_field)

    sp = sub.add_parser("add-project", help="Append a new project entry.")
    sp.add_argument("--slug", required=True)
    sp.add_argument("--description", required=True)
    sp.add_argument("--status", default="pending")
    sp.set_defaults(func=cmd_add_project)

    sp = sub.add_parser("mark-running", help="Atomic: status=running, started_at=now, pid=<PID>.")
    sp.add_argument("--slug", required=True)
    sp.add_argument("--pid", type=int, required=True)
    sp.set_defaults(func=cmd_mark_running)

    sp = sub.add_parser("mark-failed", help="Set status=failed (and optionally an error message).")
    sp.add_argument("--slug", required=True)
    sp.add_argument("--error", default=None)
    sp.set_defaults(func=cmd_mark_failed)

    sp = sub.add_parser("show", help="Print approved-tasks.json (or one project) to stdout.")
    sp.add_argument("--slug", default=None)
    sp.set_defaults(func=cmd_show)

    return p


def main(argv: list[str] | None = None) -> int:
    parser = _build_parser()
    args = parser.parse_args(argv)
    try:
        return args.func(args)
    except LockTimeoutError as exc:
        print(f"lock timeout: {exc}", file=sys.stderr)
        return 3
    except OSError as exc:
        print(f"io error: {exc}", file=sys.stderr)
        return 3


if __name__ == "__main__":
    sys.exit(main())
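The `--int/--null/--now/--json-value` flags above reduce to a small coercion dispatch. A minimal standalone sketch of that logic; `coerce` is an illustrative name, not the module's actual helper, and unlike `_coerce_value` it raises plain `ValueError`/`JSONDecodeError` instead of `SystemExit`:

```python
import json
from datetime import datetime, timezone

def coerce(value, *, as_int=False, as_null=False, as_now=False, as_json=False):
    # Flags are mutually exclusive (argparse enforces this in the real CLI).
    if as_null:
        return None
    if as_now:
        return datetime.now(timezone.utc).isoformat()
    if as_int:
        return int(value)          # raises ValueError on bad input
    if as_json:
        return json.loads(value)   # raises JSONDecodeError on bad input
    return value
```

The real CLI wraps the two parse failures in `SystemExit` so they map onto exit code 1.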
147  src/jsonlock.py (new file)

@@ -0,0 +1,147 @@
"""Shared flock-based JSON locking helper.

Lock ordering invariant: always acquire locks in alphabetical order by filename
to avoid deadlock when a caller holds multiple locks simultaneously.

Implementation note (2026-04 — Lane C2 fix):
We lock on a sidecar `<path>.lock` file rather than the data file itself.
That's because `write_locked` uses `os.replace(tmp, target)` for atomic
publish — but `replace` swaps the inode behind `target`, which means a flock
held on the *old* fd no longer guards the new file. Concurrent writers on a
sidecar lockfile (whose inode is stable) get correct serialisation across
threads and processes.
"""
import errno
import fcntl
import json
import logging
import os
import threading
import time
from typing import Callable

_TIMEOUT_SEC = 5.0
_POLL_INTERVAL = 0.05
_log = logging.getLogger(__name__)


class LockTimeoutError(Exception):
    pass


_local = threading.local()


def _held_locks() -> dict:
    """Per-thread map: abspath → (lockfd, refcount). Used for re-entrancy."""
    if not hasattr(_local, 'locks'):
        _local.locks = {}
    return _local.locks


def _try_lock(fd: int, lock_type: int, timeout: float) -> bool:
    deadline = time.monotonic() + timeout
    while True:
        try:
            fcntl.flock(fd, lock_type | fcntl.LOCK_NB)
            return True
        except BlockingIOError:
            if time.monotonic() >= deadline:
                return False
            time.sleep(_POLL_INTERVAL)


def _acquire(fd: int, lock_type: int) -> None:
    if _try_lock(fd, lock_type, _TIMEOUT_SEC):
        return
    if _try_lock(fd, lock_type, _TIMEOUT_SEC):
        return
    raise LockTimeoutError(
        f"could not acquire flock within {2 * _TIMEOUT_SEC}s (after retry)"
    )


def _open_lockfile(abspath: str) -> int:
    """Open (creating if needed) the sidecar `<abspath>.lock` file."""
    lock_path = abspath + ".lock"
    # Ensure the parent dir exists — write_locked auto-creates the data file
    # too, so we should be tolerant of the parent dir not having ever been
    # touched.
    parent = os.path.dirname(lock_path)
    if parent:
        try:
            os.makedirs(parent, exist_ok=True)
        except OSError as exc:
            if exc.errno != errno.EEXIST:
                raise
    return os.open(lock_path, os.O_RDWR | os.O_CREAT, 0o644)


def read_locked(path: str) -> dict:
    abspath = os.path.abspath(path)
    held = _held_locks()
    if abspath in held:
        _log.debug("re-entrant read on %s; skipping flock", abspath)
        with open(abspath, 'r', encoding='utf-8') as f:
            return json.load(f)

    lock_fd = _open_lockfile(abspath)
    try:
        _acquire(lock_fd, fcntl.LOCK_SH)
        held[abspath] = (lock_fd, 1)
        try:
            with open(abspath, 'r', encoding='utf-8') as f:
                return json.load(f)
        finally:
            held.pop(abspath, None)
            try:
                fcntl.flock(lock_fd, fcntl.LOCK_UN)
            except OSError:
                pass
    finally:
        os.close(lock_fd)


def write_locked(path: str, mutator: Callable[[dict], dict]) -> dict:
    abspath = os.path.abspath(path)
    held = _held_locks()
    reentrant = abspath in held

    lock_fd = -1 if reentrant else _open_lockfile(abspath)
    try:
        if not reentrant:
            _acquire(lock_fd, fcntl.LOCK_EX)
            held[abspath] = (lock_fd, 1)
        else:
            _log.debug("re-entrant write on %s; skipping flock", abspath)

        try:
            # Read current data (file may not exist yet — treat as {}).
            try:
                with open(abspath, 'r', encoding='utf-8') as f:
                    text = f.read()
                data = json.loads(text) if text.strip() else {}
            except FileNotFoundError:
                data = {}

            new_data = mutator(data)

            # Atomic-rename invariant: tmp file MUST be on the same filesystem
            # as the target (sibling path guarantees this).
            tmp_path = abspath + ".tmp"
            with open(tmp_path, 'w', encoding='utf-8') as tmp:
                json.dump(new_data, tmp, indent=2)
                tmp.flush()
                os.fsync(tmp.fileno())
            os.replace(tmp_path, abspath)
            return new_data
        finally:
            if not reentrant:
                held.pop(abspath, None)
                try:
                    fcntl.flock(lock_fd, fcntl.LOCK_UN)
                except OSError:
                    pass
    finally:
        if not reentrant and lock_fd >= 0:
            os.close(lock_fd)
@@ -37,7 +37,6 @@ import logging
 import os
 import shutil
 import subprocess
-import tempfile
 import threading
 import time
 import uuid
@@ -52,6 +51,7 @@ from src.claude_session import (
     _run_claude,
     _safe_env,
 )
+from src.jsonlock import read_locked, write_locked

 logger = logging.getLogger(__name__)
 _invoke_log = logging.getLogger("echo-core.invoke")
@@ -106,33 +106,17 @@ def _channel_key(adapter: str, channel_id: str) -> str:
|
|||||||
|
|
||||||
|
|
||||||
def _load_planning_state() -> dict:
|
def _load_planning_state() -> dict:
|
||||||
"""Load planning sessions from disk. Returns {} if missing or empty."""
|
"""Load planning sessions from disk under a shared flock. Returns {} if missing."""
|
||||||
try:
|
try:
|
||||||
text = PLANNING_STATE_FILE.read_text(encoding="utf-8")
|
return read_locked(str(PLANNING_STATE_FILE))
|
||||||
if not text.strip():
|
|
||||||
return {}
|
|
||||||
return json.loads(text)
|
|
||||||
except (FileNotFoundError, json.JSONDecodeError):
|
except (FileNotFoundError, json.JSONDecodeError):
|
||||||
return {}
|
return {}
|
||||||
|
|
||||||
|
|
||||||
def _save_planning_state(data: dict) -> None:
|
def _save_planning_state(data: dict) -> None:
|
||||||
"""Atomically write planning sessions via tempfile + os.replace."""
|
"""Persist planning sessions under an exclusive flock + atomic replace."""
|
||||||
SESSIONS_DIR.mkdir(parents=True, exist_ok=True)
|
SESSIONS_DIR.mkdir(parents=True, exist_ok=True)
|
||||||
fd, tmp_path = tempfile.mkstemp(
|
write_locked(str(PLANNING_STATE_FILE), lambda _existing: data)
|
||||||
dir=SESSIONS_DIR, prefix=".planning_", suffix=".json"
|
|
||||||
)
|
|
||||||
try:
|
|
||||||
with os.fdopen(fd, "w", encoding="utf-8") as f:
|
|
||||||
json.dump(data, f, indent=2, ensure_ascii=False)
|
|
||||||
f.write("\n")
|
|
||||||
os.replace(tmp_path, PLANNING_STATE_FILE)
|
|
||||||
except BaseException:
|
|
||||||
try:
|
|
||||||
os.unlink(tmp_path)
|
|
||||||
except OSError:
|
|
||||||
pass
|
|
||||||
raise
|
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
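The call sites above pin down the contract of the new helpers: `read_locked(path)` returns the parsed JSON (raising `FileNotFoundError` when the file is absent) and `write_locked(path, mutate)` applies a mutator function under the lock before an atomic replace. `src/jsonlock.py` itself is not part of this hunk, so the following is only a sketch of a plausible implementation, assuming the design stated in the commit message (flock on a sidecar `.lock` file so the lock inode stays stable across atomic replaces):

```python
import fcntl
import json
import os
import tempfile


def _lock_path(path: str) -> str:
    # Lock a sidecar file, not the data file: os.replace() swaps the data
    # file's inode on every write, while the .lock inode stays stable, so
    # every process always flocks the same inode.
    return path + ".lock"


def read_locked(path: str) -> dict:
    """Parse JSON under a shared flock; FileNotFoundError if the file is absent."""
    with open(_lock_path(path), "a+") as lock:
        fcntl.flock(lock, fcntl.LOCK_SH)
        try:
            with open(path, encoding="utf-8") as f:
                text = f.read()
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)
    return json.loads(text) if text.strip() else {}


def write_locked(path: str, mutate) -> dict:
    """Apply mutate(existing) -> new data under an exclusive flock, then replace."""
    with open(_lock_path(path), "a+") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)
        try:
            # Read inline rather than via read_locked(): a second flock on a
            # fresh fd of the sidecar would block against our own LOCK_EX.
            try:
                with open(path, encoding="utf-8") as f:
                    text = f.read()
                existing = json.loads(text) if text.strip() else {}
            except (FileNotFoundError, json.JSONDecodeError):
                existing = {}
            data = mutate(existing)
            fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".", suffix=".json")
            with os.fdopen(fd, "w", encoding="utf-8") as f:
                json.dump(data, f, indent=2, ensure_ascii=False)
                f.write("\n")
            os.replace(tmp, path)
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)
    return data
```

The mutator-callback shape matters: because the read-modify-write happens entirely inside the exclusive lock, two concurrent writers cannot interleave between load and save, which is exactly what the T#16/T#28 tests exercise.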

@@ -18,6 +18,7 @@ from src.claude_session import (
     set_session_model,
     VALID_MODELS,
 )
+from src.jsonlock import read_locked, write_locked
 from src.planning_orchestrator import PlanningOrchestrator
 from src.planning_session import (
     clear_planning_state,
@@ -210,15 +211,20 @@ def _model_command(channel_id: str, text: str) -> str:


 def _load_approved_tasks() -> dict:
-    """Load approved-tasks.json, return empty structure if missing."""
-    if APPROVED_TASKS_FILE.exists():
-        return json.loads(APPROVED_TASKS_FILE.read_text())
-    return {"projects": [], "last_updated": None}
+    """Load approved-tasks.json under a shared flock; empty structure if missing."""
+    try:
+        data = read_locked(str(APPROVED_TASKS_FILE))
+    except FileNotFoundError:
+        return {"projects": [], "last_updated": None}
+    if not data:
+        return {"projects": [], "last_updated": None}
+    return data


 def _save_approved_tasks(data: dict) -> None:
+    """Persist approved-tasks.json under an exclusive flock + atomic replace."""
     data["last_updated"] = datetime.now(timezone.utc).isoformat()
-    APPROVED_TASKS_FILE.write_text(json.dumps(data, indent=2, ensure_ascii=False))
+    write_locked(str(APPROVED_TASKS_FILE), lambda _existing: data)


 RALPH_CMDS = {

tests/test_dashboard_no_emoji.py (new file, 45 lines)
@@ -0,0 +1,45 @@
+"""T#30 — guard the new dashboard pages against emoji creep.
+
+Design pillar (`dashboard/DESIGN.md`): the unified Echo dashboard uses Lucide
+icons exclusively, no emojis in the chrome. This test asserts that no emoji
+codepoints sneak into the static HTML files we ship.
+"""
+from __future__ import annotations
+
+import re
+from pathlib import Path
+
+import pytest
+
+# Symbols + emoticons + transport + miscellaneous + dingbats. Covers the
+# common emoji blocks; not exhaustive but matches what the spec asked for.
+EMOJI_RE = re.compile(
+    r'[\U0001F300-\U0001F9FF\U0001FA00-\U0001FAFF☀-➿]'
+)
+
+DASHBOARD_DIR = Path(__file__).resolve().parent.parent / "dashboard"
+
+
+# Pages that the unified-dashboard initiative declares emoji-free.
+# index.html is excluded for now: it's the legacy unified panel page that
+# predates DESIGN.md and still contains historical emoji (👤👷🤖🔴…). It
+# will be migrated separately; that cleanup gets its own follow-up.
+_UNIFIED_PAGES = ["workspace.html", "login.html"]
+
+
+def _emoji_files():
+    """Only run the test for pages that exist on disk.
+
+    workspace.html may not be in place during partial rollouts; skip silently
+    so the suite doesn't hard-fail during the unified-dashboard migration.
+    """
+    return [name for name in _UNIFIED_PAGES if (DASHBOARD_DIR / name).is_file()]
+
+
+@pytest.mark.parametrize("html_file", _emoji_files() or ["__skip__"])
+def test_no_emoji_in_dashboard_html(html_file):
+    if html_file == "__skip__":
+        pytest.skip("no unified dashboard pages are present yet")
+    content = (DASHBOARD_DIR / html_file).read_text(encoding="utf-8")
+    found = EMOJI_RE.findall(content)
+    assert not found, f"Emoji found in {html_file}: {found}"
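The character class in that test is compact enough to sanity-check in isolation. A short demonstration of the same pattern, with the literal `☀-➿` range written as explicit escapes (`\u2600-\u27BF`):

```python
import re

# Same class as the test: two astral emoji blocks (Misc Symbols & Pictographs
# through Supplemental Symbols, plus Symbols Extended-A) and U+2600–U+27BF
# (Miscellaneous Symbols and Dingbats, i.e. ☀ through ➿).
EMOJI_RE = re.compile(r'[\U0001F300-\U0001F9FF\U0001FA00-\U0001FAFF\u2600-\u27BF]')

assert EMOJI_RE.search("deploy \U0001F680 now")            # U+1F680 (rocket) is caught
assert EMOJI_RE.search("done \u2705")                      # U+2705 (check mark) is caught
assert not EMOJI_RE.search("<button>Save draft</button>")  # plain ASCII passes
```

Note the class is codepoint-based, so ZWJ sequences and flags are caught via their component codepoints rather than as grapheme clusters, which is sufficient for a "no emoji at all" guard.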

tests/test_dashboard_projects_endpoint.py (new file, 811 lines)
@@ -0,0 +1,811 @@
+"""Tests for the unified /api/projects/* endpoints + auth + concurrency.
+
+Covers (Lane C2 — Tasks #1–#29):
+- T#1 unified status merges workspace + approved
+- T#2 /propose validation + 201
+- T#3 /approve
+- T#4 /unapprove
+- T#5 /cancel
+- T#7 signature mtime cache
+- T#8–T#12, T#19 planning endpoints
+- T#13 legacy /ralph.html → /echo/workspace.html redirect
+- T#14 cookie-required POST + SSE GET
+- T#15 wrong-cookie 401
+- T#16 flock serializes concurrent writes
+- T#23 _derive_status table
+- T#24 router planning unaffected by jsonlock
+- T#25 lock timeout surfaces (LockTimeoutError → 503)
+- T#26 transcript endpoint returns raw markdown (DOMPurify is client-side)
+- T#27 If-Match version mismatch → 409
+- T#28 cross-process write_locked serialization
+- T#29 login/logout flow
+"""
+from __future__ import annotations
+
+import io
+import json
+import multiprocessing
+import os
+import sys
+import threading
+import time
+from pathlib import Path
+from unittest.mock import patch
+
+import pytest
+
+PROJECT_ROOT = Path(__file__).resolve().parents[1]
+DASH = PROJECT_ROOT / "dashboard"
+if str(DASH) not in sys.path:
+    sys.path.insert(0, str(DASH))
+
+
+# ── shared stub handler ──────────────────────────────────────────────
+
+
+class _Headers(dict):
+    """Mimic http.server's headers — case-insensitive .get()."""
+
+    def get(self, key, default=None):
+        for k in self:
+            if k.lower() == key.lower():
+                return self[k]
+        return default
+
+
+@pytest.fixture
+def projects_module():
+    from handlers import projects as _p  # type: ignore
+    return _p
+
+
+@pytest.fixture
+def auth_module():
+    from handlers import auth as _a  # type: ignore
+    return _a
+
+
+@pytest.fixture
+def stub(projects_module, auth_module, tmp_path, monkeypatch):
+    """Build a stubbed handler mixing ProjectsHandlers + AuthHandlers.
+
+    Re-routes APPROVED_TASKS_FILE + WORKSPACE_DIR to tmp paths and provides
+    captures for send_json + raw send_response/send_header/wfile flow.
+    """
+    import constants  # type: ignore
+
+    # Re-route paths into tmp_path
+    approved_file = tmp_path / "approved-tasks.json"
+    workspace_dir = tmp_path / "workspace"
+    workspace_dir.mkdir()
+    monkeypatch.setattr(projects_module, "APPROVED_TASKS_FILE", approved_file)
+    monkeypatch.setattr(constants, "WORKSPACE_DIR", workspace_dir)
+
+    # Reset signature cache so each test starts clean.
+    projects_module._SIG_CACHE.update({"git_mtime": None, "signature": None, "ts": 0.0})
+
+    # Pin a known dashboard token so cookie checks are deterministic.
+    monkeypatch.setenv("DASHBOARD_TOKEN", "test-token")
+    monkeypatch.setattr(auth_module, "_DASHBOARD_TOKEN", None)
+
+    class _Stub(projects_module.ProjectsHandlers, auth_module.AuthHandlers):
+        def __init__(self):
+            self.captured = None
+            self.captured_code = None
+            self.path = "/api/projects"
+            self.command = "GET"
+            self.headers = _Headers()
+            self.rfile = io.BytesIO(b"")
+            self.wfile = io.BytesIO()
+
+            # Raw response capture — used by handle_login / handle_logout /
+            # handle_projects_stream which write headers+body manually.
+            self.response_code = None
+            self.response_headers: list[tuple[str, str]] = []
+            self.response_ended = False
+            self.responses: list[dict] = []
+
+        # send_json — used by most endpoints.
+        def send_json(self, data, code=200):
+            self.captured = data
+            self.captured_code = code
+            # Mirror as a raw response too so auth tests can check codes uniformly.
+            self.responses.append({"code": code, "headers": [], "body": json.dumps(data).encode()})
+
+        def send_error(self, code, message=None):  # pragma: no cover — fallthrough
+            self.captured = {"error_code": code, "message": message}
+            self.captured_code = code
+            self.responses.append({"code": code, "headers": [], "body": b""})
+
+        # Raw response API — used by manual flows (login, SSE, etc).
+        def send_response(self, code):
+            self.response_code = code
+            self.response_headers = []
+            self.response_ended = False
+
+        def send_header(self, name, value):
+            self.response_headers.append((name, value))
+
+        def end_headers(self):
+            self.response_ended = True
+            self.responses.append({
+                "code": self.response_code,
+                "headers": list(self.response_headers),
+                "body": b"",
+            })
+
+        # Helpers for tests ----------------------------------------
+        def set_body(self, payload):
+            if isinstance(payload, (dict, list)):
+                blob = json.dumps(payload).encode()
+                self.headers["Content-Type"] = "application/json"
+            elif isinstance(payload, str):
+                blob = payload.encode()
+            else:
+                blob = bytes(payload or b"")
+            self.headers["Content-Length"] = str(len(blob))
+            self.rfile = io.BytesIO(blob)
+
+        def set_cookie(self, value):
+            self.headers["Cookie"] = f"dashboard={value}"
+
+    return _Stub()
+
+
+# ─────────────────────────────────────────────────────────────────────
+# T#14 / T#15 / T#29 — auth (cookie + login/logout)
+# ─────────────────────────────────────────────────────────────────────
+
+
+class TestAuth:
+    def test_no_cookie_check_returns_false(self, stub):
+        # _check_dashboard_cookie is what the do_POST middleware uses.
+        assert stub._check_dashboard_cookie() is False
+
+    def test_wrong_cookie_check_returns_false(self, stub):
+        stub.set_cookie("not-the-token")
+        assert stub._check_dashboard_cookie() is False
+
+    def test_correct_cookie_check_returns_true(self, stub):
+        stub.set_cookie("test-token")
+        assert stub._check_dashboard_cookie() is True
+
+    # ── login flow ─────────────────────────────────────────────────
+    def test_login_sets_cookie(self, stub):
+        stub.headers["Content-Type"] = "application/x-www-form-urlencoded"
+        body = b"token=test-token"
+        stub.headers["Content-Length"] = str(len(body))
+        stub.rfile = io.BytesIO(body)
+        stub.handle_login()
+        assert stub.response_code == 302
+        names = {h[0]: h[1] for h in stub.response_headers}
+        assert "Set-Cookie" in names
+        assert "dashboard=test-token" in names["Set-Cookie"]
+        assert names.get("Location") == "/echo/workspace.html"
+
+    def test_login_wrong_token_returns_401(self, stub):
+        stub.headers["Content-Type"] = "application/x-www-form-urlencoded"
+        body = b"token=wrong"
+        stub.headers["Content-Length"] = str(len(body))
+        stub.rfile = io.BytesIO(body)
+        stub.handle_login()
+        assert stub.response_code == 401
+
+    def test_login_accepts_json_body(self, stub):
+        stub.headers["Content-Type"] = "application/json"
+        body = json.dumps({"token": "test-token"}).encode()
+        stub.headers["Content-Length"] = str(len(body))
+        stub.rfile = io.BytesIO(body)
+        stub.handle_login()
+        assert stub.response_code == 302
+
+    def test_logout_clears_cookie(self, stub):
+        stub.handle_logout()
+        assert stub.response_code == 200
+        names = {h[0]: h[1] for h in stub.response_headers}
+        assert "Set-Cookie" in names
+        assert "Max-Age=0" in names["Set-Cookie"]
+
+
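These auth tests pin the observable behaviour of the cookie gate, but the handler code itself lives in `handlers/auth.py`, outside this diff. The sketch below is a hypothetical reduction of what `_check_dashboard_cookie` and the login response plausibly do, based on the tested behaviour (cookie named `dashboard`, compared against `DASHBOARD_TOKEN`, set HttpOnly + SameSite=Strict per the commit message); the function names here are illustrative, not the real API:

```python
import hmac
import os
from http.cookies import SimpleCookie


def check_dashboard_cookie(cookie_header) -> bool:
    """True iff the request carries dashboard=<DASHBOARD_TOKEN>."""
    token = os.environ.get("DASHBOARD_TOKEN")
    if not token or not cookie_header:
        return False
    jar = SimpleCookie()
    jar.load(cookie_header)
    morsel = jar.get("dashboard")
    # Constant-time compare avoids leaking the token via timing.
    return morsel is not None and hmac.compare_digest(morsel.value, token)


def login_set_cookie_header(token: str) -> str:
    """Set-Cookie value for a successful login (served with a 302 to the hub)."""
    # HttpOnly keeps the token away from page scripts; SameSite=Strict stops
    # the cookie from riding along on cross-site requests.
    return f"dashboard={token}; Path=/; HttpOnly; SameSite=Strict"
```

With HttpOnly set, the browser attaches the cookie automatically on both POSTs and the SSE GET, which is why the tests can require it on `/api/projects/stream` without any client-side token handling.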
+# ─────────────────────────────────────────────────────────────────────
+# T#1 — unified status (workspace + approved-tasks merge)
+# ─────────────────────────────────────────────────────────────────────
+
+
+def _seed_approved(stub, projects_module, projects):
+    """Write an approved-tasks.json file via the helper so version is set."""
+    def _mut(d):
+        d["projects"] = projects
+        d["version"] = 1
+        return d
+    projects_module._write_approved(_mut)
+
+
+def _make_workspace(stub, projects_module, slug):
+    import constants  # type: ignore
+    (constants.WORKSPACE_DIR / slug).mkdir(parents=True, exist_ok=True)
+
+
+class TestUnifiedStatus:
+    def test_unified_status_merges_workspace_and_approved(self, stub, projects_module):
+        _make_workspace(stub, projects_module, "alpha")
+        _seed_approved(stub, projects_module, [
+            {"name": "alpha", "description": "the alpha project",
+             "status": "approved", "proposed_at": None, "approved_at": None,
+             "started_at": None, "pid": None, "planning_session_id": None,
+             "final_plan_path": None},
+            {"name": "ghost", "description": "no workspace yet",
+             "status": "pending", "proposed_at": None, "approved_at": None,
+             "started_at": None, "pid": None, "planning_session_id": None,
+             "final_plan_path": None},
+        ])
+        stub.handle_unified_status()
+        assert stub.captured_code == 200
+        out = stub.captured
+        assert "version" in out
+        slugs = sorted(p["slug"] for p in out["projects"])
+        assert slugs == ["alpha", "ghost"]
+        assert out["count"] == 2
+        assert "fetchedAt" in out
+
+
+# ─────────────────────────────────────────────────────────────────────
+# T#23 — _derive_status table
+# ─────────────────────────────────────────────────────────────────────
+
+
+class TestDeriveStatus:
+    @pytest.mark.parametrize(
+        "approved,prd,expected",
+        [
+            (None, None, "idle"),
+            ({"status": "pending"}, None, "pending"),
+            ({"status": "approved"}, None, "approved"),
+            ({"status": "planning"}, None, "planning"),
+            ({"status": "failed"}, None, "failed"),
+            (None, {"userStories": [{"passes": True}]}, "complete"),
+            (None, {"userStories": [
+                {"passes": False, "blocked": True}]}, "blocked"),
+            (None, {"userStories": [
+                {"passes": False, "failed": True}]}, "failed"),
+            (None, {"userStories": [
+                {"passes": True}, {"passes": False}]}, "idle"),
+        ],
+    )
+    def test_table(self, stub, approved, prd, expected, monkeypatch, projects_module):
+        # Make sure no PID-alive logic interferes.
+        monkeypatch.setattr(
+            projects_module, "_pid_alive_with_cmdline", lambda pid: (False, "")
+        )
+        result = stub._derive_status("slug", approved, None, prd)
+        assert result == expected, f"approved={approved} prd={prd}"
+
+    def test_running_ralph_wins_over_manual(self, stub, projects_module, monkeypatch):
+        """When PID is alive and cmdline contains ralph.sh → running-ralph."""
+        monkeypatch.setattr(
+            projects_module,
+            "_pid_alive_with_cmdline",
+            lambda pid: (True, "/bin/bash tools/ralph/ralph.sh demo"),
+        )
+        out = stub._derive_status("demo", {"pid": 12345, "status": "approved"}, None, None)
+        assert out == "running-ralph"
+
+    def test_running_manual_when_pid_alive_no_ralph(self, stub, projects_module, monkeypatch):
+        monkeypatch.setattr(
+            projects_module,
+            "_pid_alive_with_cmdline",
+            lambda pid: (True, "/usr/bin/python3 some_script.py"),
+        )
+        out = stub._derive_status("demo", {"pid": 12345, "status": "approved"}, None, None)
+        assert out == "running-manual"
+
+
+# ─────────────────────────────────────────────────────────────────────
+# T#2 — /propose
+# ─────────────────────────────────────────────────────────────────────
+
+
+class TestPropose:
+    def _post(self, stub, payload):
+        stub.path = "/api/projects/propose"
+        stub.command = "POST"
+        stub.set_body(payload)
+
+    def test_propose_valid_creates_pending_entry(self, stub, projects_module):
+        self._post(stub, {"slug": "new-proj", "description": "a brand new project"})
+        stub.handle_propose()
+        assert stub.captured_code == 201
+        assert stub.captured["slug"] == "new-proj"
+        assert stub.captured["status"] == "pending"
+        # Verify on disk
+        data = projects_module._read_approved()
+        assert any(p["name"] == "new-proj" for p in data["projects"])
+
+    def test_propose_duplicate_slug_returns_409(self, stub, projects_module):
+        self._post(stub, {"slug": "dup-proj", "description": "first time around"})
+        stub.handle_propose()
+        # Second propose with same slug
+        self._post(stub, {"slug": "dup-proj", "description": "second time around"})
+        stub.handle_propose()
+        assert stub.captured_code == 409
+
+    def test_propose_invalid_slug_returns_400(self, stub):
+        self._post(stub, {"slug": "AB", "description": "too short uppercase slug"})
+        stub.handle_propose()
+        assert stub.captured_code == 400
+        assert stub.captured.get("error") == "validation_failed"
+
+    def test_propose_short_description_returns_400(self, stub):
+        self._post(stub, {"slug": "good-slug", "description": "short"})
+        stub.handle_propose()
+        assert stub.captured_code == 400
+        assert "description" in stub.captured.get("fields", {})
+
+
+# ─────────────────────────────────────────────────────────────────────
+# T#3 / T#4 / T#5 / T#27 — approve / unapprove / cancel + If-Match
+# ─────────────────────────────────────────────────────────────────────
+
+
+class TestStatusMutators:
+    def _seed(self, stub, projects_module, slug, status="pending"):
+        _seed_approved(stub, projects_module, [{
+            "name": slug, "description": "x" * 12, "status": status,
+            "proposed_at": None, "approved_at": None, "started_at": None,
+            "pid": None, "planning_session_id": None, "final_plan_path": None,
+        }])
+
+    def _post(self, stub, payload, version=None):
+        stub.command = "POST"
+        if version is not None:
+            stub.headers["If-Match"] = str(version)
+        stub.set_body(payload)
+
+    def test_approve_pending_returns_200(self, stub, projects_module):
+        self._seed(stub, projects_module, "to-approve")
+        self._post(stub, {"slug": "to-approve"})
+        stub.handle_approve()
+        assert stub.captured_code == 200
+        assert stub.captured["status"] == "approved"
+
+    def test_unapprove_approved_returns_200(self, stub, projects_module):
+        self._seed(stub, projects_module, "to-flip", status="approved")
+        self._post(stub, {"slug": "to-flip"})
+        stub.handle_unapprove()
+        assert stub.captured_code == 200
+        assert stub.captured["status"] == "pending"
+
+    def test_cancel_returns_200(self, stub, projects_module):
+        self._seed(stub, projects_module, "kill-me")
+        self._post(stub, {"slug": "kill-me"})
+        stub.handle_cancel()
+        assert stub.captured_code == 200
+        assert stub.captured["status"] == "cancelled"
+
+    def test_action_on_changed_version_returns_409(self, stub, projects_module):
+        self._seed(stub, projects_module, "stale-target")
+        # Read current version, then bump it via an unrelated write.
+        current = projects_module._get_version_from(projects_module._read_approved())
+
+        def _bump(d):
+            d["projects"].append({
+                "name": "noise", "description": "y" * 12, "status": "pending",
+            })
+            projects_module._bump_version(d)
+            return d
+        projects_module._write_approved(_bump)
+
+        # Now attempt approve with the *stale* version → 409.
+        self._post(stub, {"slug": "stale-target"}, version=current)
+        stub.handle_approve()
+        assert stub.captured_code == 409
+        assert stub.captured.get("error") == "stale"
+
+    def test_invalid_slug_returns_400(self, stub):
+        self._post(stub, {"slug": "AA"})
+        stub.handle_approve()
+        assert stub.captured_code == 400
+
+
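`_get_version_from` and `_bump_version` are referenced above but defined in `handlers/projects.py`, outside this diff. A minimal sketch of the version-based optimistic-concurrency scheme these tests assume, where `if_match_allows` is a hypothetical name for the precondition check that produces the 409 "stale" response:

```python
def get_version_from(data: dict) -> int:
    # Documents written before versioning count as version 0.
    return int(data.get("version") or 0)


def bump_version(data: dict) -> dict:
    """Every successful mutation increments the document version in place."""
    data["version"] = get_version_from(data) + 1
    return data


def if_match_allows(if_match_header, current_version: int) -> bool:
    """Optimistic concurrency: an absent If-Match header means no precondition.

    A present header must equal the stored version; otherwise the handler
    should answer 409 with {"error": "stale"} so the client refetches and
    retries against fresh state.
    """
    if if_match_header is None:
        return True
    return str(if_match_header).strip() == str(current_version)
```

This is the standard lost-update guard: the UI sends back the version it last saw, and any write that raced in between (bumping the version) makes the stale request fail loudly instead of silently clobbering it.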
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
# T#7 — signature stability + mtime cache
|
||||||
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
|
||||||
|
class TestSignature:
|
||||||
|
def test_signature_changes_when_project_added(self, stub, projects_module):
|
||||||
|
sig1 = projects_module._compute_signature()
|
||||||
|
_make_workspace(stub, projects_module, "newone")
|
||||||
|
sig2 = projects_module._compute_signature()
|
||||||
|
assert sig1 != sig2
|
||||||
|
|
||||||
|
def test_signature_mtime_cache_skips_git(self, stub, projects_module, monkeypatch):
|
||||||
|
"""Same `.git/index` mtime → cached porcelain output (subprocess not called twice)."""
|
||||||
|
# Pin the mtime so cache hits.
|
||||||
|
monkeypatch.setattr(projects_module, "_git_index_mtime", lambda: 1234.0)
|
||||||
|
calls = []
|
||||||
|
|
||||||
|
real_run = projects_module.subprocess.run
|
||||||
|
|
||||||
|
def _fake_run(*args, **kwargs):
|
||||||
|
calls.append(args)
|
||||||
|
class _R:
|
||||||
|
returncode = 0
|
||||||
|
stdout = ""
|
||||||
|
return _R()
|
||||||
|
monkeypatch.setattr(projects_module.subprocess, "run", _fake_run)
|
||||||
|
|
||||||
|
sig1 = projects_module._compute_signature()
|
||||||
|
sig2 = projects_module._compute_signature()
|
||||||
|
assert sig1 == sig2
|
||||||
|
# Only the FIRST call should have invoked git status.
|
||||||
|
assert len(calls) == 1
|
||||||
|
|
||||||
|
|
||||||
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
# T#8 – T#12 / T#19 — planning endpoints
|
||||||
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
|
||||||
|
class _FakeSession:
|
||||||
|
def __init__(self, planning_session_id="ps-1", phase="/office-hours"):
|
||||||
|
self.planning_session_id = planning_session_id
|
||||||
|
self.phase = phase
|
||||||
|
|
||||||
|
|
||||||
|
class TestPlanningEndpoints:
|
||||||
|
def test_plan_start_sets_planning_status(self, stub, projects_module, monkeypatch):
|
||||||
|
# Mock the orchestrator so we don't hit Claude CLI.
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
monkeypatch.setattr(
|
||||||
|
po.PlanningOrchestrator, "start",
|
||||||
|
lambda **kw: (_FakeSession(planning_session_id="ps-1"), "first message"),
|
||||||
|
)
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({"description": "a fresh planning project"})
|
||||||
|
stub.handle_plan_start("plan-target")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
# On-disk status is "planning"
|
||||||
|
data = projects_module._read_approved()
|
||||||
|
proj = next(p for p in data["projects"] if p["name"] == "plan-target")
|
||||||
|
assert proj["status"] == "planning"
|
||||||
|
assert proj["planning_session_id"] == "ps-1"
|
||||||
|
|
||||||
|
def test_plan_respond_returns_message(self, stub, monkeypatch):
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
monkeypatch.setattr(
|
||||||
|
po.PlanningOrchestrator, "respond",
|
||||||
|
lambda **kw: (_FakeSession(phase="/office-hours"), "second message", False),
|
||||||
|
)
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({"message": "hello"})
|
||||||
|
stub.handle_plan_respond("plan-target")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
assert stub.captured["message"] == "second message"
|
||||||
|
|
||||||
|
def test_plan_respond_no_active_session_returns_404(self, stub, monkeypatch):
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
monkeypatch.setattr(
|
||||||
|
po.PlanningOrchestrator, "respond",
|
||||||
|
lambda **kw: (None, "no session", False),
|
||||||
|
)
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({"message": "hello"})
|
||||||
|
stub.handle_plan_respond("missing")
|
||||||
|
assert stub.captured_code == 404
|
||||||
|
|
||||||
|
def test_plan_state_returns_active(self, stub, monkeypatch):
|
||||||
|
from src import planning_session as ps
|
||||||
|
monkeypatch.setattr(
|
||||||
|
ps, "get_planning_state",
|
||||||
|
lambda adapter, channel: {
|
||||||
|
"phase": "/office-hours",
|
||||||
|
"phases_planned": ["/office-hours"],
|
||||||
|
"phases_completed": [],
|
||||||
|
"session_id": "s",
|
||||||
|
"planning_session_id": "ps-1",
|
||||||
|
} if channel == "active-slug" else None,
|
||||||
|
)
|
||||||
|
stub.handle_plan_state("active-slug")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
assert stub.captured["status"] == "active"
|
||||||
|
assert stub.captured["phase"] == "/office-hours"
|
||||||
|
|
||||||
|
def test_plan_finalize_sets_approved(self, stub, projects_module, monkeypatch):
|
||||||
|
# Seed an approved-tasks pending entry first.
|
||||||
|
_seed_approved(stub, projects_module, [{
|
||||||
|
"name": "fin-target", "description": "x" * 12, "status": "planning",
|
||||||
|
"proposed_at": None, "approved_at": None, "started_at": None,
|
||||||
|
"pid": None, "planning_session_id": "ps-1", "final_plan_path": None,
|
||||||
|
}])
|
||||||
|
|
||||||
|
from src import planning_session as ps
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
monkeypatch.setattr(
|
||||||
|
ps, "get_planning_state",
|
||||||
|
lambda adapter, channel: {"final_plan_path": "/tmp/fin.md"},
|
||||||
|
)
|
||||||
|
monkeypatch.setattr(ps, "clear_planning_state", lambda *a, **kw: True)
|
||||||
|
monkeypatch.setattr(
|
||||||
|
po.PlanningOrchestrator, "final_plan_path",
|
||||||
|
lambda slug: Path("/tmp/fin.md"),
|
||||||
|
)
|
||||||
|
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({})
|
||||||
|
stub.handle_plan_finalize("fin-target")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
assert stub.captured["status"] == "approved"
|
||||||
|
|
||||||
|
data = projects_module._read_approved()
|
||||||
|
proj = next(p for p in data["projects"] if p["name"] == "fin-target")
|
||||||
|
assert proj["status"] == "approved"
|
||||||
|
assert proj["final_plan_path"]
|
||||||
|
|
||||||
|
def test_plan_cancel_sets_pending(self, stub, projects_module, monkeypatch):
|
||||||
|
_seed_approved(stub, projects_module, [{
|
||||||
|
"name": "cancel-target", "description": "x" * 12, "status": "planning",
|
||||||
|
"proposed_at": None, "approved_at": None, "started_at": None,
|
||||||
|
"pid": None, "planning_session_id": "ps-1", "final_plan_path": None,
|
||||||
|
}])
|
||||||
|
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
monkeypatch.setattr(po.PlanningOrchestrator, "cancel", lambda *a, **kw: True)
|
||||||
|
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({})
|
||||||
|
stub.handle_plan_cancel_planning("cancel-target")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
assert stub.captured["status"] == "pending"
|
||||||
|
|
||||||
|
data = projects_module._read_approved()
|
||||||
|
proj = next(p for p in data["projects"] if p["name"] == "cancel-target")
|
||||||
|
assert proj["status"] == "pending"
|
||||||
|
|
||||||
|
def test_plan_respond_slow_propagates_session_id(self, stub, monkeypatch):
|
||||||
|
"""T#19 surrogate — slow planning still returns the message + phase."""
|
||||||
|
from src import planning_orchestrator as po
|
||||||
|
|
||||||
|
def _slow_respond(**kw):
|
||||||
|
time.sleep(0.05)
|
||||||
|
return (_FakeSession(phase="/office-hours"), "delayed", False)
|
||||||
|
|
||||||
|
monkeypatch.setattr(po.PlanningOrchestrator, "respond", _slow_respond)
|
||||||
|
stub.command = "POST"
|
||||||
|
stub.set_body({"message": "yo"})
|
||||||
|
stub.handle_plan_respond("slow-target")
|
||||||
|
assert stub.captured_code == 200
|
||||||
|
assert stub.captured["message"] == "delayed"
|
||||||
|
|
||||||
|
|
||||||
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
# T#14 ext — SSE auth
|
||||||
|
# ─────────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
|
||||||
|
class TestSseAuth:
|
||||||
|
def test_sse_stream_requires_cookie(self, stub):
|
||||||
|
# No Cookie header → handler short-circuits with 401.
|
||||||
|
stub.path = "/api/projects/stream"
|
||||||
|
stub.command = "GET"
|
||||||
|
stub.handle_projects_stream()
|
||||||
|
assert stub.response_code == 401


# ─────────────────────────────────────────────────────────────────────
# T#26 — markdown pipeline (server returns raw, not pre-rendered)
# ─────────────────────────────────────────────────────────────────────


class TestMarkdownPipeline:
    def test_planning_response_marked_dompurify_dom_pipeline(
        self, stub, projects_module, monkeypatch, tmp_path,
    ):
        """The transcript endpoint must return raw markdown (no HTML rendering).

        Client-side DOMPurify is the only sanitiser; the server should be a
        dumb passthrough.
        """
        import constants  # type: ignore

        slug = "xss-test"
        # Drop a final-plan.md with a script tag — it must come back verbatim.
        plan_dir = constants.WORKSPACE_DIR / slug / "scripts" / "ralph"
        plan_dir.mkdir(parents=True)
        raw_md = "# Plan\n<script>alert(1)</script>\n\n- item one"
        (plan_dir / "final-plan.md").write_text(raw_md, encoding="utf-8")

        from src import planning_session as ps
        monkeypatch.setattr(
            ps, "get_planning_state",
            lambda adapter, channel: {
                "phase": "__complete__",
                "phases_completed": ["/office-hours"],
                "last_text_excerpt": "<script>alert('x')</script>",
            },
        )

        stub.handle_plan_transcript(slug)
        assert stub.captured_code == 200
        assert stub.captured["final_plan"] == raw_md
        # The excerpt is also passed through raw — DOMPurify lives in the page.
        assert "<script>" in stub.captured["last_text_excerpt"]


# ─────────────────────────────────────────────────────────────────────
# T#13 — legacy /ralph.html → /echo/workspace.html
# ─────────────────────────────────────────────────────────────────────


class TestLegacyRedirect:
    def test_ralph_html_redirects_to_workspace(self):
        """Smoke check that api.py contains the redirect handler.

        We check the source rather than spinning up a real HTTP server;
        this matches the api routing test pattern in
        tests/test_dashboard_ralph_endpoint.py.
        """
        src = (PROJECT_ROOT / "dashboard" / "api.py").read_text()
        assert "/ralph.html" in src
        assert "/echo/workspace.html" in src
        # The redirect block must use status 302 — keep the user's bookmarks alive.
        # (Sanity check; not an exhaustive parse.)
        assert "send_response(302)" in src


# ─────────────────────────────────────────────────────────────────────
# T#16 / T#25 / T#28 — concurrency
# ─────────────────────────────────────────────────────────────────────


def _writer_thread(path: str, key: str, iterations: int = 10):
    """Thread target — bumps a counter under flock, racing with its siblings."""
    from src.jsonlock import write_locked

    for _ in range(iterations):
        def _mut(data):
            data[key] = data.get(key, 0) + 1
            return data
        write_locked(path, _mut)


class TestConcurrency:
    def test_flock_serializes_concurrent_writes(self, tmp_path):
        """T#16 — two threads bumping the same file via write_locked don't corrupt it."""
        target = tmp_path / "race.json"
        target.write_text("{}", encoding="utf-8")

        t1 = threading.Thread(target=_writer_thread, args=(str(target), "a", 25))
        t2 = threading.Thread(target=_writer_thread, args=(str(target), "b", 25))
        t1.start(); t2.start()
        t1.join(); t2.join()

        data = json.loads(target.read_text())
        # No lost updates — each writer should land all 25 bumps.
        assert data["a"] == 25
        assert data["b"] == 25

    def test_flock_timeout_returns_lock_timeout_error(self, tmp_path, monkeypatch):
        """T#25 — when a peer holds LOCK_EX past the deadline, write_locked raises.

        The dashboard layer is expected to translate this to a 503; this test
        pins the LockTimeoutError surface so that contract stays testable.
        """
        from src import jsonlock
        from src.jsonlock import LockTimeoutError, write_locked

        monkeypatch.setattr(jsonlock, "_TIMEOUT_SEC", 0.1)
        monkeypatch.setattr(jsonlock, "_POLL_INTERVAL", 0.01)

        target = tmp_path / "hostage.json"
        target.write_text("{}", encoding="utf-8")
        # write_locked contends on the sidecar lockfile, not the data file.
        lock_path = str(target) + ".lock"

        acquired = threading.Event()
        release = threading.Event()

        def _hold():
            import fcntl
            fd = os.open(lock_path, os.O_RDWR | os.O_CREAT, 0o644)
            try:
                fcntl.flock(fd, fcntl.LOCK_EX)
                acquired.set()
                release.wait(timeout=5)
                try:
                    fcntl.flock(fd, fcntl.LOCK_UN)
                except OSError:
                    pass
            finally:
                os.close(fd)

        t = threading.Thread(target=_hold, daemon=True)
        t.start()
        try:
            assert acquired.wait(timeout=2)
            with pytest.raises(LockTimeoutError):
                write_locked(str(target), lambda d: {"x": 1})
        finally:
            release.set()
            t.join(timeout=2)


# ── multiprocessing helper (must be top-level for pickling) ──────────


def _proc_writer(target_path: str, key: str, iterations: int):
    """Runs in a child process — uses the project's src.jsonlock helper."""
    sys.path.insert(0, str(PROJECT_ROOT))
    from src.jsonlock import write_locked

    for _ in range(iterations):
        def _mut(data, k=key):
            data[k] = data.get(k, 0) + 1
            return data
        write_locked(target_path, _mut)


def test_cron_and_dashboard_concurrent_writes_serialize(tmp_path):
    """T#28 — two processes share flock semantics across PIDs.

    A flock lock belongs to the open file description, but all descriptions
    of the same inode contend for the same lock, so two processes opening
    the same path get serialised. This test makes sure the helper preserves
    that property end-to-end.
    """
    target = tmp_path / "two-procs.json"
    target.write_text("{}", encoding="utf-8")

    ctx = multiprocessing.get_context("spawn")
    p1 = ctx.Process(target=_proc_writer, args=(str(target), "p1", 15))
    p2 = ctx.Process(target=_proc_writer, args=(str(target), "p2", 15))
    p1.start(); p2.start()
    p1.join(timeout=30); p2.join(timeout=30)
    assert p1.exitcode == 0, "p1 didn't exit cleanly"
    assert p2.exitcode == 0, "p2 didn't exit cleanly"

    data = json.loads(target.read_text())
    assert data.get("p1") == 15, f"p1 lost updates: {data}"
    assert data.get("p2") == 15, f"p2 lost updates: {data}"
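The flock property that T#28 relies on can be demonstrated in a few self-contained lines: two independent opens of the same path contend for one lock, because the lock travels with the file, not the descriptor. A standalone illustration (not part of the suite; it uses only `fcntl`, `os`, and `tempfile`):

```python
import fcntl
import os
import tempfile


def flock_contends() -> bool:
    # Two separate open() calls create two open file descriptions on one
    # inode, so they contend even within a single process.
    fd0, path = tempfile.mkstemp()
    os.close(fd0)
    fd_a = os.open(path, os.O_RDWR)
    fd_b = os.open(path, os.O_RDWR)
    fcntl.flock(fd_a, fcntl.LOCK_EX)
    blocked = False
    try:
        # Non-blocking attempt on the second descriptor fails while the
        # first holds LOCK_EX.
        fcntl.flock(fd_b, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        blocked = True
    fcntl.flock(fd_a, fcntl.LOCK_UN)
    fcntl.flock(fd_b, fcntl.LOCK_EX | fcntl.LOCK_NB)  # succeeds after release
    os.close(fd_a)
    os.close(fd_b)
    os.unlink(path)
    return blocked
```

Across PIDs the behaviour is identical, which is why the cron writer and the dashboard writer serialise against each other.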


# ─────────────────────────────────────────────────────────────────────
# T#24 — router planning unaffected by jsonlock
# ─────────────────────────────────────────────────────────────────────


class TestRouterPlanningRegression:
    """Smoke checks that the router still handles /p, /a, /k, /l after
    jsonlock wrapping. We don't exercise the full Claude CLI; we just verify
    that the command dispatch remains intact.
    """

    def test_slash_commands_routed(self, tmp_path, monkeypatch):
        from src import router, planning_session
        # Re-route APPROVED_TASKS_FILE so we don't poison the real file.
        approved = tmp_path / "approved-tasks.json"
        approved.write_text(json.dumps({"projects": [], "version": 0}))
        monkeypatch.setattr(router, "APPROVED_TASKS_FILE", approved)

        # /a (list pending) — should be routed as a command, never hit Claude.
        with patch("src.router.send_message") as mock_send:
            response, is_cmd = router.route_message(
                "ch-1", "user-1", "/a", adapter_name="discord",
            )
        mock_send.assert_not_called()
        assert is_cmd is True


# ─────────────────────────────────────────────────────────────────────
# Misc — version helpers
# ─────────────────────────────────────────────────────────────────────


class TestVersionHelpers:
    def test_get_version_handles_missing(self, projects_module):
        assert projects_module._get_version_from({}) == 0
        assert projects_module._get_version_from({"version": 7}) == 7
        assert projects_module._get_version_from({"version": "bad"}) == 0

    def test_bump_version_increments(self, projects_module):
        d = {"version": 4}
        out = projects_module._bump_version(d)
        assert out["version"] == 5
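These helpers back the version-based optimistic concurrency (`If-Match`) flow named in the commit message. A minimal sketch of the check-then-bump shape, where `apply_if_match` and `_version_of` are hypothetical names (not the dashboard's actual handler API), mirroring the tested semantics that missing or non-int versions count as 0:

```python
def _version_of(data: dict) -> int:
    # Mirrors _get_version_from: missing or non-int versions count as 0.
    v = data.get("version", 0)
    return v if isinstance(v, int) else 0


def apply_if_match(data: dict, if_match: int, mutate):
    """Return (status, body): 412 on a stale If-Match, else mutate and bump."""
    current = _version_of(data)
    if if_match != current:
        return 412, {"error": "version mismatch", "current": current}
    data = mutate(data)
    data["version"] = current + 1  # bump only after a successful mutation
    return 200, data
```

A client that loses the race gets 412 with the current version, re-fetches, and retries; writers never silently clobber each other.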
@@ -139,9 +139,9 @@ class TestStatus:
 
 class TestLog:
     def test_log_returns_progress_lines(self, handler, tmp_path):
-        _make_ralph_project(tmp_path, "p1", [], progress="line1\nline2\nline3")
-        handler.path = "/api/ralph/p1/log"
-        handler.handle_ralph_log("p1")
+        _make_ralph_project(tmp_path, "p1a", [], progress="line1\nline2\nline3")
+        handler.path = "/api/ralph/p1a/log"
+        handler.handle_ralph_log("p1a")
         assert handler.captured_code == 200
         assert handler.captured["lines"] == ["line1", "line2", "line3"]
         assert handler.captured["total"] == 3
@@ -170,10 +170,10 @@ class TestLog:
 
 class TestPrd:
     def test_prd_returns_full_json(self, handler, tmp_path):
         stories = [{"id": "US-001", "passes": False, "title": "t", "priority": 10}]
-        _make_ralph_project(tmp_path, "p1", stories)
-        handler.handle_ralph_prd("p1")
+        _make_ralph_project(tmp_path, "p1a", stories)
+        handler.handle_ralph_prd("p1a")
         assert handler.captured_code == 200
-        assert handler.captured["projectName"] == "p1"
+        assert handler.captured["projectName"] == "p1a"
         assert len(handler.captured["userStories"]) == 1
 
     def test_prd_404_when_missing(self, handler, tmp_path):
165 tests/test_jsonlock.py (new file)
@@ -0,0 +1,165 @@
"""Tests for src.jsonlock — flock-based JSON locking primitives.

Covers:
- read_locked: missing file, valid JSON
- write_locked: create-on-missing, mutator chain, atomic temp file
- re-entrant write_locked (same thread, no deadlock)
- LockTimeoutError on a contended lock (cross-thread)
"""
from __future__ import annotations

import fcntl
import json
import os
import threading
import time
from pathlib import Path

import pytest

from src import jsonlock
from src.jsonlock import (
    LockTimeoutError,
    read_locked,
    write_locked,
)
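For readers without the source at hand, the surface these tests pin can be sketched roughly as follows. This is a hedged reconstruction from the test expectations alone (sidecar `.lock`, non-blocking `LOCK_EX` polling against `_TIMEOUT_SEC`/`_POLL_INTERVAL`, mutate, atomic `.tmp` rename); it is not the actual `src/jsonlock.py`, which also handles same-thread re-entrancy:

```python
import fcntl
import json
import os
import time

_TIMEOUT_SEC = 5.0
_POLL_INTERVAL = 0.05


class LockTimeoutError(TimeoutError):
    """The sidecar lock could not be acquired before the deadline."""


def write_locked(path: str, mutate):
    # Lock a sidecar file so the data file's inode can be swapped atomically.
    lock_fd = os.open(path + ".lock", os.O_RDWR | os.O_CREAT, 0o644)
    try:
        deadline = time.monotonic() + _TIMEOUT_SEC
        # Poll non-blockingly so a hung peer becomes LockTimeoutError.
        while True:
            try:
                fcntl.flock(lock_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
                break
            except BlockingIOError:
                if time.monotonic() >= deadline:
                    raise LockTimeoutError(path)
                time.sleep(_POLL_INTERVAL)
        try:
            with open(path, encoding="utf-8") as fh:
                data = json.load(fh)
        except FileNotFoundError:
            data = {}  # create-on-missing, as the tests expect
        data = mutate(data)
        tmp = path + ".tmp"
        with open(tmp, "w", encoding="utf-8") as fh:
            json.dump(data, fh)
        os.replace(tmp, path)  # atomic swap, so no .tmp is left behind
        return data
    finally:
        os.close(lock_fd)  # closing the fd releases the flock
```

Locking a stable sidecar rather than the data file matters because `os.replace` gives the data file a new inode on every write; a lock taken on the old inode would no longer protect anything.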
|
||||||
|
|
||||||
|
|
||||||
|
# ── read_locked ──────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
|
||||||
|
def test_read_locked_missing_file(tmp_path):
|
||||||
|
"""read_locked surfaces FileNotFoundError on missing path.
|
||||||
|
|
||||||
|
(Callers are expected to catch and default to {}; the helper itself
|
||||||
|
does not swallow this error.)
|
||||||
|
"""
|
||||||
|
target = tmp_path / "nope.json"
|
||||||
|
with pytest.raises(FileNotFoundError):
|
||||||
|
read_locked(str(target))
|
||||||
|
|
||||||
|
|
||||||
|
def test_read_locked_valid_json(tmp_path):
|
||||||
|
target = tmp_path / "ok.json"
|
||||||
|
target.write_text(json.dumps({"a": 1, "b": [2, 3]}), encoding="utf-8")
|
||||||
|
assert read_locked(str(target)) == {"a": 1, "b": [2, 3]}


# ── write_locked ─────────────────────────────────────────────────────


def test_write_locked_creates_file(tmp_path):
    target = tmp_path / "new.json"
    seen: list = []

    def _mut(data):
        seen.append(dict(data))
        data["created"] = True
        return data

    out = write_locked(str(target), _mut)
    assert out == {"created": True}
    assert seen == [{}]
    assert target.exists()
    assert json.loads(target.read_text()) == {"created": True}


def test_write_locked_mutator_applied(tmp_path):
    target = tmp_path / "exists.json"
    target.write_text(json.dumps({"counter": 1}), encoding="utf-8")

    seen: list = []

    def _mut(data):
        seen.append(dict(data))
        data["counter"] = data.get("counter", 0) + 1
        return data

    out = write_locked(str(target), _mut)
    assert out == {"counter": 2}
    assert seen == [{"counter": 1}]
    assert json.loads(target.read_text()) == {"counter": 2}


def test_write_locked_atomic(tmp_path):
    """After write_locked returns, the .tmp sibling must be gone (clean rename)."""
    target = tmp_path / "atomic.json"
    write_locked(str(target), lambda d: {"x": 42})
    assert target.exists()
    assert not (tmp_path / "atomic.json.tmp").exists()
    assert json.loads(target.read_text()) == {"x": 42}


# ── re-entry guard ───────────────────────────────────────────────────


def test_reentrant_write_locked(tmp_path):
    """Same-thread write_locked nested inside the mutator must not deadlock."""
    target = tmp_path / "reentry.json"

    def _outer(data):
        data["outer"] = True
        # Re-entrant call on the same path — it must skip flock acquisition
        # (it would otherwise deadlock on the already-held LOCK_EX).
        write_locked(str(target), _inner)
        # Re-read after inner; the inner write replaced the file.
        return data

    def _inner(data):
        # The inner mutator observes whatever is currently on disk (data is
        # read at the start of write_locked); we just stash an inner marker.
        data["inner"] = True
        return data

    out = write_locked(str(target), _outer)
    # The outer mutator's `data` is independent of the inner write and is
    # what gets persisted last (replacing the inner-only file).
    persisted = json.loads(target.read_text())
    assert persisted.get("outer") is True
    # `out` is the dict the outer mutator returned.
    assert out is not None
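One plausible shape for the re-entry behaviour the test above pins is a thread-local set of currently-held paths: the nested call sees its own path in the set and skips flock. This is a hypothetical sketch with invented names (`reentry_guard`, `_held`), not the actual guard in `src/jsonlock.py`:

```python
import threading
from contextlib import contextmanager

_held = threading.local()


@contextmanager
def reentry_guard(path: str):
    """Yield True when the caller should actually take flock, or False when
    this thread already holds the lock for `path` (re-entrant call)."""
    paths = getattr(_held, "paths", None)
    if paths is None:
        paths = _held.paths = set()
    if path in paths:
        yield False  # nested call: skip locking, avoid self-deadlock
        return
    paths.add(path)
    try:
        yield True
    finally:
        paths.discard(path)
```

Because the set is thread-local, a second thread still blocks on the real flock; only same-thread nesting is short-circuited.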


# ── timeout ──────────────────────────────────────────────────────────


def test_lock_timeout_raises(tmp_path, monkeypatch):
    """If another thread holds LOCK_EX, write_locked must give up with LockTimeoutError.

    We patch _TIMEOUT_SEC down to keep the test fast (the real value of 5s
    × 2 retries would make the test take 10s).
    """
    monkeypatch.setattr(jsonlock, "_TIMEOUT_SEC", 0.1)
    monkeypatch.setattr(jsonlock, "_POLL_INTERVAL", 0.01)

    target = tmp_path / "contended.json"
    target.write_text("{}", encoding="utf-8")
    lock_path = str(target) + ".lock"

    holder_acquired = threading.Event()
    holder_release = threading.Event()

    def _holder():
        fd = os.open(lock_path, os.O_RDWR | os.O_CREAT, 0o644)
        try:
            fcntl.flock(fd, fcntl.LOCK_EX)
            holder_acquired.set()
            # Hold until the test signals release.
            holder_release.wait(timeout=5.0)
            try:
                fcntl.flock(fd, fcntl.LOCK_UN)
            except OSError:
                pass
        finally:
            os.close(fd)

    t = threading.Thread(target=_holder, daemon=True)
    t.start()
    assert holder_acquired.wait(timeout=2.0), "holder thread did not acquire lock"

    try:
        with pytest.raises(LockTimeoutError):
            write_locked(str(target), lambda d: {"should": "fail"})
    finally:
        holder_release.set()
        t.join(timeout=2.0)