feat(gsd): single-writer engine v3 — state machine guards, actor identity, reversibility

Three work streams bundled into one phase to close the behavioral control
gaps identified in the v2 handler audit:

Stream 1 — State machine guards on all 8 tool handlers:
- Entity existence checks before mutations (milestone, slice, task)
- Valid status transition enforcement (can't double-complete, can't re-plan
  closed work, can't complete inside a closed parent)
- depends_on validation for plan-milestone (deps must exist + be complete)
- blockerTaskId verification in replan-slice (must exist + be complete)
- Deep task check in complete-milestone (all tasks, not just slice status)

Stream 2 — Actor identity + persistent audit log:
- WorkflowEvent extended with actor_name, trigger_reason, session_id
- Engine-generated UUID session_id stable per process lifetime
- All 8 handlers accept optional actorName/triggerReason and pass through
- workflow-logger now flushes to .gsd/audit-log.jsonl (survives context resets)
- New setLogBasePath() and readAuditLog() API

Stream 3 — Reversibility + unit ownership:
- New gsd_task_reopen handler (reset task to pending with full guards)
- New gsd_slice_reopen handler (reset slice + all tasks with transaction)
- Opt-in unit ownership via .gsd/unit-claims.json (claim/release/check)
- Ownership enforced in complete-task and complete-slice when claims exist
- insertReplanHistory converted to upsert via schema v11 unique index

Bug fixes (pre-existing):
- renderPlanContent checkbox: checked "done" but tasks are "complete"
- renderRoadmapContent: same "done" vs "complete" mismatch
- renderPlanContent format: **T01:** title didn't match parsePlan regex
- Tests updated to seed DB entities and match projection output format
Commit a1592c984b (parent 5130b04d5a)
Authored by Jeremy McSpadden, 2026-03-25 00:30:24 -05:00; committed by Lex Christopherson
19 changed files with 1573 additions and 464 deletions


@@ -0,0 +1,396 @@
# Single-Writer Engine v3: Agent Control Plane
# Plan: State machine guards + actor causation + reversibility
# Created: 2026-03-25
---
## Background
v2 gave the engine **write discipline** — agents can't corrupt STATE.md directly,
every mutation goes through the DB, event log is append-only.
What v2 did NOT give us: **behavioral control**. Agents can still:
- Complete a task twice (silent overwrite)
- Complete a slice with open tasks (if they bypass the slice status check)
- Complete a milestone in any status
- Re-plan already-completed slices/tasks
- Call any tool on any unit regardless of ownership
- Leave no trace of *who* did what or *why*
This plan bundles three work streams that close those gaps together, since they
share infrastructure (WorkflowEvent schema, DB query surface, handler preconditions).
---
## Work Streams
### Stream 1 — State Machine Guards (P0)
Add precondition checks to all 8 tool handlers so invalid transitions return an
error instead of silently succeeding.
### Stream 2 — Actor Identity + Persistent Audit Log (P1)
Extend `WorkflowEvent` with `actor_name` and `trigger_reason`. Flush the
in-process `workflow-logger` buffer to a persistent `.gsd/audit-log.jsonl`
after every tool invocation, so "who did what and why" is durable.
### Stream 3 — Reversibility + Unit Ownership (P2)
Add `gsd_task_reopen` and `gsd_slice_reopen` tools. Add a unit-ownership
validation layer so an agent can only complete/reopen units it explicitly claimed.
---
## Detailed Task Breakdown
---
### Stream 1: State Machine Guards
#### S1-T1: Add `getTask`, `getSlice`, `getMilestone` existence helpers to `gsd-db.ts`
**Files:** `src/resources/extensions/gsd/gsd-db.ts`
These are read-only DB helpers to confirm an entity exists and return its current
`status` field before any mutation. Each returns `null` if not found.
```ts
getTask(taskId: string, sliceId: string): { status: string } | null
getSlice(sliceId: string, milestoneId: string): { status: string } | null
getMilestoneById(milestoneId: string): { status: string } | null
```
Note: `getSlice` may already exist — check before adding a duplicate. The audit
report references it in `complete-slice.ts` line 207 but only to list tasks.
Need a version that returns the slice row itself.
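A minimal sketch of the three helpers, with in-memory Maps standing in for the real SQLite adapter (the actual `gsd-db.ts` query surface may differ):

```typescript
// In-memory stand-ins for the tasks/slices/milestones tables; the real helpers
// would run SELECT status FROM ... queries through the DB adapter.
type StatusRow = { status: string };

const tasks = new Map<string, StatusRow>();      // key: `${sliceId}/${taskId}`
const slices = new Map<string, StatusRow>();     // key: `${milestoneId}/${sliceId}`
const milestones = new Map<string, StatusRow>(); // key: milestoneId

function getTask(taskId: string, sliceId: string): StatusRow | null {
  return tasks.get(`${sliceId}/${taskId}`) ?? null;
}
function getSlice(sliceId: string, milestoneId: string): StatusRow | null {
  return slices.get(`${milestoneId}/${sliceId}`) ?? null;
}
function getMilestoneById(milestoneId: string): StatusRow | null {
  return milestones.get(milestoneId) ?? null;
}

// Missing entities read as null rather than throwing, so handlers can turn
// "not found" into a structured { error } result.
milestones.set("M01", { status: "active" });
console.log(getMilestoneById("M01")?.status); // "active"
console.log(getTask("T01", "S01"));           // null
```

Returning `null` instead of throwing keeps the handlers' error-return convention intact end to end.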
---
#### S1-T2: Guard `complete-task.ts` — enforce valid transitions
**File:** `src/resources/extensions/gsd/tools/complete-task.ts`
Preconditions to add (before the transaction block):
1. `getMilestoneById(milestoneId)` → must exist, must NOT be `"complete"` or `"done"`
2. `getSlice(sliceId, milestoneId)` → must exist, must be `"pending"` or `"in_progress"`
3. `getTask(taskId, sliceId)` → if exists, status must be `"pending"` (not already `"complete"`)
On failure: return `{ error: "<reason>" }` — do NOT throw.
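The precondition chain can be sketched as a pure guard that short-circuits on the first violation. The helper names follow S1-T1 but are stubbed with Maps here, so treat this as an illustration of the pattern, not the real handler:

```typescript
// S1-T2 guard sketch: check milestone, then slice, then task, returning an
// { error } object (never throwing) per the plan's convention.
type StatusRow = { status: string } | null;

// Hypothetical stand-ins for the S1-T1 DB helpers.
const db = {
  milestones: new Map<string, { status: string }>(),
  slices: new Map<string, { status: string }>(),
  tasks: new Map<string, { status: string }>(),
};
const getMilestoneById = (id: string): StatusRow => db.milestones.get(id) ?? null;
const getSlice = (s: string, m: string): StatusRow => db.slices.get(`${m}/${s}`) ?? null;
const getTask = (t: string, s: string): StatusRow => db.tasks.get(`${s}/${t}`) ?? null;

function guardCompleteTask(taskId: string, sliceId: string, milestoneId: string): { error: string } | null {
  const milestone = getMilestoneById(milestoneId);
  if (!milestone) return { error: `Milestone ${milestoneId} does not exist` };
  if (milestone.status === "complete" || milestone.status === "done")
    return { error: `Milestone ${milestoneId} is already complete` };

  const slice = getSlice(sliceId, milestoneId);
  if (!slice) return { error: `Slice ${sliceId} does not exist` };
  if (slice.status !== "pending" && slice.status !== "in_progress")
    return { error: `Slice ${sliceId} is ${slice.status}; tasks can no longer be completed` };

  const task = getTask(taskId, sliceId);
  if (task && task.status !== "pending")
    return { error: `Task ${taskId} is already ${task.status}` };

  return null; // all preconditions hold; proceed to the transaction block
}

db.milestones.set("M01", { status: "active" });
db.slices.set("M01/S01", { status: "in_progress" });
db.tasks.set("S01/T01", { status: "complete" });
console.log(guardCompleteTask("T01", "S01", "M01")?.error); // "Task T01 is already complete"
```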
---
#### S1-T3: Guard `complete-slice.ts` — enforce valid transitions
**File:** `src/resources/extensions/gsd/tools/complete-slice.ts`
Preconditions to add:
1. `getSlice(sliceId, milestoneId)` → must exist, status must be `"pending"` or `"in_progress"` (not already `"complete"`)
2. `getMilestoneById(milestoneId)` → must exist, must NOT be `"complete"`
3. All tasks in slice must be `"complete"` (already enforced — keep it, add explicit slice-status check before this)
---
#### S1-T4: Guard `complete-milestone.ts` — enforce valid transitions
**File:** `src/resources/extensions/gsd/tools/complete-milestone.ts`
Preconditions to add:
1. `getMilestoneById(milestoneId)` → must exist, status must be `"active"` (not already `"complete"`)
2. Keep existing all-slices-complete check
3. Add deep check: all tasks across all slices must also be `"complete"` (not just slice status)
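The deep check is the piece v2 lacked: a slice row can read "complete" while a task under it is still open. A minimal in-memory sketch of the idea (hypothetical shapes, not the actual `gsd-db.ts` API):

```typescript
// Deep task check sketch for S1-T4. The real version would likely be a single
// query, e.g. SELECT ... FROM tasks WHERE milestone_id = :m AND status != 'complete';
// an in-memory array stands in for the table here.
interface TaskRow { id: string; sliceId: string; status: string }

function findIncompleteTasks(rows: TaskRow[]): TaskRow[] {
  return rows.filter((t) => t.status !== "complete");
}

const milestoneTasks: TaskRow[] = [
  { id: "T01", sliceId: "S01", status: "complete" },
  // S02's slice row might already say "complete"; the deep check catches this anyway:
  { id: "T02", sliceId: "S02", status: "pending" },
];

const open = findIncompleteTasks(milestoneTasks);
const guard = open.length > 0
  ? { error: `Cannot complete milestone: ${open.map((t) => `${t.sliceId}/${t.id}`).join(", ")} not complete` }
  : null;
console.log(guard?.error); // names S02/T02 as the offender
```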
---
#### S1-T5: Guard `plan-task.ts` — block re-planning completed tasks
**File:** `src/resources/extensions/gsd/tools/plan-task.ts`
Preconditions to add:
1. `getSlice(sliceId, milestoneId)` → must exist, status must NOT be `"complete"` (already blocks planning on a closed slice)
2. If task exists (`getTask`), status must be `"pending"` — block re-planning a `"complete"` task
---
#### S1-T6: Guard `plan-slice.ts` — block re-planning completed slices
**File:** `src/resources/extensions/gsd/tools/plan-slice.ts`
Preconditions to add:
1. `getSlice(sliceId, milestoneId)` → if exists, status must NOT be `"complete"`
2. `getMilestoneById(milestoneId)` → must exist, status must NOT be `"complete"`
---
#### S1-T7: Guard `plan-milestone.ts` — block re-planning completed milestones
**File:** `src/resources/extensions/gsd/tools/plan-milestone.ts`
Preconditions to add:
1. If milestone exists (`getMilestoneById`), status must NOT be `"complete"`
2. Validate `depends_on` array: each referenced milestoneId must exist and be `"complete"` before this milestone can be planned
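The dependency validation reduces to a loop over `depends_on` that fails fast; `getMilestoneById` is stubbed with a Map here and the error strings are illustrative:

```typescript
// S1-T7 depends_on sketch: every referenced milestone must exist and be complete.
const statuses = new Map<string, string>([
  ["M01", "complete"],
  ["M02", "active"],
]);
const getMilestoneById = (id: string): { status: string } | null =>
  statuses.has(id) ? { status: statuses.get(id)! } : null;

function validateDependsOn(dependsOn: string[]): { error: string } | null {
  for (const depId of dependsOn) {
    const dep = getMilestoneById(depId);
    if (!dep) return { error: `depends_on references unknown milestone ${depId}` };
    if (dep.status !== "complete")
      return { error: `depends_on milestone ${depId} is not complete (status: ${dep.status})` };
  }
  return null; // all dependencies exist and are complete
}

console.log(validateDependsOn(["M01"]));        // null
console.log(validateDependsOn(["M02"])?.error); // M02 still active, so planning is rejected
```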
---
#### S1-T8: Guard `reassess-roadmap.ts` — verify completedSliceId is actually complete
**File:** `src/resources/extensions/gsd/tools/reassess-roadmap.ts`
Gap: `completedSliceId` is accepted without confirming it is actually `"complete"` status.
Also: no check that milestone is still `"active"` (could reassess after milestone is done).
Preconditions to add:
1. `getSlice(completedSliceId, milestoneId)` → status must be `"complete"`
2. `getMilestoneById(milestoneId)` → status must be `"active"`
---
#### S1-T9: Guard `replan-slice.ts` — verify blockerTaskId exists and is complete
**File:** `src/resources/extensions/gsd/tools/replan-slice.ts`
Gaps:
- `blockerTaskId` is accepted without verifying it exists or is `"complete"`
- No check that slice is still `"in_progress"` (could replan after slice is complete)
Preconditions to add:
1. `getSlice(sliceId, milestoneId)` → status must be `"in_progress"` or `"pending"`, NOT `"complete"`
2. `getTask(blockerTaskId, sliceId)` → must exist, status must be `"complete"`
---
### Stream 2: Actor Identity + Persistent Audit Log
#### S2-T1: Extend `WorkflowEvent` with actor identity and causation fields
**File:** `src/resources/extensions/gsd/workflow-events.ts`
Extend the `WorkflowEvent` interface:
```ts
export interface WorkflowEvent {
  cmd: string;
  params: Record<string, unknown>;
  ts: string;
  hash: string;
  actor: "agent" | "system";
  actor_name?: string;     // ADD: e.g. "executor-agent-01", "gsd-orchestrator"
  trigger_reason?: string; // ADD: e.g. "plan-phase complete", "user invoked gsd_complete_task"
  session_id?: string;     // ADD: engine-generated UUID, stable per process (see Decisions)
}
```
Update `appendEvent` to accept and persist these new optional fields.
Hash computation must remain stable (still hashes only `cmd + params`, not the new fields)
so fork detection isn't broken.
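The stability requirement can be sanity-checked with a sketch: if the hash input is built only from `cmd` and `params`, populating the new fields cannot perturb it. This uses sha256 via `node:crypto`; the actual hashing scheme in `workflow-events.ts` may differ:

```typescript
// Demonstrates hash stability: actor fields live on the event record but are
// never part of the hash input, so fork detection is unaffected.
import { createHash } from "node:crypto";

interface WorkflowEvent {
  cmd: string;
  params: Record<string, unknown>;
  ts: string;
  hash: string;
  actor: "agent" | "system";
  actor_name?: string;
  trigger_reason?: string;
  session_id?: string;
}

function eventHash(cmd: string, params: Record<string, unknown>): string {
  return createHash("sha256").update(JSON.stringify({ cmd, params })).digest("hex");
}

const bare = eventHash("complete_task", { taskId: "T01" });
const evt: WorkflowEvent = {
  cmd: "complete_task",
  params: { taskId: "T01" },
  ts: new Date().toISOString(),
  hash: eventHash("complete_task", { taskId: "T01" }),
  actor: "agent",
  actor_name: "executor-01",                         // new field, not hashed
  trigger_reason: "user invoked gsd_complete_task",  // new field, not hashed
};
console.log(evt.hash === bare); // true
```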
---
#### S2-T2: Update all 8 tool handlers to pass actor identity to `appendEvent`
**Files:** All 8 handlers in `src/resources/extensions/gsd/tools/`
Each handler receives its inputs. Add a convention where params can include:
- `actor_name` (optional string) — caller passes their agent identity
- `trigger_reason` (optional string) — caller passes why this action was triggered
If not provided, default to `actor_name: "agent"`, `trigger_reason: undefined`.
Handlers pass these through to `appendEvent`.
The tool schemas (in the MCP tool definitions) should expose `actor_name` and
`trigger_reason` as optional string params so agents can self-identify.
---
#### S2-T3: Persist `workflow-logger` to `.gsd/audit-log.jsonl`
**File:** `src/resources/extensions/gsd/workflow-logger.ts`
Current behavior: `_buffer` is in-process memory, drained per-unit and dropped.
This means errors/warnings disappear across context resets.
Change: After `_push()` writes to the in-process buffer, also append the entry
to `.gsd/audit-log.jsonl` (using `appendFileSync`). This requires the basePath
to be available — either pass it as a module-level setter (`setLogBasePath(path)`)
called at engine init, or accept it as a param on `logWarning`/`logError`.
The audit log format should match `LogEntry` serialized as JSON + newline,
consistent with `event-log.jsonl`.
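A sketch of the `setLogBasePath` variant, with an illustrative `LogEntry` shape (the real one lives in `workflow-logger.ts`):

```typescript
// Persistent flush sketch: mirror each in-process buffer entry to a durable
// JSONL file, one JSON object per line, matching the event-log.jsonl convention.
import { appendFileSync, mkdirSync, mkdtempSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

interface LogEntry { level: "warning" | "error"; message: string; ts: string }

let logBasePath: string | null = null;
export function setLogBasePath(basePath: string): void {
  logBasePath = basePath;
}

// Called after _push(): best-effort until engine init sets a path.
function persistEntry(entry: LogEntry): void {
  if (!logBasePath) return;
  const dir = join(logBasePath, ".gsd");
  mkdirSync(dir, { recursive: true });
  appendFileSync(join(dir, "audit-log.jsonl"), JSON.stringify(entry) + "\n");
}

setLogBasePath(mkdtempSync(join(tmpdir(), "gsd-")));
persistEntry({ level: "warning", message: "task T01 already complete", ts: new Date().toISOString() });
const flushed = readFileSync(join(logBasePath!, ".gsd", "audit-log.jsonl"), "utf-8");
console.log(flushed.includes("task T01 already complete")); // true, and it survives a restart
```

`appendFileSync` keeps ordering simple at the cost of a sync write per entry; if that ever shows up in profiles, a batched async flush is the obvious alternative.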
---
#### S2-T4: Add `readAuditLog` helper to `workflow-logger.ts`
**File:** `src/resources/extensions/gsd/workflow-logger.ts`
Expose a read function so the auto-loop and diagnostics can surface persistent
audit entries without replaying the event log:
```ts
export function readAuditLog(basePath: string): LogEntry[]
```
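A possible shape for the reader, tolerant of a missing file and a trailing newline (the `LogEntry` fields are illustrative):

```typescript
// readAuditLog sketch: parse the JSONL file line by line; a missing file means
// no entries yet, not an error.
import { existsSync, readFileSync, writeFileSync, mkdirSync, mkdtempSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

interface LogEntry { level: string; message: string; ts: string }

export function readAuditLog(basePath: string): LogEntry[] {
  const file = join(basePath, ".gsd", "audit-log.jsonl");
  if (!existsSync(file)) return [];
  return readFileSync(file, "utf-8")
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as LogEntry);
}

// Demo: a missing log reads as [], a populated one round-trips.
const base = mkdtempSync(join(tmpdir(), "gsd-"));
console.log(readAuditLog(base).length); // 0
mkdirSync(join(base, ".gsd"), { recursive: true });
writeFileSync(join(base, ".gsd", "audit-log.jsonl"),
  JSON.stringify({ level: "error", message: "boom", ts: "2026-03-25T00:00:00Z" }) + "\n");
console.log(readAuditLog(base)[0]?.message); // "boom"
```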
---
### Stream 3: Reversibility + Unit Ownership
#### S3-T1: Add `updateTaskStatus` and `updateSliceStatus` DB helpers
**File:** `src/resources/extensions/gsd/gsd-db.ts`
If they don't already exist (check first):
```ts
updateTaskStatus(taskId: string, sliceId: string, status: string): void
updateSliceStatus(sliceId: string, milestoneId: string, status: string): void
```
These are the write primitives needed by reopen tools.
---
#### S3-T2: Implement `gsd_task_reopen` tool handler
**New file:** `src/resources/extensions/gsd/tools/reopen-task.ts`
Logic:
1. Validate `taskId`, `sliceId`, `milestoneId` are non-empty strings
2. `getTask(taskId, sliceId)` → must exist, status must be `"complete"` (can't reopen what isn't closed)
3. `getSlice(sliceId, milestoneId)` → must exist, status must NOT be `"complete"` (can't reopen a task inside a closed slice — too late)
4. `getMilestoneById(milestoneId)` → must exist, status must NOT be `"complete"`
5. In a transaction: `updateTaskStatus(taskId, sliceId, "pending")`
6. Append event: `cmd: "reopen_task"`, include `actor_name`, `trigger_reason`
7. Invalidate state cache + render projections
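The steps above can be sketched end to end; the DB, transaction, and event plumbing are simplified to plain in-memory calls, so this shows the control flow rather than the real handler:

```typescript
// reopen-task sketch: guard chain, status reset, and event append in order.
type Result = { error: string } | { taskId: string; status: "pending" };

const tasks = new Map<string, { status: string }>([["S01/T01", { status: "complete" }]]);
const slices = new Map<string, { status: string }>([["M01/S01", { status: "in_progress" }]]);
const milestones = new Map<string, { status: string }>([["M01", { status: "active" }]]);
const events: Array<{ cmd: string; actor_name?: string }> = [];

function reopenTask(taskId: string, sliceId: string, milestoneId: string, actorName?: string): Result {
  const task = tasks.get(`${sliceId}/${taskId}`);
  if (!task) return { error: `Task ${taskId} does not exist` };
  if (task.status !== "complete") return { error: `Task ${taskId} is not complete; nothing to reopen` };
  const slice = slices.get(`${milestoneId}/${sliceId}`);
  if (!slice || slice.status === "complete") return { error: `Slice ${sliceId} is closed or missing` };
  const milestone = milestones.get(milestoneId);
  if (!milestone || milestone.status === "complete") return { error: `Milestone ${milestoneId} is closed or missing` };

  task.status = "pending"; // real handler: updateTaskStatus inside a transaction
  events.push({ cmd: "reopen_task", actor_name: actorName });
  // ...then invalidate the state cache and re-render projections
  return { taskId, status: "pending" };
}

console.log(reopenTask("T01", "S01", "M01", "executor-01")); // task is pending again
```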
---
#### S3-T3: Implement `gsd_slice_reopen` tool handler
**New file:** `src/resources/extensions/gsd/tools/reopen-slice.ts`
Logic:
1. Validate `sliceId`, `milestoneId`
2. `getSlice(sliceId, milestoneId)` → must exist, status must be `"complete"`
3. `getMilestoneById(milestoneId)` → must NOT be `"complete"`
4. In a transaction: `updateSliceStatus(sliceId, milestoneId, "in_progress")` + set all tasks back to `"pending"`
5. Append event: `cmd: "reopen_slice"`
6. Invalidate state cache + render projections
---
#### S3-T4: Add unit ownership claim/check mechanism
**New file:** `src/resources/extensions/gsd/unit-ownership.ts`
Lightweight JSON file at `.gsd/unit-claims.json` mapping unit IDs to agent names:
```json
{
  "M01/S01/T01": { "agent": "executor-01", "claimed_at": "2026-03-25T..." },
  "M01/S01": { "agent": "executor-01", "claimed_at": "2026-03-25T..." }
}
```
Functions:
```ts
claimUnit(basePath, unitKey, agentName): void // atomic write
releaseUnit(basePath, unitKey): void
getOwner(basePath, unitKey): string | null
```
`unitKey` format: `"<milestoneId>/<sliceId>/<taskId>"` for tasks, `"<milestoneId>/<sliceId>"` for slices.
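A sketch of the three functions, assuming the JSON shape above; the "atomic write" is approximated here with write-to-temp-then-rename:

```typescript
// unit-ownership sketch: tiny read-modify-write over .gsd/unit-claims.json.
import { existsSync, readFileSync, writeFileSync, renameSync, mkdirSync, mkdtempSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

type Claims = Record<string, { agent: string; claimed_at: string }>;

const claimsFile = (basePath: string): string => join(basePath, ".gsd", "unit-claims.json");

function readClaims(basePath: string): Claims {
  const file = claimsFile(basePath);
  return existsSync(file) ? (JSON.parse(readFileSync(file, "utf-8")) as Claims) : {};
}
function writeClaims(basePath: string, claims: Claims): void {
  mkdirSync(join(basePath, ".gsd"), { recursive: true });
  const tmp = claimsFile(basePath) + ".tmp";
  writeFileSync(tmp, JSON.stringify(claims, null, 2));
  renameSync(tmp, claimsFile(basePath)); // rename is atomic on POSIX filesystems
}

export function claimUnit(basePath: string, unitKey: string, agentName: string): void {
  const claims = readClaims(basePath);
  claims[unitKey] = { agent: agentName, claimed_at: new Date().toISOString() };
  writeClaims(basePath, claims);
}
export function releaseUnit(basePath: string, unitKey: string): void {
  const claims = readClaims(basePath);
  delete claims[unitKey];
  writeClaims(basePath, claims);
}
export function getOwner(basePath: string, unitKey: string): string | null {
  return readClaims(basePath)[unitKey]?.agent ?? null;
}

const base = mkdtempSync(join(tmpdir(), "gsd-"));
claimUnit(base, "M01/S01/T01", "executor-01");
console.log(getOwner(base, "M01/S01/T01")); // "executor-01"
releaseUnit(base, "M01/S01/T01");
console.log(getOwner(base, "M01/S01/T01")); // null
```

Note this does not guard against two processes racing on the same claim; if concurrent claimants become a real scenario, a compare-and-swap on the rename step would be needed.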
---
#### S3-T5: Wire ownership check into `complete-task` and `complete-slice`
**Files:** `complete-task.ts`, `complete-slice.ts`
If `actor_name` is provided AND `.gsd/unit-claims.json` exists AND the unit is claimed:
- Verify `actor_name` matches the registered owner
- If mismatch: return `{ error: "Unit <key> is owned by <owner>, not <actor>" }`
- If no claim file / unit is unclaimed: allow the operation (opt-in ownership)
Ownership is enforced only when claims are present, keeping the feature opt-in.
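The enforcement rule reduces to a small pure check; `getOwner` is stubbed with a Map here, standing in for the unit-ownership module:

```typescript
// Opt-in ownership check sketch: only enforce when an actor identifies itself
// AND the unit has a registered owner.
const owners = new Map<string, string>([["M01/S01/T01", "executor-01"]]);
const getOwner = (unitKey: string): string | null => owners.get(unitKey) ?? null;

function checkOwnership(unitKey: string, actorName?: string): { error: string } | null {
  const owner = getOwner(unitKey);
  if (owner === null) return null;                              // unclaimed: not enforced
  if (actorName === undefined || actorName === owner) return null;
  return { error: `Unit ${unitKey} is owned by ${owner}, not ${actorName}` };
}

console.log(checkOwnership("M01/S01/T01", "executor-02")?.error); // mismatch rejected
console.log(checkOwnership("M01/S02/T01", "executor-02"));        // unclaimed unit: null
```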
---
## Files Changed Summary
| File | Change Type |
|------|-------------|
| `gsd-db.ts` | Add `getTask`, `getMilestoneById` existence helpers; add `updateTaskStatus`, `updateSliceStatus` |
| `workflow-events.ts` | Extend `WorkflowEvent` with `actor_name`, `trigger_reason`, `session_id` |
| `workflow-logger.ts` | Add persistent flush to `.gsd/audit-log.jsonl`; add `setLogBasePath`; add `readAuditLog` |
| `tools/complete-task.ts` | State machine guards + ownership check + actor passthrough |
| `tools/complete-slice.ts` | State machine guards + ownership check + actor passthrough |
| `tools/complete-milestone.ts` | State machine guards + deep task check |
| `tools/plan-task.ts` | Block re-planning complete tasks |
| `tools/plan-slice.ts` | Block re-planning complete slices |
| `tools/plan-milestone.ts` | Block re-planning complete milestones + depends_on validation |
| `tools/reassess-roadmap.ts` | Verify completedSliceId status + milestone status check |
| `tools/replan-slice.ts` | Verify blockerTaskId exists + slice status check |
| `tools/reopen-task.ts` | NEW — gsd_task_reopen handler |
| `tools/reopen-slice.ts` | NEW — gsd_slice_reopen handler |
| `unit-ownership.ts` | NEW — claim/release/check ownership |
---
## Execution Order (Dependencies)
```
S1-T1 (DB helpers)
 ├── S1-T2 (complete-task guards)
 ├── S1-T3 (complete-slice guards)
 ├── S1-T4 (complete-milestone guards)
 ├── S1-T5 (plan-task guards)
 ├── S1-T6 (plan-slice guards)
 ├── S1-T7 (plan-milestone guards)
 ├── S1-T8 (reassess-roadmap guards)
 ├── S1-T9 (replan-slice guards)
 └── S3-T1 (updateTask/SliceStatus helpers) ── S3-T2, S3-T3
S2-T1 (WorkflowEvent schema)
 └── S2-T2 (handler actor passthrough)
S2-T3 (audit-log flush)
 └── S2-T4 (readAuditLog)
S3-T4 (unit-ownership.ts)
 └── S3-T5 (wire into complete-task/slice)
```
Parallelizable:
- All of Stream 1 (S1-T2 through S1-T9) can run in parallel once S1-T1 is done
- Stream 2 is fully independent of Stream 1; within Stream 3, only the reopen tools (S3-T2/S3-T3, via S3-T1) need S1-T1's helpers, while the ownership tasks (S3-T4/S3-T5) are independent
---
## What Success Looks Like
After this phase:
1. **Double-complete** → returns `{ error: "Task T01 is already complete" }` instead of silently overwriting
2. **Complete slice with open tasks** → still blocked (was already caught), plus slice status guard added
3. **Re-plan closed work** → returns `{ error: "Cannot re-plan: slice S01 is already complete" }`
4. **Wrong agent completes task** → returns `{ error: "Unit M01/S01/T01 is owned by executor-01, not executor-02" }`
5. **Post-mortem** → `.gsd/audit-log.jsonl` has full trace with actor_name + trigger_reason across context resets
6. **Oops recovery** → `gsd_task_reopen` / `gsd_slice_reopen` without manual SQL surgery
7. **depends_on enforcement** → cannot plan M02 if M01 is not yet complete
---
## Decisions
1. **Ownership: opt-in** — enforced only when `.gsd/unit-claims.json` exists. Zero breaking change for existing workflows; teams adopt incrementally.
2. **Slice reopen: reset all tasks to `"pending"`** — simpler invariant. If you're reopening a slice, you're re-doing the work. Partial resets create ambiguous state.
3. **`trigger_reason`: caller-provided** — agents know *why* they acted; the engine can only know *what* was called. Default to `undefined` if not passed.
4. **Session ID: engine-generated** — UUID generated once at engine startup, stored in module state in `workflow-events.ts`. No reliance on agents setting env vars correctly.
5. **Idempotency: fix in this phase** — convert `insertAssessment` and `insertReplanHistory` to upserts (keyed on `milestoneId+completedSliceId` and `milestoneId+sliceId+blockerTaskId` respectively, matching S1-T10 below). Accumulating duplicate records on retry is a bug, not a feature.
### Additional task from decision 5:
#### S1-T10: Convert `insertAssessment` and `insertReplanHistory` to upserts
**File:** `src/resources/extensions/gsd/gsd-db.ts`
- `insertAssessment`: upsert keyed on `(milestone_id, completed_slice_id)` — one assessment per completed slice per milestone
- `insertReplanHistory`: upsert keyed on `(milestone_id, slice_id, blocker_task_id)` — one replan record per blocker per slice


```diff
@@ -149,7 +149,7 @@ function openRawDb(path: string): unknown {
 return new Database(path);
 }
-const SCHEMA_VERSION = 10;
+const SCHEMA_VERSION = 11;
 function initSchema(db: DbAdapter, fileBacked: boolean): void {
 if (fileBacked) db.exec("PRAGMA journal_mode=WAL");
@@ -623,6 +623,13 @@ function migrateSchema(db: DbAdapter): void {
 if (currentVersion < 11) {
 ensureColumn(db, "tasks", "full_plan_md", `ALTER TABLE tasks ADD COLUMN full_plan_md TEXT NOT NULL DEFAULT ''`);
+// Add unique constraint to replan_history for idempotency:
+// one replan record per blocker task per slice per milestone.
+db.exec(`
+CREATE UNIQUE INDEX IF NOT EXISTS idx_replan_history_unique
+ON replan_history(milestone_id, slice_id, task_id)
+WHERE slice_id IS NOT NULL AND task_id IS NOT NULL
+`);
 db.prepare("INSERT INTO schema_version (version, applied_at) VALUES (:version, :applied_at)").run({
 ":version": 11,
@@ -1606,8 +1613,10 @@ export function insertReplanHistory(entry: {
 replacementArtifactPath?: string | null;
 }): void {
 if (!currentDb) throw new GSDError(GSD_STALE_STATE, "gsd-db: No database open");
+// INSERT OR REPLACE: idempotent on (milestone_id, slice_id, task_id) via schema v11 unique index.
+// Retrying the same replan silently updates summary instead of accumulating duplicate rows.
 currentDb.prepare(
-`INSERT INTO replan_history (milestone_id, slice_id, task_id, summary, previous_artifact_path, replacement_artifact_path, created_at)
+`INSERT OR REPLACE INTO replan_history (milestone_id, slice_id, task_id, summary, previous_artifact_path, replacement_artifact_path, created_at)
 VALUES (:milestone_id, :slice_id, :task_id, :summary, :previous_artifact_path, :replacement_artifact_path, :created_at)`,
 ).run({
 ":milestone_id": entry.milestoneId,
```


@ -1,5 +1,4 @@
import { describe, test, afterEach } from "node:test";
import assert from "node:assert/strict";
import { createTestContext } from './test-helpers.ts';
import * as fs from 'node:fs';
import * as path from 'node:path';
import * as os from 'node:os';
@ -18,6 +17,8 @@ import {
import { handleCompleteSlice } from '../tools/complete-slice.ts';
import type { CompleteSliceParams } from '../types.ts';
const { assertEq, assertTrue, assertMatch, report } = createTestContext();
// ═══════════════════════════════════════════════════════════════════════════
// Helpers
// ═══════════════════════════════════════════════════════════════════════════
@ -114,262 +115,297 @@ Run the test suite and verify all assertions pass.
}
// ═══════════════════════════════════════════════════════════════════════════
// Tests
// complete-slice: Schema v6 migration
// ═══════════════════════════════════════════════════════════════════════════
describe("complete-slice: schema v6 migration", () => {
test("schema version and columns exist", () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
console.log('\n=== complete-slice: schema v6 migration ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const adapter = _getAdapter()!;
const adapter = _getAdapter()!;
// Verify schema version is current (v10 after M001 planning migrations)
const versionRow = adapter.prepare('SELECT MAX(version) as v FROM schema_version').get();
assert.strictEqual(versionRow?.['v'], 10, 'schema version should be 10');
// Verify schema version is current (v10 after M001 planning migrations)
const versionRow = adapter.prepare('SELECT MAX(version) as v FROM schema_version').get();
assertEq(versionRow?.['v'], 11, 'schema version should be 11');
// Verify slices table has full_summary_md and full_uat_md columns
const cols = adapter.prepare("PRAGMA table_info(slices)").all();
const colNames = cols.map(c => c['name'] as string);
assert.ok(colNames.includes('full_summary_md'), 'slices table should have full_summary_md column');
assert.ok(colNames.includes('full_uat_md'), 'slices table should have full_uat_md column');
// Verify slices table has full_summary_md and full_uat_md columns
const cols = adapter.prepare("PRAGMA table_info(slices)").all();
const colNames = cols.map(c => c['name'] as string);
assertTrue(colNames.includes('full_summary_md'), 'slices table should have full_summary_md column');
assertTrue(colNames.includes('full_uat_md'), 'slices table should have full_uat_md column');
cleanup(dbPath);
});
});
cleanup(dbPath);
}
describe("complete-slice: getSlice/updateSliceStatus accessors", () => {
test("getSlice and updateSliceStatus work correctly", () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: getSlice/updateSliceStatus accessors
// ═══════════════════════════════════════════════════════════════════════════
// Insert milestone and slice
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice', risk: 'high' });
console.log('\n=== complete-slice: getSlice/updateSliceStatus accessors ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
// getSlice returns correct row
const slice = getSlice('M001', 'S01');
assert.ok(slice !== null, 'getSlice should return non-null for existing slice');
assert.strictEqual(slice!.id, 'S01', 'slice id');
assert.strictEqual(slice!.milestone_id, 'M001', 'slice milestone_id');
assert.strictEqual(slice!.title, 'Test Slice', 'slice title');
assert.strictEqual(slice!.risk, 'high', 'slice risk');
assert.strictEqual(slice!.status, 'pending', 'slice default status should be pending');
assert.strictEqual(slice!.completed_at, null, 'slice completed_at should be null initially');
assert.strictEqual(slice!.full_summary_md, '', 'slice full_summary_md should be empty initially');
assert.strictEqual(slice!.full_uat_md, '', 'slice full_uat_md should be empty initially');
// Insert milestone and slice
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice', risk: 'high' });
// getSlice returns null for non-existent
const noSlice = getSlice('M001', 'S99');
assert.strictEqual(noSlice, null, 'non-existent slice should return null');
// getSlice returns correct row
const slice = getSlice('M001', 'S01');
assertTrue(slice !== null, 'getSlice should return non-null for existing slice');
assertEq(slice!.id, 'S01', 'slice id');
assertEq(slice!.milestone_id, 'M001', 'slice milestone_id');
assertEq(slice!.title, 'Test Slice', 'slice title');
assertEq(slice!.risk, 'high', 'slice risk');
assertEq(slice!.status, 'pending', 'slice default status should be pending');
assertEq(slice!.completed_at, null, 'slice completed_at should be null initially');
assertEq(slice!.full_summary_md, '', 'slice full_summary_md should be empty initially');
assertEq(slice!.full_uat_md, '', 'slice full_uat_md should be empty initially');
// updateSliceStatus changes status and completed_at
const now = new Date().toISOString();
updateSliceStatus('M001', 'S01', 'complete', now);
const updated = getSlice('M001', 'S01');
assert.strictEqual(updated!.status, 'complete', 'slice status should be updated to complete');
assert.strictEqual(updated!.completed_at, now, 'slice completed_at should be set');
// getSlice returns null for non-existent
const noSlice = getSlice('M001', 'S99');
assertEq(noSlice, null, 'non-existent slice should return null');
cleanup(dbPath);
});
});
// updateSliceStatus changes status and completed_at
const now = new Date().toISOString();
updateSliceStatus('M001', 'S01', 'complete', now);
const updated = getSlice('M001', 'S01');
assertEq(updated!.status, 'complete', 'slice status should be updated to complete');
assertEq(updated!.completed_at, now, 'slice completed_at should be set');
describe("complete-slice: handler", () => {
test("happy path", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
cleanup(dbPath);
}
const { basePath, roadmapPath } = createTempProject();
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler happy path
// ═══════════════════════════════════════════════════════════════════════════
// Set up DB state: milestone, slice, 2 complete tasks
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
insertTask({ id: 'T02', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 2' });
console.log('\n=== complete-slice: handler happy path ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, basePath);
const { basePath, roadmapPath } = createTempProject();
assert.ok(!('error' in result), 'handler should succeed without error');
if (!('error' in result)) {
assert.strictEqual(result.sliceId, 'S01', 'result sliceId');
assert.strictEqual(result.milestoneId, 'M001', 'result milestoneId');
assert.ok(result.summaryPath.endsWith('S01-SUMMARY.md'), 'summaryPath should end with S01-SUMMARY.md');
assert.ok(result.uatPath.endsWith('S01-UAT.md'), 'uatPath should end with S01-UAT.md');
// Set up DB state: milestone, slices (S01 + S02), 2 complete tasks
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertSlice({ id: 'S02', milestoneId: 'M001', title: 'Second Slice' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
insertTask({ id: 'T02', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 2' });
// (a) Verify SUMMARY.md exists on disk with correct YAML frontmatter
assert.ok(fs.existsSync(result.summaryPath), 'summary file should exist on disk');
const summaryContent = fs.readFileSync(result.summaryPath, 'utf-8');
assert.match(summaryContent, /^---\n/, 'summary should start with YAML frontmatter');
assert.match(summaryContent, /id: S01/, 'summary should contain id: S01');
assert.match(summaryContent, /parent: M001/, 'summary should contain parent: M001');
assert.match(summaryContent, /milestone: M001/, 'summary should contain milestone: M001');
assert.match(summaryContent, /blocker_discovered: false/, 'summary should contain blocker_discovered');
assert.match(summaryContent, /verification_result: passed/, 'summary should contain verification_result');
assert.match(summaryContent, /key_files:/, 'summary should contain key_files');
assert.match(summaryContent, /patterns_established:/, 'summary should contain patterns_established');
assert.match(summaryContent, /observability_surfaces:/, 'summary should contain observability_surfaces');
assert.match(summaryContent, /provides:/, 'summary should contain provides');
assert.match(summaryContent, /# S01: Test Slice/, 'summary should have H1 with slice ID and title');
assert.match(summaryContent, /\*\*Implemented test slice with full coverage\*\*/, 'summary should have one-liner in bold');
assert.match(summaryContent, /## What Happened/, 'summary should have What Happened section');
assert.match(summaryContent, /## Verification/, 'summary should have Verification section');
assert.match(summaryContent, /## Requirements Advanced/, 'summary should have Requirements Advanced section');
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, basePath);
// (b) Verify UAT.md exists on disk
assert.ok(fs.existsSync(result.uatPath), 'UAT file should exist on disk');
const uatContent = fs.readFileSync(result.uatPath, 'utf-8');
assert.match(uatContent, /# S01: Test Slice — UAT/, 'UAT should have correct title');
assert.match(uatContent, /Milestone:\*\* M001/, 'UAT should reference milestone');
assert.match(uatContent, /Smoke Test/, 'UAT should contain smoke test from params');
assertTrue(!('error' in result), 'handler should succeed without error');
if (!('error' in result)) {
assertEq(result.sliceId, 'S01', 'result sliceId');
assertEq(result.milestoneId, 'M001', 'result milestoneId');
assertTrue(result.summaryPath.endsWith('S01-SUMMARY.md'), 'summaryPath should end with S01-SUMMARY.md');
assertTrue(result.uatPath.endsWith('S01-UAT.md'), 'uatPath should end with S01-UAT.md');
// (c) Verify roadmap checkbox toggled to [x]
const roadmapContent = fs.readFileSync(roadmapPath, 'utf-8');
assert.match(roadmapContent, /\[x\]\s+\*\*S01:/, 'S01 should be checked in roadmap');
assert.match(roadmapContent, /\[ \]\s+\*\*S02:/, 'S02 should still be unchecked in roadmap');
// (a) Verify SUMMARY.md exists on disk with correct YAML frontmatter
assertTrue(fs.existsSync(result.summaryPath), 'summary file should exist on disk');
const summaryContent = fs.readFileSync(result.summaryPath, 'utf-8');
assertMatch(summaryContent, /^---\n/, 'summary should start with YAML frontmatter');
assertMatch(summaryContent, /id: S01/, 'summary should contain id: S01');
assertMatch(summaryContent, /parent: M001/, 'summary should contain parent: M001');
assertMatch(summaryContent, /milestone: M001/, 'summary should contain milestone: M001');
assertMatch(summaryContent, /blocker_discovered: false/, 'summary should contain blocker_discovered');
assertMatch(summaryContent, /verification_result: passed/, 'summary should contain verification_result');
assertMatch(summaryContent, /key_files:/, 'summary should contain key_files');
assertMatch(summaryContent, /patterns_established:/, 'summary should contain patterns_established');
assertMatch(summaryContent, /observability_surfaces:/, 'summary should contain observability_surfaces');
assertMatch(summaryContent, /provides:/, 'summary should contain provides');
assertMatch(summaryContent, /# S01: Test Slice/, 'summary should have H1 with slice ID and title');
assertMatch(summaryContent, /\*\*Implemented test slice with full coverage\*\*/, 'summary should have one-liner in bold');
assertMatch(summaryContent, /## What Happened/, 'summary should have What Happened section');
assertMatch(summaryContent, /## Verification/, 'summary should have Verification section');
assertMatch(summaryContent, /## Requirements Advanced/, 'summary should have Requirements Advanced section');
// (d) Verify full_summary_md and full_uat_md stored in DB for D004 recovery
const sliceAfter = getSlice('M001', 'S01');
assert.ok(sliceAfter !== null, 'slice should exist in DB after handler');
assert.ok(sliceAfter!.full_summary_md.length > 0, 'full_summary_md should be non-empty in DB');
assert.match(sliceAfter!.full_summary_md, /id: S01/, 'full_summary_md should contain frontmatter');
assert.ok(sliceAfter!.full_uat_md.length > 0, 'full_uat_md should be non-empty in DB');
assert.match(sliceAfter!.full_uat_md, /S01: Test Slice — UAT/, 'full_uat_md should contain UAT title');
// (b) Verify UAT.md exists on disk
assertTrue(fs.existsSync(result.uatPath), 'UAT file should exist on disk');
const uatContent = fs.readFileSync(result.uatPath, 'utf-8');
assertMatch(uatContent, /# S01: Test Slice — UAT/, 'UAT should have correct title');
assertMatch(uatContent, /Milestone:\*\* M001/, 'UAT should reference milestone');
assertMatch(uatContent, /Smoke Test/, 'UAT should contain smoke test from params');
// (e) Verify slice status is complete in DB
assert.strictEqual(sliceAfter!.status, 'complete', 'slice status should be complete in DB');
assert.ok(sliceAfter!.completed_at !== null, 'completed_at should be set in DB');
}
// (c) Verify roadmap shows S01 complete (✅) and S02 pending (⬜) in table format
// Projection renders roadmap as a Slice Overview table, not checkbox list
const roadmapContent = fs.readFileSync(roadmapPath, 'utf-8');
assertMatch(roadmapContent, /\| S01 \|/, 'S01 should appear in roadmap table');
assertTrue(roadmapContent.includes('✅'), 'completed S01 should show ✅ in roadmap table');
assertMatch(roadmapContent, /\| S02 \|/, 'S02 should appear in roadmap table');
assertTrue(roadmapContent.includes('⬜'), 'pending S02 should show ⬜ in roadmap table');
cleanupDir(basePath);
cleanup(dbPath);
});
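The roadmap-table assertions above exercise the "done" vs "complete" fix called out in the commit message: the projection compared statuses against `'done'`, a value that never occurs, so every slice rendered as pending. A minimal sketch of the corrected mapping (function and type names here are illustrative, not the engine's actual API):

```typescript
// Illustrative sketch — the real rendering lives in renderRoadmapContent.
type SliceStatus = 'pending' | 'in_progress' | 'complete' | 'failed';

// Pre-fix, this compared against 'done'; slices are stored as 'complete'.
function statusIcon(status: SliceStatus): string {
  return status === 'complete' ? '✅' : '⬜';
}

// Renders one row of the Slice Overview table the test matches with /\| S01 \|/.
function renderRoadmapRow(id: string, title: string, status: SliceStatus): string {
  return `| ${id} | ${title} | ${statusIcon(status)} |`;
}
```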
// (d) Verify full_summary_md and full_uat_md stored in DB for D004 recovery
const sliceAfter = getSlice('M001', 'S01');
assertTrue(sliceAfter !== null, 'slice should exist in DB after handler');
assertTrue(sliceAfter!.full_summary_md.length > 0, 'full_summary_md should be non-empty in DB');
assertMatch(sliceAfter!.full_summary_md, /id: S01/, 'full_summary_md should contain frontmatter');
assertTrue(sliceAfter!.full_uat_md.length > 0, 'full_uat_md should be non-empty in DB');
assertMatch(sliceAfter!.full_uat_md, /S01: Test Slice — UAT/, 'full_uat_md should contain UAT title');
test("rejects incomplete tasks", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
// (e) Verify slice status is complete in DB
assertEq(sliceAfter!.status, 'complete', 'slice status should be complete in DB');
assertTrue(sliceAfter!.completed_at !== null, 'completed_at should be set in DB');
}
// Insert milestone, slice, 2 tasks — one complete, one pending
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
insertTask({ id: 'T02', sliceId: 'S01', milestoneId: 'M001', status: 'pending', title: 'Task 2' });
cleanupDir(basePath);
cleanup(dbPath);
}
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, '/tmp/fake');
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler rejects incomplete tasks
// ═══════════════════════════════════════════════════════════════════════════
assert.ok('error' in result, 'should return error when tasks are incomplete');
if ('error' in result) {
assert.match(result.error, /incomplete tasks/, 'error should mention incomplete tasks');
assert.match(result.error, /T02/, 'error should mention the specific incomplete task ID');
}
console.log('\n=== complete-slice: handler rejects incomplete tasks ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
cleanup(dbPath);
});
// Insert milestone, slice, 2 tasks — one complete, one pending
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
insertTask({ id: 'T02', sliceId: 'S01', milestoneId: 'M001', status: 'pending', title: 'Task 2' });
test("rejects no tasks", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, '/tmp/fake');
// Insert milestone and slice but NO tasks
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
assertTrue('error' in result, 'should return error when tasks are incomplete');
if ('error' in result) {
assertMatch(result.error, /incomplete tasks/, 'error should mention incomplete tasks');
assertMatch(result.error, /T02/, 'error should mention the specific incomplete task ID');
}
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, '/tmp/fake');
cleanup(dbPath);
}
assert.ok('error' in result, 'should return error when no tasks exist');
if ('error' in result) {
assert.match(result.error, /no tasks found/, 'error should say no tasks found');
}
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler rejects no tasks
// ═══════════════════════════════════════════════════════════════════════════
cleanup(dbPath);
});
console.log('\n=== complete-slice: handler rejects no tasks ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
test("validation errors", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
// Insert milestone and slice but NO tasks
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
const params = makeValidSliceParams();
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, '/tmp/fake');
// Empty sliceId
const r1 = await handleCompleteSlice({ ...params, sliceId: '' }, '/tmp/fake');
assert.ok('error' in r1, 'should return error for empty sliceId');
if ('error' in r1) {
assert.match(r1.error, /sliceId/, 'error should mention sliceId');
}
assertTrue('error' in result, 'should return error when no tasks exist');
if ('error' in result) {
assertMatch(result.error, /no tasks found/, 'error should say no tasks found');
}
// Empty milestoneId
const r2 = await handleCompleteSlice({ ...params, milestoneId: '' }, '/tmp/fake');
assert.ok('error' in r2, 'should return error for empty milestoneId');
if ('error' in r2) {
assert.match(r2.error, /milestoneId/, 'error should mention milestoneId');
}
cleanup(dbPath);
}
cleanup(dbPath);
});
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler validation errors
// ═══════════════════════════════════════════════════════════════════════════
test("idempotency", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
console.log('\n=== complete-slice: handler validation errors ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const { basePath, roadmapPath } = createTempProject();
const params = makeValidSliceParams();
// Set up DB state
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
// Empty sliceId
const r1 = await handleCompleteSlice({ ...params, sliceId: '' }, '/tmp/fake');
assertTrue('error' in r1, 'should return error for empty sliceId');
if ('error' in r1) {
assertMatch(r1.error, /sliceId/, 'error should mention sliceId');
}
const params = makeValidSliceParams();
// Empty milestoneId
const r2 = await handleCompleteSlice({ ...params, milestoneId: '' }, '/tmp/fake');
assertTrue('error' in r2, 'should return error for empty milestoneId');
if ('error' in r2) {
assertMatch(r2.error, /milestoneId/, 'error should mention milestoneId');
}
// First call
const r1 = await handleCompleteSlice(params, basePath);
assert.ok(!('error' in r1), 'first call should succeed');
cleanup(dbPath);
}
// Second call with same params — should not crash
const r2 = await handleCompleteSlice(params, basePath);
assert.ok(!('error' in r2), 'second call should succeed (idempotent)');
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler idempotency
// ═══════════════════════════════════════════════════════════════════════════
// Verify only 1 slice row (not duplicated)
const adapter = _getAdapter()!;
const sliceRows = adapter.prepare("SELECT * FROM slices WHERE milestone_id = 'M001' AND id = 'S01'").all();
assert.strictEqual(sliceRows.length, 1, 'should have exactly 1 slice row after 2 calls');
console.log('\n=== complete-slice: handler idempotency ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
// Files should still exist
if (!('error' in r2)) {
assert.ok(fs.existsSync(r2.summaryPath), 'summary should still exist after second call');
assert.ok(fs.existsSync(r2.uatPath), 'UAT should still exist after second call');
}
const { basePath, roadmapPath } = createTempProject();
cleanupDir(basePath);
cleanup(dbPath);
});
// Set up DB state
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
test("missing roadmap (graceful)", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
const params = makeValidSliceParams();
// Create a temp dir WITHOUT a roadmap file
const basePath = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-no-roadmap-'));
const sliceDir = path.join(basePath, '.gsd', 'milestones', 'M001', 'slices', 'S01');
fs.mkdirSync(sliceDir, { recursive: true });
// First call
const r1 = await handleCompleteSlice(params, basePath);
assertTrue(!('error' in r1), 'first call should succeed');
// Set up DB state
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
// Second call — state machine guard rejects (slice is already complete)
const r2 = await handleCompleteSlice(params, basePath);
assertTrue('error' in r2, 'second call should return error (slice already complete)');
if ('error' in r2) {
assertMatch(r2.error, /already complete/, 'error should mention already complete');
}
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, basePath);
// Verify only 1 slice row (not duplicated)
const adapter = _getAdapter()!;
const sliceRows = adapter.prepare("SELECT * FROM slices WHERE milestone_id = 'M001' AND id = 'S01'").all();
assertEq(sliceRows.length, 1, 'should have exactly 1 slice row after calls');
// Should succeed even without roadmap file — just skip checkbox toggle
assert.ok(!('error' in result), 'handler should succeed without roadmap file');
if (!('error' in result)) {
assert.ok(fs.existsSync(result.summaryPath), 'summary should be written even without roadmap');
assert.ok(fs.existsSync(result.uatPath), 'UAT should be written even without roadmap');
}
cleanupDir(basePath);
cleanup(dbPath);
}
cleanupDir(basePath);
cleanup(dbPath);
});
});
// ═══════════════════════════════════════════════════════════════════════════
// complete-slice: Handler with missing roadmap (graceful)
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-slice: handler with missing roadmap ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
// Create a temp dir WITHOUT a roadmap file
const basePath = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-no-roadmap-'));
const sliceDir = path.join(basePath, '.gsd', 'milestones', 'M001', 'slices', 'S01');
fs.mkdirSync(sliceDir, { recursive: true });
// Set up DB state
insertMilestone({ id: 'M001' });
insertSlice({ id: 'S01', milestoneId: 'M001' });
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001', status: 'complete', title: 'Task 1' });
const params = makeValidSliceParams();
const result = await handleCompleteSlice(params, basePath);
// Should succeed even without roadmap file — just skip checkbox toggle
assertTrue(!('error' in result), 'handler should succeed without roadmap file');
if (!('error' in result)) {
assertTrue(fs.existsSync(result.summaryPath), 'summary should be written even without roadmap');
assertTrue(fs.existsSync(result.uatPath), 'UAT should be written even without roadmap');
}
cleanupDir(basePath);
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
report();


@@ -1,5 +1,4 @@
import { describe, test } from "node:test";
import assert from "node:assert/strict";
import { createTestContext } from './test-helpers.ts';
import * as fs from 'node:fs';
import * as path from 'node:path';
import * as os from 'node:os';
@@ -18,6 +17,8 @@ import {
} from '../gsd-db.ts';
import { handleCompleteTask } from '../tools/complete-task.ts';
const { assertEq, assertTrue, assertMatch, report } = createTestContext();
// ═══════════════════════════════════════════════════════════════════════════
// Helpers
// ═══════════════════════════════════════════════════════════════════════════
@@ -98,290 +99,356 @@ function makeValidParams() {
}
// ═══════════════════════════════════════════════════════════════════════════
// Tests
// complete-task: Schema v5 migration
// ═══════════════════════════════════════════════════════════════════════════
describe("complete-task: schema v5 migration", () => {
test("schema version and tables exist", () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
console.log('\n=== complete-task: schema v5 migration ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const adapter = _getAdapter()!;
const adapter = _getAdapter()!;
// Verify schema version is current (v10 after M001 planning migrations)
const versionRow = adapter.prepare('SELECT MAX(version) as v FROM schema_version').get();
assert.strictEqual(versionRow?.['v'], 10, 'schema version should be 10');
// Verify schema version is current (v11 after state machine migration)
const versionRow = adapter.prepare('SELECT MAX(version) as v FROM schema_version').get();
assertEq(versionRow?.['v'], 11, 'schema version should be 11');
// Verify all 4 new tables exist
const tables = adapter.prepare(
"SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
).all();
const tableNames = tables.map(t => t['name'] as string);
assert.ok(tableNames.includes('milestones'), 'milestones table should exist');
assert.ok(tableNames.includes('slices'), 'slices table should exist');
assert.ok(tableNames.includes('tasks'), 'tasks table should exist');
assert.ok(tableNames.includes('verification_evidence'), 'verification_evidence table should exist');
// Verify all 4 new tables exist
const tables = adapter.prepare(
"SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
).all();
const tableNames = tables.map(t => t['name'] as string);
assertTrue(tableNames.includes('milestones'), 'milestones table should exist');
assertTrue(tableNames.includes('slices'), 'slices table should exist');
assertTrue(tableNames.includes('tasks'), 'tasks table should exist');
assertTrue(tableNames.includes('verification_evidence'), 'verification_evidence table should exist');
cleanup(dbPath);
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Accessor CRUD
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-task: accessor CRUD ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
// Insert milestone
insertMilestone({ id: 'M001', title: 'Test Milestone' });
const adapter = _getAdapter()!;
const mRow = adapter.prepare("SELECT * FROM milestones WHERE id = 'M001'").get();
assertEq(mRow?.['id'], 'M001', 'milestone id should be M001');
assertEq(mRow?.['title'], 'Test Milestone', 'milestone title should match');
// Insert slice
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice', risk: 'high' });
const sRow = adapter.prepare("SELECT * FROM slices WHERE id = 'S01' AND milestone_id = 'M001'").get();
assertEq(sRow?.['id'], 'S01', 'slice id should be S01');
assertEq(sRow?.['risk'], 'high', 'slice risk should be high');
// Insert task with all fields
insertTask({
id: 'T01',
sliceId: 'S01',
milestoneId: 'M001',
title: 'Test Task',
status: 'complete',
oneLiner: 'Did the thing',
narrative: 'Full story here.',
verificationResult: 'passed',
duration: '30m',
blockerDiscovered: false,
deviations: 'None',
knownIssues: 'None',
keyFiles: ['file1.ts', 'file2.ts'],
keyDecisions: ['D001'],
fullSummaryMd: '# Summary',
});
});
describe("complete-task: accessor CRUD", () => {
test("insert and query milestones, slices, tasks, evidence", () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
// getTask verifies all fields
const task = getTask('M001', 'S01', 'T01');
assertTrue(task !== null, 'task should not be null');
assertEq(task!.id, 'T01', 'task id');
assertEq(task!.slice_id, 'S01', 'task slice_id');
assertEq(task!.milestone_id, 'M001', 'task milestone_id');
assertEq(task!.title, 'Test Task', 'task title');
assertEq(task!.status, 'complete', 'task status');
assertEq(task!.one_liner, 'Did the thing', 'task one_liner');
assertEq(task!.narrative, 'Full story here.', 'task narrative');
assertEq(task!.verification_result, 'passed', 'task verification_result');
assertEq(task!.blocker_discovered, false, 'task blocker_discovered');
assertEq(task!.key_files, ['file1.ts', 'file2.ts'], 'task key_files JSON round-trip');
assertEq(task!.key_decisions, ['D001'], 'task key_decisions JSON round-trip');
assertEq(task!.full_summary_md, '# Summary', 'task full_summary_md');
// Insert milestone
insertMilestone({ id: 'M001', title: 'Test Milestone' });
const adapter = _getAdapter()!;
const mRow = adapter.prepare("SELECT * FROM milestones WHERE id = 'M001'").get();
assert.strictEqual(mRow?.['id'], 'M001', 'milestone id should be M001');
assert.strictEqual(mRow?.['title'], 'Test Milestone', 'milestone title should match');
// getTask returns null for non-existent
const noTask = getTask('M001', 'S01', 'T99');
assertEq(noTask, null, 'non-existent task should return null');
// Insert slice
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice', risk: 'high' });
const sRow = adapter.prepare("SELECT * FROM slices WHERE id = 'S01' AND milestone_id = 'M001'").get();
assert.strictEqual(sRow?.['id'], 'S01', 'slice id should be S01');
assert.strictEqual(sRow?.['risk'], 'high', 'slice risk should be high');
// Insert verification evidence
insertVerificationEvidence({
taskId: 'T01',
sliceId: 'S01',
milestoneId: 'M001',
command: 'npm test',
exitCode: 0,
verdict: '✅ pass',
durationMs: 3000,
});
const evRows = adapter.prepare(
"SELECT * FROM verification_evidence WHERE task_id = 'T01' AND slice_id = 'S01' AND milestone_id = 'M001'"
).all();
assertEq(evRows.length, 1, 'should have 1 verification evidence row');
assertEq(evRows[0]['command'], 'npm test', 'evidence command');
assertEq(evRows[0]['exit_code'], 0, 'evidence exit_code');
assertEq(evRows[0]['verdict'], '✅ pass', 'evidence verdict');
assertEq(evRows[0]['duration_ms'], 3000, 'evidence duration_ms');
// Insert task with all fields
insertTask({
id: 'T01',
sliceId: 'S01',
milestoneId: 'M001',
title: 'Test Task',
status: 'complete',
oneLiner: 'Did the thing',
narrative: 'Full story here.',
verificationResult: 'passed',
duration: '30m',
blockerDiscovered: false,
deviations: 'None',
knownIssues: 'None',
keyFiles: ['file1.ts', 'file2.ts'],
keyDecisions: ['D001'],
fullSummaryMd: '# Summary',
});
// getSliceTasks returns array
const sliceTasks = getSliceTasks('M001', 'S01');
assertEq(sliceTasks.length, 1, 'getSliceTasks should return 1 task');
assertEq(sliceTasks[0].id, 'T01', 'getSliceTasks first task id');
// getTask verifies all fields
const task = getTask('M001', 'S01', 'T01');
assert.ok(task !== null, 'task should not be null');
assert.strictEqual(task!.id, 'T01', 'task id');
assert.strictEqual(task!.slice_id, 'S01', 'task slice_id');
assert.strictEqual(task!.milestone_id, 'M001', 'task milestone_id');
assert.strictEqual(task!.title, 'Test Task', 'task title');
assert.strictEqual(task!.status, 'complete', 'task status');
assert.strictEqual(task!.one_liner, 'Did the thing', 'task one_liner');
assert.strictEqual(task!.narrative, 'Full story here.', 'task narrative');
assert.strictEqual(task!.verification_result, 'passed', 'task verification_result');
assert.strictEqual(task!.blocker_discovered, false, 'task blocker_discovered');
assert.deepStrictEqual(task!.key_files, ['file1.ts', 'file2.ts'], 'task key_files JSON round-trip');
assert.deepStrictEqual(task!.key_decisions, ['D001'], 'task key_decisions JSON round-trip');
assert.strictEqual(task!.full_summary_md, '# Summary', 'task full_summary_md');
// updateTaskStatus changes status
updateTaskStatus('M001', 'S01', 'T01', 'failed', new Date().toISOString());
const updatedTask = getTask('M001', 'S01', 'T01');
assertEq(updatedTask!.status, 'failed', 'task status should be updated to failed');
assertTrue(updatedTask!.completed_at !== null, 'completed_at should be set after status update');
// getTask returns null for non-existent
const noTask = getTask('M001', 'S01', 'T99');
assert.strictEqual(noTask, null, 'non-existent task should return null');
cleanup(dbPath);
}
// Insert verification evidence
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Accessor stale-state error
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-task: accessor stale-state error ===');
{
// No DB open — accessors should throw GSD_STALE_STATE
closeDatabase();
let threw = false;
try {
insertMilestone({ id: 'M001' });
} catch (err: any) {
threw = true;
assertTrue(err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'should throw GSD_STALE_STATE when no DB open');
}
assertTrue(threw, 'insertMilestone should throw when no DB open');
threw = false;
try {
insertSlice({ id: 'S01', milestoneId: 'M001' });
} catch (err: any) {
threw = true;
assertTrue(err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertSlice should throw GSD_STALE_STATE');
}
assertTrue(threw, 'insertSlice should throw when no DB open');
threw = false;
try {
insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001' });
} catch (err: any) {
threw = true;
assertTrue(err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertTask should throw GSD_STALE_STATE');
}
assertTrue(threw, 'insertTask should throw when no DB open');
threw = false;
try {
insertVerificationEvidence({
taskId: 'T01',
sliceId: 'S01',
milestoneId: 'M001',
command: 'npm test',
exitCode: 0,
verdict: '✅ pass',
durationMs: 3000,
});
const evRows = adapter.prepare(
"SELECT * FROM verification_evidence WHERE task_id = 'T01' AND slice_id = 'S01' AND milestone_id = 'M001'"
).all();
assert.strictEqual(evRows.length, 1, 'should have 1 verification evidence row');
assert.strictEqual(evRows[0]['command'], 'npm test', 'evidence command');
assert.strictEqual(evRows[0]['exit_code'], 0, 'evidence exit_code');
assert.strictEqual(evRows[0]['verdict'], '✅ pass', 'evidence verdict');
assert.strictEqual(evRows[0]['duration_ms'], 3000, 'evidence duration_ms');
// getSliceTasks returns array
const sliceTasks = getSliceTasks('M001', 'S01');
assert.strictEqual(sliceTasks.length, 1, 'getSliceTasks should return 1 task');
assert.strictEqual(sliceTasks[0].id, 'T01', 'getSliceTasks first task id');
// updateTaskStatus changes status
updateTaskStatus('M001', 'S01', 'T01', 'failed', new Date().toISOString());
const updatedTask = getTask('M001', 'S01', 'T01');
assert.strictEqual(updatedTask!.status, 'failed', 'task status should be updated to failed');
assert.ok(updatedTask!.completed_at !== null, 'completed_at should be set after status update');
cleanup(dbPath);
});
});
describe("complete-task: accessor stale-state error", () => {
test("accessors throw when no DB open", () => {
closeDatabase();
assert.throws(() => insertMilestone({ id: 'M001' }),
(err: any) => err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertMilestone should throw when no DB open');
assert.throws(() => insertSlice({ id: 'S01', milestoneId: 'M001' }),
(err: any) => err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertSlice should throw when no DB open');
assert.throws(() => insertTask({ id: 'T01', sliceId: 'S01', milestoneId: 'M001' }),
(err: any) => err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertTask should throw when no DB open');
assert.throws(() => insertVerificationEvidence({
taskId: 'T01', sliceId: 'S01', milestoneId: 'M001',
command: 'test', exitCode: 0, verdict: 'pass', durationMs: 0,
}),
(err: any) => err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertVerificationEvidence should throw when no DB open');
});
});
});
} catch (err: any) {
threw = true;
assertTrue(err.code === 'GSD_STALE_STATE' || err.message.includes('No database open'),
'insertVerificationEvidence should throw GSD_STALE_STATE');
}
assertTrue(threw, 'insertVerificationEvidence should throw when no DB open');
}
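Every accessor in the block above throws the same `GSD_STALE_STATE` error when no database is open. A sketch of the shared guard pattern, assuming the accessors wrap a module-level adapter handle (the class and function names are illustrative):

```typescript
// Illustrative stale-state guard — the real accessors live in gsd-db.ts.
class GsdStaleStateError extends Error {
  code = 'GSD_STALE_STATE';
  constructor() {
    super('No database open — call openDatabase() first');
  }
}

// Module-level handle, set by openDatabase() and cleared by closeDatabase().
let adapter: object | null = null;

// Each accessor calls this before touching the database.
function requireAdapter(): object {
  if (adapter === null) throw new GsdStaleStateError();
  return adapter;
}
```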
describe("complete-task: handler", () => {
test("happy path", async () => {
const dbPath = tempDbPath();
openDatabase(dbPath);
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Handler happy path
// ═══════════════════════════════════════════════════════════════════════════
const { basePath, planPath } = createTempProject();
console.log('\n=== complete-task: handler happy path ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const params = makeValidParams();
const result = await handleCompleteTask(params, basePath);
const { basePath, planPath } = createTempProject();
assert.ok(!('error' in result), 'handler should succeed without error');
if (!('error' in result)) {
assert.strictEqual(result.taskId, 'T01', 'result taskId');
assert.strictEqual(result.sliceId, 'S01', 'result sliceId');
assert.strictEqual(result.milestoneId, 'M001', 'result milestoneId');
assert.ok(result.summaryPath.endsWith('T01-SUMMARY.md'), 'summaryPath should end with T01-SUMMARY.md');
// Seed milestone + slice + both tasks so projection renders T01 ([x]) and T02 ([ ])
insertMilestone({ id: 'M001', title: 'Test Milestone' });
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice' });
insertTask({ id: 'T02', sliceId: 'S01', milestoneId: 'M001', status: 'pending', title: 'Second task' });
// (a) Verify task row in DB with status 'complete'
const task = getTask('M001', 'S01', 'T01');
assert.ok(task !== null, 'task should exist in DB after handler');
assert.strictEqual(task!.status, 'complete', 'task status should be complete');
assert.strictEqual(task!.one_liner, 'Added test functionality', 'task one_liner in DB');
assert.deepStrictEqual(task!.key_files, ['src/test.ts', 'src/test.test.ts'], 'task key_files in DB');
const params = makeValidParams();
const result = await handleCompleteTask(params, basePath);
// (b) Verify verification_evidence rows in DB
const adapter = _getAdapter()!;
const evRows = adapter.prepare(
"SELECT * FROM verification_evidence WHERE task_id = 'T01' AND milestone_id = 'M001'"
).all();
assert.strictEqual(evRows.length, 1, 'should have 1 verification evidence row after handler');
assert.strictEqual(evRows[0]['command'], 'npm run test:unit', 'evidence command from handler');
assertTrue(!('error' in result), 'handler should succeed without error');
if (!('error' in result)) {
assertEq(result.taskId, 'T01', 'result taskId');
assertEq(result.sliceId, 'S01', 'result sliceId');
assertEq(result.milestoneId, 'M001', 'result milestoneId');
assertTrue(result.summaryPath.endsWith('T01-SUMMARY.md'), 'summaryPath should end with T01-SUMMARY.md');
// (c) Verify T01-SUMMARY.md file on disk with correct YAML frontmatter
assert.ok(fs.existsSync(result.summaryPath), 'summary file should exist on disk');
const summaryContent = fs.readFileSync(result.summaryPath, 'utf-8');
assert.match(summaryContent, /^---\n/, 'summary should start with YAML frontmatter');
assert.match(summaryContent, /id: T01/, 'summary should contain id: T01');
assert.match(summaryContent, /parent: S01/, 'summary should contain parent: S01');
assert.match(summaryContent, /milestone: M001/, 'summary should contain milestone: M001');
assert.match(summaryContent, /blocker_discovered: false/, 'summary should contain blocker_discovered');
assert.match(summaryContent, /# T01:/, 'summary should have H1 with task ID');
assert.match(summaryContent, /\*\*Added test functionality\*\*/, 'summary should have one-liner in bold');
assert.match(summaryContent, /## What Happened/, 'summary should have What Happened section');
assert.match(summaryContent, /## Verification Evidence/, 'summary should have Verification Evidence section');
assert.match(summaryContent, /npm run test:unit/, 'summary evidence should contain command');
// (a) Verify task row in DB with status 'complete'
const task = getTask('M001', 'S01', 'T01');
assertTrue(task !== null, 'task should exist in DB after handler');
assertEq(task!.status, 'complete', 'task status should be complete');
assertEq(task!.one_liner, 'Added test functionality', 'task one_liner in DB');
assertEq(task!.key_files, ['src/test.ts', 'src/test.test.ts'], 'task key_files in DB');
// (d) Verify plan checkbox changed to [x]
const planContent = fs.readFileSync(planPath, 'utf-8');
assert.match(planContent, /\[x\]\s+\*\*T01:/, 'T01 should be checked in plan');
// T02 should still be unchecked
assert.match(planContent, /\[ \]\s+\*\*T02:/, 'T02 should still be unchecked in plan');
// (b) Verify verification_evidence rows in DB
const adapter = _getAdapter()!;
const evRows = adapter.prepare(
"SELECT * FROM verification_evidence WHERE task_id = 'T01' AND milestone_id = 'M001'"
).all();
assertEq(evRows.length, 1, 'should have 1 verification evidence row after handler');
assertEq(evRows[0]['command'], 'npm run test:unit', 'evidence command from handler');
// (e) Verify full_summary_md stored in DB for D004 recovery
const taskAfter = getTask('M001', 'S01', 'T01');
assert.ok(taskAfter!.full_summary_md.length > 0, 'full_summary_md should be non-empty in DB');
assert.match(taskAfter!.full_summary_md, /id: T01/, 'full_summary_md should contain frontmatter');
}
// (c) Verify T01-SUMMARY.md file on disk with correct YAML frontmatter
assertTrue(fs.existsSync(result.summaryPath), 'summary file should exist on disk');
const summaryContent = fs.readFileSync(result.summaryPath, 'utf-8');
assertMatch(summaryContent, /^---\n/, 'summary should start with YAML frontmatter');
assertMatch(summaryContent, /id: T01/, 'summary should contain id: T01');
assertMatch(summaryContent, /parent: S01/, 'summary should contain parent: S01');
assertMatch(summaryContent, /milestone: M001/, 'summary should contain milestone: M001');
assertMatch(summaryContent, /blocker_discovered: false/, 'summary should contain blocker_discovered');
assertMatch(summaryContent, /# T01:/, 'summary should have H1 with task ID');
assertMatch(summaryContent, /\*\*Added test functionality\*\*/, 'summary should have one-liner in bold');
assertMatch(summaryContent, /## What Happened/, 'summary should have What Happened section');
assertMatch(summaryContent, /## Verification Evidence/, 'summary should have Verification Evidence section');
assertMatch(summaryContent, /npm run test:unit/, 'summary evidence should contain command');
// (d) Verify plan checkbox changed to [x]
const planContent = fs.readFileSync(planPath, 'utf-8');
assertMatch(planContent, /\[x\]\s+\*\*T01:/, 'T01 should be checked in plan');
// T02 should still be unchecked
assertMatch(planContent, /\[ \]\s+\*\*T02:/, 'T02 should still be unchecked in plan');
// (e) Verify full_summary_md stored in DB for D004 recovery
const taskAfter = getTask('M001', 'S01', 'T01');
assertTrue(taskAfter!.full_summary_md.length > 0, 'full_summary_md should be non-empty in DB');
assertMatch(taskAfter!.full_summary_md, /id: T01/, 'full_summary_md should contain frontmatter');
}
cleanupDir(basePath);
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Handler validation errors
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-task: handler validation errors ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const params = makeValidParams();
// Empty taskId
const r1 = await handleCompleteTask({ ...params, taskId: '' }, '/tmp/fake');
assertTrue('error' in r1, 'should return error for empty taskId');
if ('error' in r1) {
assertMatch(r1.error, /taskId/, 'error should mention taskId');
}
// Empty milestoneId
const r2 = await handleCompleteTask({ ...params, milestoneId: '' }, '/tmp/fake');
assertTrue('error' in r2, 'should return error for empty milestoneId');
if ('error' in r2) {
assertMatch(r2.error, /milestoneId/, 'error should mention milestoneId');
}
// Empty sliceId
const r3 = await handleCompleteTask({ ...params, sliceId: '' }, '/tmp/fake');
assertTrue('error' in r3, 'should return error for empty sliceId');
if ('error' in r3) {
assertMatch(r3.error, /sliceId/, 'error should mention sliceId');
}
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Handler idempotency
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-task: handler idempotency ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
const { basePath, planPath } = createTempProject();
// Seed milestone + slice so state machine guards pass
insertMilestone({ id: 'M001', title: 'Test Milestone' });
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice' });
const params = makeValidParams();
// First call should succeed
const r1 = await handleCompleteTask(params, basePath);
assertTrue(!('error' in r1), 'first call should succeed');
// Verify only 1 task row
const tasks = getSliceTasks('M001', 'S01');
assertEq(tasks.length, 1, 'should have exactly 1 task row after first call');
// Second call with same params — state machine guard rejects (task is already complete)
const r2 = await handleCompleteTask(params, basePath);
assertTrue('error' in r2, 'second call should return error (task already complete)');
if ('error' in r2) {
assertMatch(r2.error, /already complete/, 'error should mention already complete');
}
// Still only 1 task row (no duplication from rejected second call)
const tasksAfter = getSliceTasks('M001', 'S01');
assertEq(tasksAfter.length, 1, 'should still have exactly 1 task row after rejected second call');
cleanupDir(basePath);
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
// complete-task: Handler with missing plan file (graceful)
// ═══════════════════════════════════════════════════════════════════════════
console.log('\n=== complete-task: handler with missing plan file ===');
{
const dbPath = tempDbPath();
openDatabase(dbPath);
// Create a temp dir WITHOUT a plan file
const basePath = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-no-plan-'));
const tasksDir = path.join(basePath, '.gsd', 'milestones', 'M001', 'slices', 'S01', 'tasks');
fs.mkdirSync(tasksDir, { recursive: true });
// Seed milestone + slice so state machine guards pass
insertMilestone({ id: 'M001', title: 'Test Milestone' });
insertSlice({ id: 'S01', milestoneId: 'M001', title: 'Test Slice' });
const params = makeValidParams();
const result = await handleCompleteTask(params, basePath);
// Should succeed even without plan file — just skip checkbox toggle
assertTrue(!('error' in result), 'handler should succeed without plan file');
if (!('error' in result)) {
assertTrue(fs.existsSync(result.summaryPath), 'summary should be written even without plan file');
}
cleanupDir(basePath);
cleanup(dbPath);
}
// ═══════════════════════════════════════════════════════════════════════════
report();
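The checkbox assertions above exercise two of the pre-existing bug fixes in this commit: tasks with status "complete" (not "done") must render as `[x]`, and the `**T01:**` line format must match what parsePlan expects. A minimal sketch of the mapping those assertions imply; the helper names are illustrative, since renderPlanCheckboxes itself is not shown in this hunk:

```typescript
// Hypothetical helpers: the real renderPlanCheckboxes is not shown here.
// The point is the status mapping the fixed tests now assert:
// both "complete" and "done" render as a checked box.
const checkboxFor = (status: string): string =>
  status === "complete" || status === "done" ? "[x]" : "[ ]";

// Line format matching the /\[x\]\s+\*\*T01:/ assertions above.
const planLine = (status: string, id: string, title: string): string =>
  `${checkboxFor(status)} **${id}:** ${title}`;
```

Treating both terminal statuses as checked mirrors the `isClosed`-style comparisons the handlers below use for their guards.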

View file

@ -11,7 +11,9 @@ import { mkdirSync } from "node:fs";
import {
transaction,
getMilestone,
getMilestoneSlices,
getSliceTasks,
_getAdapter,
} from "../gsd-db.js";
import { resolveMilestonePath, clearPathCache } from "../paths.js";
@ -34,6 +36,10 @@ export interface CompleteMilestoneParams {
lessonsLearned: string[];
followUps: string;
deviations: string;
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface CompleteMilestoneResult {
@ -111,6 +117,15 @@ export async function handleCompleteMilestone(
return { error: "title is required and must be a non-empty string" };
}
// ── State machine preconditions ─────────────────────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `milestone ${params.milestoneId} is already complete` };
}
// ── Verify all slices are complete ───────────────────────────────────────
const slices = getMilestoneSlices(params.milestoneId);
if (slices.length === 0) {
@ -123,6 +138,16 @@ export async function handleCompleteMilestone(
return { error: `incomplete slices: ${incompleteIds}` };
}
// ── Deep check: verify all tasks in all slices are complete ──────────────
for (const slice of slices) {
const tasks = getSliceTasks(params.milestoneId, slice.id);
const incompleteTasks = tasks.filter(t => t.status !== "complete" && t.status !== "done");
if (incompleteTasks.length > 0) {
const ids = incompleteTasks.map(t => `${t.id} (status: ${t.status})`).join(", ");
return { error: `slice ${slice.id} has incomplete tasks: ${ids}` };
}
}
// ── DB writes inside a transaction ──────────────────────────────────────
const completedAt = new Date().toISOString();
@ -181,6 +206,8 @@ export async function handleCompleteMilestone(
params: { milestoneId: params.milestoneId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

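The deep check added to complete-milestone walks every slice's task rows instead of trusting slice status alone. A self-contained sketch of that filter, with illustrative types standing in for the real DB rows:

```typescript
// Sketch of the deep-completion walk: a slice's own "complete" status is
// not trusted; every task row is checked. Types are illustrative.
type Task = { id: string; status: string };

const isClosed = (s: string): boolean => s === "complete" || s === "done";

function deepCheck(tasksBySlice: Record<string, Task[]>): string | null {
  for (const [sliceId, tasks] of Object.entries(tasksBySlice)) {
    const open = tasks.filter((t) => !isClosed(t.status));
    if (open.length > 0) {
      const ids = open.map((t) => `${t.id} (status: ${t.status})`).join(", ");
      return `slice ${sliceId} has incomplete tasks: ${ids}`;
    }
  }
  return null; // every task in every slice is closed
}
```

Returning the first offending slice keeps the error message small while still naming each open task and its status.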
View file

@ -15,11 +15,14 @@ import {
transaction,
insertMilestone,
insertSlice,
getSlice,
getSliceTasks,
getMilestone,
updateSliceStatus,
_getAdapter,
} from "../gsd-db.js";
import { resolveSliceFile, resolveSlicePath, clearPathCache } from "../paths.js";
import { checkOwnership, sliceUnitKey } from "../unit-ownership.js";
import { saveFile, clearParseCache } from "../files.js";
import { invalidateStateCache } from "../state.js";
import { renderRoadmapCheckboxes } from "../markdown-renderer.js";
@ -203,6 +206,33 @@ export async function handleCompleteSlice(
return { error: "milestoneId is required and must be a non-empty string" };
}
// ── State machine preconditions ─────────────────────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `cannot complete slice in a closed milestone: ${params.milestoneId} (status: ${milestone.status})` };
}
const slice = getSlice(params.milestoneId, params.sliceId);
if (!slice) {
return { error: `slice not found: ${params.milestoneId}/${params.sliceId}` };
}
if (slice.status === "complete" || slice.status === "done") {
return { error: `slice ${params.sliceId} is already complete — use gsd_slice_reopen first if you need to redo it` };
}
// ── Ownership check (opt-in: only enforced when claim file exists) ──────
const ownershipErr = checkOwnership(
basePath,
sliceUnitKey(params.milestoneId, params.sliceId),
params.actorName,
);
if (ownershipErr) {
return { error: ownershipErr };
}
// ── Verify all tasks are complete ───────────────────────────────────────
const tasks = getSliceTasks(params.milestoneId, params.sliceId);
if (tasks.length === 0) {
@ -303,6 +333,8 @@ export async function handleCompleteSlice(
params: { milestoneId: params.milestoneId, sliceId: params.sliceId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -17,9 +17,13 @@ import {
insertSlice,
insertTask,
insertVerificationEvidence,
getMilestone,
getSlice,
getTask,
_getAdapter,
} from "../gsd-db.js";
import { resolveSliceFile, resolveTasksDir, clearPathCache } from "../paths.js";
import { checkOwnership, taskUnitKey } from "../unit-ownership.js";
import { saveFile, clearParseCache } from "../files.js";
import { invalidateStateCache } from "../state.js";
import { renderPlanCheckboxes } from "../markdown-renderer.js";
@ -134,6 +138,38 @@ export async function handleCompleteTask(
return { error: "milestoneId is required and must be a non-empty string" };
}
// ── State machine preconditions ─────────────────────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `cannot complete task in a closed milestone: ${params.milestoneId} (status: ${milestone.status})` };
}
const slice = getSlice(params.milestoneId, params.sliceId);
if (!slice) {
return { error: `slice not found: ${params.milestoneId}/${params.sliceId}` };
}
if (slice.status === "complete" || slice.status === "done") {
return { error: `cannot complete task in a closed slice: ${params.sliceId} (status: ${slice.status})` };
}
const existingTask = getTask(params.milestoneId, params.sliceId, params.taskId);
if (existingTask && (existingTask.status === "complete" || existingTask.status === "done")) {
return { error: `task ${params.taskId} is already complete — use gsd_task_reopen first if you need to redo it` };
}
// ── Ownership check (opt-in: only enforced when claim file exists) ──────
const ownershipErr = checkOwnership(
basePath,
taskUnitKey(params.milestoneId, params.sliceId, params.taskId),
params.actorName,
);
if (ownershipErr) {
return { error: ownershipErr };
}
// ── DB writes inside a transaction ──────────────────────────────────────
const completedAt = new Date().toISOString();
@ -248,6 +284,8 @@ export async function handleCompleteTask(
params: { milestoneId: params.milestoneId, sliceId: params.sliceId, taskId: params.taskId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -1,6 +1,7 @@
import { clearParseCache } from "../files.js";
import {
transaction,
getMilestone,
insertMilestone,
insertSlice,
upsertMilestonePlanning,
@ -31,6 +32,10 @@ export interface PlanMilestoneParams {
title: string;
status?: string;
dependsOn?: string[];
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
vision: string;
successCriteria: string[];
keyRisks: Array<{ risk: string; whyItMatters: string }>;
@ -184,6 +189,25 @@ export async function handlePlanMilestone(
return { error: `validation failed: ${(err as Error).message}` };
}
// ── State machine preconditions ─────────────────────────────────────────
const existingMilestone = getMilestone(params.milestoneId);
if (existingMilestone && (existingMilestone.status === "complete" || existingMilestone.status === "done")) {
return { error: `cannot re-plan milestone ${params.milestoneId}: it is already complete` };
}
// Validate depends_on: all dependencies must exist and be complete
if (params.dependsOn && params.dependsOn.length > 0) {
for (const depId of params.dependsOn) {
const dep = getMilestone(depId);
if (!dep) {
return { error: `depends_on references unknown milestone: ${depId}` };
}
if (dep.status !== "complete" && dep.status !== "done") {
return { error: `depends_on milestone ${depId} is not yet complete (status: ${dep.status})` };
}
}
}
try {
transaction(() => {
insertMilestone({
@ -254,6 +278,8 @@ export async function handlePlanMilestone(
params: { milestoneId: params.milestoneId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -1,6 +1,7 @@
import { clearParseCache } from "../files.js";
import {
transaction,
getMilestone,
getSlice,
insertTask,
upsertSlicePlanning,
@ -35,6 +36,10 @@ export interface PlanSliceParams {
integrationClosure: string;
observabilityImpact: string;
tasks: PlanSliceTaskInput[];
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface PlanSliceResult {
@ -139,10 +144,21 @@ export async function handlePlanSlice(
return { error: `validation failed: ${(err as Error).message}` };
}
const parentMilestone = getMilestone(params.milestoneId);
if (!parentMilestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (parentMilestone.status === "complete" || parentMilestone.status === "done") {
return { error: `cannot plan slice in a closed milestone: ${params.milestoneId} (status: ${parentMilestone.status})` };
}
const parentSlice = getSlice(params.milestoneId, params.sliceId);
if (!parentSlice) {
return { error: `missing parent slice: ${params.milestoneId}/${params.sliceId}` };
}
if (parentSlice.status === "complete" || parentSlice.status === "done") {
return { error: `cannot re-plan slice ${params.sliceId}: it is already complete — use gsd_slice_reopen first` };
}
try {
transaction(() => {
@ -193,6 +209,8 @@ export async function handlePlanSlice(
params: { milestoneId: params.milestoneId, sliceId: params.sliceId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -19,6 +19,10 @@ export interface PlanTaskParams {
expectedOutput: string[];
observabilityImpact?: string;
fullPlanMd?: string;
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface PlanTaskResult {
@ -77,10 +81,18 @@ export async function handlePlanTask(
if (!parentSlice) {
return { error: `missing parent slice: ${params.milestoneId}/${params.sliceId}` };
}
if (parentSlice.status === "complete" || parentSlice.status === "done") {
return { error: `cannot plan task in a closed slice: ${params.sliceId} (status: ${parentSlice.status})` };
}
const existingTask = getTask(params.milestoneId, params.sliceId, params.taskId);
if (existingTask && (existingTask.status === "complete" || existingTask.status === "done")) {
return { error: `cannot re-plan task ${params.taskId}: it is already complete — use gsd_task_reopen first` };
}
try {
transaction(() => {
if (!getTask(params.milestoneId, params.sliceId, params.taskId)) {
if (!existingTask) {
insertTask({
id: params.taskId,
sliceId: params.sliceId,
@ -119,6 +131,8 @@ export async function handlePlanTask(
params: { milestoneId: params.milestoneId, sliceId: params.sliceId, taskId: params.taskId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -3,6 +3,7 @@ import {
transaction,
getMilestone,
getMilestoneSlices,
getSlice,
insertSlice,
updateSliceFields,
insertAssessment,
@ -33,6 +34,10 @@ export interface ReassessRoadmapParams {
added: SliceChangeInput[];
removed: string[];
};
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface ReassessRoadmapResult {
@ -99,11 +104,23 @@ export async function handleReassessRoadmap(
return { error: `validation failed: ${(err as Error).message}` };
}
// ── Verify milestone exists ───────────────────────────────────────
// ── Verify milestone exists and is active ────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `cannot reassess a closed milestone: ${params.milestoneId} (status: ${milestone.status})` };
}
// ── Verify completedSliceId is actually complete ──────────────────
const completedSlice = getSlice(params.milestoneId, params.completedSliceId);
if (!completedSlice) {
return { error: `completedSliceId not found: ${params.milestoneId}/${params.completedSliceId}` };
}
if (completedSlice.status !== "complete" && completedSlice.status !== "done") {
return { error: `completedSliceId ${params.completedSliceId} is not complete (status: ${completedSlice.status}) — reassess can only be called after a slice finishes` };
}
// ── Structural enforcement ────────────────────────────────────────
const existingSlices = getMilestoneSlices(params.milestoneId);
@ -203,6 +220,8 @@ export async function handleReassessRoadmap(
params: { milestoneId: params.milestoneId, completedSliceId: params.completedSliceId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -0,0 +1,113 @@
/**
 * reopen-slice handler — the core operation behind gsd_slice_reopen.
*
* Resets a completed slice back to "in_progress" and resets ALL of its
 * tasks back to "pending". This is intentional — if you're reopening a
* slice, you're re-doing the work. Partial resets create ambiguous state.
*
* The parent milestone must still be open (not complete).
*/
// GSD — reopen-slice tool handler
// Copyright (c) 2026 Jeremy McSpadden <jeremy@fluxlabs.net>
import {
getMilestone,
getSlice,
getSliceTasks,
updateSliceStatus,
updateTaskStatus,
transaction,
} from "../gsd-db.js";
import { invalidateStateCache } from "../state.js";
import { renderAllProjections } from "../workflow-projections.js";
import { writeManifest } from "../workflow-manifest.js";
import { appendEvent } from "../workflow-events.js";
export interface ReopenSliceParams {
milestoneId: string;
sliceId: string;
reason?: string;
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface ReopenSliceResult {
milestoneId: string;
sliceId: string;
tasksReset: number;
}
export async function handleReopenSlice(
params: ReopenSliceParams,
basePath: string,
): Promise<ReopenSliceResult | { error: string }> {
// ── Validate required fields ────────────────────────────────────────────
if (!params.sliceId || typeof params.sliceId !== "string" || params.sliceId.trim() === "") {
return { error: "sliceId is required and must be a non-empty string" };
}
if (!params.milestoneId || typeof params.milestoneId !== "string" || params.milestoneId.trim() === "") {
return { error: "milestoneId is required and must be a non-empty string" };
}
// ── State machine preconditions ─────────────────────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `cannot reopen slice inside a closed milestone: ${params.milestoneId} (status: ${milestone.status})` };
}
const slice = getSlice(params.milestoneId, params.sliceId);
if (!slice) {
return { error: `slice not found: ${params.milestoneId}/${params.sliceId}` };
}
if (slice.status !== "complete" && slice.status !== "done") {
return { error: `slice ${params.sliceId} is not complete (status: ${slice.status}) — nothing to reopen` };
}
// ── Reset slice + all tasks in a transaction ────────────────────────────
const tasks = getSliceTasks(params.milestoneId, params.sliceId);
transaction(() => {
updateSliceStatus(params.milestoneId, params.sliceId, "in_progress");
for (const task of tasks) {
updateTaskStatus(params.milestoneId, params.sliceId, task.id, "pending");
}
});
// ── Invalidate caches ────────────────────────────────────────────────────
invalidateStateCache();
// ── Post-mutation hook ───────────────────────────────────────────────────
try {
await renderAllProjections(basePath, params.milestoneId);
writeManifest(basePath);
appendEvent(basePath, {
cmd: "reopen-slice",
params: {
milestoneId: params.milestoneId,
sliceId: params.sliceId,
reason: params.reason ?? null,
tasksReset: tasks.length,
},
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(
`gsd: reopen-slice post-mutation hook warning: ${(hookErr as Error).message}\n`,
);
}
return {
milestoneId: params.milestoneId,
sliceId: params.sliceId,
tasksReset: tasks.length,
};
}

View file

@ -0,0 +1,115 @@
/**
 * reopen-task handler — the core operation behind gsd_task_reopen.
*
* Resets a completed task back to "pending" so it can be re-done
* without manual SQL surgery. The parent slice and milestone must
 * still be open (not complete) — you cannot reopen tasks inside a
* closed slice.
*/
// GSD — reopen-task tool handler
// Copyright (c) 2026 Jeremy McSpadden <jeremy@fluxlabs.net>
import {
getMilestone,
getSlice,
getTask,
updateTaskStatus,
} from "../gsd-db.js";
import { invalidateStateCache } from "../state.js";
import { renderAllProjections } from "../workflow-projections.js";
import { writeManifest } from "../workflow-manifest.js";
import { appendEvent } from "../workflow-events.js";
export interface ReopenTaskParams {
milestoneId: string;
sliceId: string;
taskId: string;
reason?: string;
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface ReopenTaskResult {
milestoneId: string;
sliceId: string;
taskId: string;
}
export async function handleReopenTask(
params: ReopenTaskParams,
basePath: string,
): Promise<ReopenTaskResult | { error: string }> {
// ── Validate required fields ────────────────────────────────────────────
if (!params.taskId || typeof params.taskId !== "string" || params.taskId.trim() === "") {
return { error: "taskId is required and must be a non-empty string" };
}
if (!params.sliceId || typeof params.sliceId !== "string" || params.sliceId.trim() === "") {
return { error: "sliceId is required and must be a non-empty string" };
}
if (!params.milestoneId || typeof params.milestoneId !== "string" || params.milestoneId.trim() === "") {
return { error: "milestoneId is required and must be a non-empty string" };
}
// ── State machine preconditions ─────────────────────────────────────────
const milestone = getMilestone(params.milestoneId);
if (!milestone) {
return { error: `milestone not found: ${params.milestoneId}` };
}
if (milestone.status === "complete" || milestone.status === "done") {
return { error: `cannot reopen task in a closed milestone: ${params.milestoneId} (status: ${milestone.status})` };
}
const slice = getSlice(params.milestoneId, params.sliceId);
if (!slice) {
return { error: `slice not found: ${params.milestoneId}/${params.sliceId}` };
}
if (slice.status === "complete" || slice.status === "done") {
return { error: `cannot reopen task inside a closed slice: ${params.sliceId} (status: ${slice.status}) — use gsd_slice_reopen first` };
}
const task = getTask(params.milestoneId, params.sliceId, params.taskId);
if (!task) {
return { error: `task not found: ${params.milestoneId}/${params.sliceId}/${params.taskId}` };
}
if (task.status !== "complete" && task.status !== "done") {
return { error: `task ${params.taskId} is not complete (status: ${task.status}) — nothing to reopen` };
}
// ── Reset task status ────────────────────────────────────────────────────
updateTaskStatus(params.milestoneId, params.sliceId, params.taskId, "pending");
// ── Invalidate caches ────────────────────────────────────────────────────
invalidateStateCache();
// ── Post-mutation hook ───────────────────────────────────────────────────
try {
await renderAllProjections(basePath, params.milestoneId);
writeManifest(basePath);
appendEvent(basePath, {
cmd: "reopen-task",
params: {
milestoneId: params.milestoneId,
sliceId: params.sliceId,
taskId: params.taskId,
reason: params.reason ?? null,
},
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(
`gsd: reopen-task post-mutation hook warning: ${(hookErr as Error).message}\n`,
);
}
return {
milestoneId: params.milestoneId,
sliceId: params.sliceId,
taskId: params.taskId,
};
}

View file

@ -35,6 +35,10 @@ export interface ReplanSliceParams {
whatChanged: string;
updatedTasks: ReplanSliceTaskInput[];
removedTaskIds: string[];
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
export interface ReplanSliceResult {
@ -86,11 +90,23 @@ export async function handleReplanSlice(
return { error: `validation failed: ${(err as Error).message}` };
}
// ── Verify parent slice exists ────────────────────────────────────
// ── Verify parent slice exists and is not closed ─────────────────
const parentSlice = getSlice(params.milestoneId, params.sliceId);
if (!parentSlice) {
return { error: `missing parent slice: ${params.milestoneId}/${params.sliceId}` };
}
if (parentSlice.status === "complete" || parentSlice.status === "done") {
return { error: `cannot replan a closed slice: ${params.sliceId} (status: ${parentSlice.status})` };
}
// ── Verify blocker task exists and is complete ────────────────────
const blockerTask = getTask(params.milestoneId, params.sliceId, params.blockerTaskId);
if (!blockerTask) {
return { error: `blockerTaskId not found: ${params.milestoneId}/${params.sliceId}/${params.blockerTaskId}` };
}
if (blockerTask.status !== "complete" && blockerTask.status !== "done") {
return { error: `blockerTaskId ${params.blockerTaskId} is not complete (status: ${blockerTask.status}) — the blocker task must be finished before a replan is triggered` };
}
// ── Structural enforcement ────────────────────────────────────────
const existingTasks = getSliceTasks(params.milestoneId, params.sliceId);
@ -195,6 +211,8 @@ export async function handleReplanSlice(
params: { milestoneId: params.milestoneId, sliceId: params.sliceId, blockerTaskId: params.blockerTaskId },
ts: new Date().toISOString(),
actor: "agent",
actor_name: params.actorName,
trigger_reason: params.triggerReason,
});
} catch (hookErr) {
process.stderr.write(

View file

@ -520,6 +520,10 @@ export interface CompleteTaskParams {
verdict: string;
durationMs: number;
}>;
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}
// ─── Complete Slice Params (gsd_complete_slice tool input) ───────────────
@ -548,4 +552,8 @@ export interface CompleteSliceParams {
requires: Array<{ slice: string; provides: string }>;
affects: string[];
drillDownPaths: string[];
/** Optional caller-provided identity for audit trail */
actorName?: string;
/** Optional caller-provided reason this action was triggered */
triggerReason?: string;
}

View file

@ -0,0 +1,104 @@
// GSD Extension — Unit Ownership
// Opt-in per-unit ownership claims for multi-agent safety.
//
// An agent can claim a unit (task, slice) before working on it.
// complete-task and complete-slice enforce ownership when claims exist.
// If no claim file is present, ownership is not enforced (backward compatible).
//
// Claim file location: .gsd/unit-claims.json
// Unit key format:
// task: "<milestoneId>/<sliceId>/<taskId>"
// slice: "<milestoneId>/<sliceId>"
//
// Copyright (c) 2026 Jeremy McSpadden <jeremy@fluxlabs.net>
import { existsSync, readFileSync, mkdirSync } from "node:fs";
import { join } from "node:path";
import { atomicWriteSync } from "./atomic-write.js";
// ─── Types ───────────────────────────────────────────────────────────────

export interface UnitClaim {
  agent: string;
  claimed_at: string;
}

type ClaimsMap = Record<string, UnitClaim>;

// ─── Key Builders ────────────────────────────────────────────────────────

export function taskUnitKey(milestoneId: string, sliceId: string, taskId: string): string {
  return `${milestoneId}/${sliceId}/${taskId}`;
}

export function sliceUnitKey(milestoneId: string, sliceId: string): string {
  return `${milestoneId}/${sliceId}`;
}

// ─── File Path ───────────────────────────────────────────────────────────

function claimsPath(basePath: string): string {
  return join(basePath, ".gsd", "unit-claims.json");
}

// ─── Read Claims ─────────────────────────────────────────────────────────

function readClaims(basePath: string): ClaimsMap | null {
  const path = claimsPath(basePath);
  if (!existsSync(path)) return null;
  try {
    return JSON.parse(readFileSync(path, "utf-8")) as ClaimsMap;
  } catch {
    return null;
  }
}

// ─── Public API ──────────────────────────────────────────────────────────

/**
 * Claim a unit for an agent.
 * Overwrites any existing claim for this unit (last writer wins).
 */
export function claimUnit(basePath: string, unitKey: string, agentName: string): void {
  const claims = readClaims(basePath) ?? {};
  claims[unitKey] = { agent: agentName, claimed_at: new Date().toISOString() };
  const dir = join(basePath, ".gsd");
  mkdirSync(dir, { recursive: true });
  atomicWriteSync(claimsPath(basePath), JSON.stringify(claims, null, 2) + "\n");
}

/**
 * Release a unit claim (remove it from the claims map).
 */
export function releaseUnit(basePath: string, unitKey: string): void {
  const claims = readClaims(basePath);
  if (!claims || !(unitKey in claims)) return;
  delete claims[unitKey];
  atomicWriteSync(claimsPath(basePath), JSON.stringify(claims, null, 2) + "\n");
}

/**
 * Get the current owner of a unit, or null if unclaimed / no claims file.
 */
export function getOwner(basePath: string, unitKey: string): string | null {
  const claims = readClaims(basePath);
  if (!claims) return null;
  return claims[unitKey]?.agent ?? null;
}

/**
 * Check if an actor is authorized to operate on a unit.
 * Returns null if ownership passes (or is unclaimed / no file).
 * Returns an error string if a different agent owns the unit.
 */
export function checkOwnership(
  basePath: string,
  unitKey: string,
  actorName: string | undefined,
): string | null {
  if (!actorName) return null; // no actor identity provided — opt-in, so allow
  const owner = getOwner(basePath, unitKey);
  if (owner === null) return null; // unit unclaimed or no claims file
  if (owner === actorName) return null; // actor is the owner
  return `Unit ${unitKey} is owned by ${owner}, not ${actorName}`;
}


@@ -1,8 +1,20 @@
-import { createHash } from "node:crypto";
+import { createHash, randomUUID } from "node:crypto";
import { appendFileSync, readFileSync, existsSync, mkdirSync } from "node:fs";
import { join } from "node:path";
import { atomicWriteSync } from "./atomic-write.js";
// ─── Session ID ───────────────────────────────────────────────────────────

/**
 * Engine-generated session ID stable for the lifetime of this process.
 * Agents can reference this to correlate all events from one run.
 */
const ENGINE_SESSION_ID: string = randomUUID();

export function getSessionId(): string {
  return ENGINE_SESSION_ID;
}
// ─── Event Types ─────────────────────────────────────────────────────────
export interface WorkflowEvent {
@@ -11,25 +23,32 @@ export interface WorkflowEvent {
  ts: string; // ISO 8601
  hash: string; // content hash (hex, 16 chars)
  actor: "agent" | "system";
  actor_name?: string; // e.g. "executor-agent-01" — caller-provided identity
  trigger_reason?: string; // e.g. "plan-phase complete" — caller-provided causation
  session_id: string; // engine-generated UUID, stable per process lifetime
}
// ─── appendEvent ─────────────────────────────────────────────────────────

/**
 * Append one event to .gsd/event-log.jsonl.
- * Computes a content hash from cmd+params (deterministic, independent of ts/actor).
+ * Computes a content hash from cmd+params (deterministic, independent of ts/actor/session).
 * Creates .gsd directory if needed.
 */
export function appendEvent(
  basePath: string,
-  event: Omit<WorkflowEvent, "hash">,
+  event: Omit<WorkflowEvent, "hash" | "session_id"> & { actor_name?: string; trigger_reason?: string },
): void {
  const hash = createHash("sha256")
    .update(JSON.stringify({ cmd: event.cmd, params: event.params }))
    .digest("hex")
    .slice(0, 16);
-  const fullEvent: WorkflowEvent = { ...event, hash };
+  const fullEvent: WorkflowEvent = {
+    ...event,
+    hash,
+    session_id: ENGINE_SESSION_ID,
+  };
  const dir = join(basePath, ".gsd");
  mkdirSync(dir, { recursive: true });
  appendFileSync(join(dir, "event-log.jsonl"), JSON.stringify(fullEvent) + "\n", "utf-8");
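The hash here covers only `cmd` and `params`, so two events that differ only in `ts`, `actor`, or `session_id` collapse to the same hash. A small self-contained sketch of that property (the command name and params are illustrative, not taken from the handler list):

```typescript
import { createHash } from "node:crypto";

// Same derivation as appendEvent: sha256 over cmd+params, first 16 hex chars
function contentHash(cmd: string, params: Record<string, unknown>): string {
  return createHash("sha256")
    .update(JSON.stringify({ cmd, params }))
    .digest("hex")
    .slice(0, 16);
}

const a = contentHash("complete-task", { taskId: "T01" });
const b = contentHash("complete-task", { taskId: "T01" }); // identical cmd+params, later moment
const c = contentHash("complete-task", { taskId: "T02" }); // different params
```

This is what makes the hash usable for spotting replayed or duplicate commands across sessions: timestamps and actor identity never perturb it.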


@@ -2,6 +2,7 @@
// Centralized warning/error accumulator for the workflow engine pipeline.
// Captures structured entries that the auto-loop can drain after each unit
// to surface root causes for stuck loops, silent degradation, and blocked writes.
// All entries are also persisted to .gsd/audit-log.jsonl for post-mortem analysis.
//
// Stderr policy: every logWarning/logError call writes immediately to stderr
// for terminal visibility. This is intentional — unlike debug-logger (which is
@@ -13,6 +14,9 @@
// the start of each unit to prevent log bleed between units running in the same
// Node process.
import { appendFileSync, readFileSync, existsSync, mkdirSync } from "node:fs";
import { join } from "node:path";
// ─── Types ──────────────────────────────────────────────────────────────
export type LogSeverity = "warn" | "error";
@@ -38,10 +42,20 @@ export interface LogEntry {
  context?: Record<string, string>;
}
-// ─── Buffer ─────────────────────────────────────────────────────────────
+// ─── Buffer & Persistent Audit ──────────────────────────────────────────
const MAX_BUFFER = 100;

let _buffer: LogEntry[] = [];
let _auditBasePath: string | null = null;

/**
 * Set the base path for persistent audit log writes.
 * Should be called once at engine init with the project root.
 * Until set, log entries are buffered in-memory only.
 */
export function setLogBasePath(basePath: string): void {
  _auditBasePath = basePath;
}
// ─── Public API ─────────────────────────────────────────────────────────
@@ -156,12 +170,36 @@ export function formatForNotification(entries: readonly LogEntry[]): string {
    .join("\n");
}
/**
 * Read all entries from the persistent audit log.
 * Returns empty array if no basePath is set or the file doesn't exist.
 */
export function readAuditLog(basePath?: string): LogEntry[] {
  const bp = basePath ?? _auditBasePath;
  if (!bp) return [];
  const auditPath = join(bp, ".gsd", "audit-log.jsonl");
  if (!existsSync(auditPath)) return [];
  try {
    const content = readFileSync(auditPath, "utf-8");
    return content
      .split("\n")
      .filter((l) => l.length > 0)
      .map((l) => {
        try { return JSON.parse(l) as LogEntry; } catch { return null; }
      })
      .filter((e): e is LogEntry => e !== null);
  } catch {
    return [];
  }
}
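The per-line try/catch above means one torn record (for example, a crash mid-append) drops only that line, not the whole log. A self-contained sketch of the same tolerant JSONL parse, over made-up sample content:

```typescript
interface AuditEntry { severity: "warn" | "error"; message: string; }

// Two valid JSONL records followed by a torn final line
const raw =
  '{"severity":"warn","message":"slow write"}\n' +
  '{"severity":"error","message":"blocked write"}\n' +
  '{"severity":"er';

const entries = raw
  .split("\n")
  .filter((l) => l.length > 0)
  .map((l) => { try { return JSON.parse(l) as AuditEntry; } catch { return null; } })
  .filter((e): e is AuditEntry => e !== null);
```

The `(e): e is AuditEntry` predicate is what lets TypeScript narrow the `(AuditEntry | null)[]` array to `AuditEntry[]` after the filter.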
/**
 * Reset buffer. Call at the start of each auto-loop unit to prevent log bleed
 * between units running in the same process. Also used in tests via _resetLogs().
 */
export function _resetLogs(): void {
  _buffer = [];
  _auditBasePath = null;
}
// ─── Internal ───────────────────────────────────────────────────────────
@@ -190,4 +228,16 @@ function _push(
  if (_buffer.length > MAX_BUFFER) {
    _buffer.shift();
  }
  // Persist to .gsd/audit-log.jsonl so entries survive context resets
  if (_auditBasePath) {
    try {
      const auditDir = join(_auditBasePath, ".gsd");
      mkdirSync(auditDir, { recursive: true });
      appendFileSync(join(auditDir, "audit-log.jsonl"), JSON.stringify(entry) + "\n", "utf-8");
    } catch (auditErr) {
      // Best-effort — never let audit write failures bubble up
      process.stderr.write(`[gsd:audit] failed to persist log entry: ${(auditErr as Error).message}\n`);
    }
  }
}


@@ -35,8 +35,8 @@ export function renderPlanContent(sliceRow: SliceRow, taskRows: TaskRow[]): stri
  lines.push("## Tasks");

  for (const task of taskRows) {
-    const checkbox = task.status === "done" ? "[x]" : "[ ]";
-    lines.push(`- ${checkbox} **${task.id}:** ${task.title} \u2014 ${task.description}`);
+    const checkbox = task.status === "done" || task.status === "complete" ? "[x]" : "[ ]";
+    lines.push(`- ${checkbox} **${task.id}: ${task.title}** \u2014 ${task.description}`);

    // Estimate subline (always present if non-empty)
    if (task.estimate) {
@@ -104,7 +104,7 @@ export function renderRoadmapContent(milestoneRow: MilestoneRow, sliceRows: Slic
  lines.push("|----|-------|------|---------|------|------------|");

  for (const slice of sliceRows) {
-    const done = slice.status === "done" ? "\u2705" : "\u2B1C";
+    const done = slice.status === "done" || slice.status === "complete" ? "\u2705" : "\u2B1C";

    // depends is already parsed to string[] by rowToSlice
    let depends = "\u2014";
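The "done" vs "complete" mismatch fixed in these two hunks can be checked in isolation. The status union below is an assumption (the full type isn't visible in this diff), but the two predicates mirror the pre- and post-fix conditions:

```typescript
type TaskStatus = "pending" | "done" | "complete";

// Pre-fix predicate: only "done" rendered a checked box
const checkboxBefore = (s: TaskStatus) => (s === "done" ? "[x]" : "[ ]");

// Post-fix predicate: the engine writes "complete", so accept both spellings
const checkboxAfter = (s: TaskStatus) =>
  s === "done" || s === "complete" ? "[x]" : "[ ]";

const wasBroken = checkboxBefore("complete"); // finished task rendered unchecked
const nowFixed = checkboxAfter("complete");
```

Accepting both spellings keeps old projections readable while matching what the v3 engine actually writes.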