refactor: rebrand gsd_ tool names and references to sf_ namespace

Updates workflow tool names, documentation references, and internal naming conventions across the MCP server, CLI, tests, and web components to complete the singularity-forge rebrand from gsd to sf.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

parent a79d67f38b
commit 421fccd898
51 changed files with 422 additions and 422 deletions
@@ -58,19 +58,19 @@ Goal: add an MCP server surface for real SF workflow tools, distinct from the cu
 Preferred first-cut tool set:
 
-- `gsd_summary_save`
-- `gsd_decision_save`
-- `gsd_plan_milestone`
-- `gsd_plan_slice`
-- `gsd_plan_task`
-- `gsd_task_complete`
-- `gsd_slice_complete`
-- `gsd_complete_milestone`
-- `gsd_validate_milestone`
-- `gsd_replan_slice`
-- `gsd_reassess_roadmap`
-- `gsd_save_gate_result`
-- `gsd_milestone_status`
+- `sf_summary_save`
+- `sf_decision_save`
+- `sf_plan_milestone`
+- `sf_plan_slice`
+- `sf_plan_task`
+- `sf_task_complete`
+- `sf_slice_complete`
+- `sf_complete_milestone`
+- `sf_validate_milestone`
+- `sf_replan_slice`
+- `sf_reassess_roadmap`
+- `sf_save_gate_result`
+- `sf_milestone_status`
 
 Likely files:
@@ -129,7 +129,7 @@ Expected work:
 Exit criteria:
 
 - Claude Code session can discover the SF workflow MCP tools
-- task execution path can call `gsd_task_complete` successfully
+- task execution path can call `sf_task_complete` successfully
 
 ### 5. Capability Detection and Failure Path
@@ -179,7 +179,7 @@ Exit criteria:
 Scope:
 
-- extract shared logic for `gsd_summary_save`, `gsd_task_complete`, and `gsd_milestone_status`
+- extract shared logic for `sf_summary_save`, `sf_task_complete`, and `sf_milestone_status`
 - prove native wrappers still work
 
 Why first:
@@ -212,7 +212,7 @@ Scope:
 Verification:
 
-- Claude Code can call `gsd_task_complete`
+- Claude Code can call `sf_task_complete`
 - summary file, DB state, and plan checkbox update correctly
 
 ## Phase 4: Expand to Full Minimum Workflow Set
@@ -329,7 +329,7 @@ ADR-008 is considered implemented when:
 Start with a narrow spike:
 
-1. Extract shared handlers for `gsd_summary_save`, `gsd_task_complete`, and `gsd_milestone_status`.
+1. Extract shared handlers for `sf_summary_save`, `sf_task_complete`, and `sf_milestone_status`.
 2. Expose those tools through a minimal workflow MCP server.
 3. Attach that MCP server to Claude Code sessions.
 4. Prove end-to-end task completion on a fixture project.

@@ -18,23 +18,23 @@ This split is now creating a real provider compatibility problem.
 The core SF workflow tools are internal extension tools. Examples include:
 
-- `gsd_summary_save`
-- `gsd_plan_milestone`
-- `gsd_plan_slice`
-- `gsd_plan_task`
-- `gsd_task_complete` / `gsd_complete_task`
-- `gsd_slice_complete`
-- `gsd_complete_milestone`
-- `gsd_validate_milestone`
-- `gsd_replan_slice`
-- `gsd_reassess_roadmap`
+- `sf_summary_save`
+- `sf_plan_milestone`
+- `sf_plan_slice`
+- `sf_plan_task`
+- `sf_task_complete` / `sf_complete_task`
+- `sf_slice_complete`
+- `sf_complete_milestone`
+- `sf_validate_milestone`
+- `sf_replan_slice`
+- `sf_reassess_roadmap`
 
 These are registered in `src/resources/extensions/sf/bootstrap/db-tools.ts` and related bootstrap files. SF prompts assume these tools are available during discuss, plan, and execute flows.
 
 Separately, `packages/mcp-server/src/server.ts` exposes a different tool surface:
 
-- session control: `gsd_execute`, `gsd_status`, `gsd_result`, `gsd_cancel`, `gsd_query`, `gsd_resolve_blocker`
-- read-only inspection: `gsd_progress`, `gsd_roadmap`, `gsd_history`, `gsd_doctor`, `gsd_captures`, `gsd_knowledge`
+- session control: `sf_execute`, `sf_status`, `sf_result`, `sf_cancel`, `sf_query`, `sf_resolve_blocker`
+- read-only inspection: `sf_progress`, `sf_roadmap`, `sf_history`, `sf_doctor`, `sf_captures`, `sf_knowledge`
 
 That MCP server is useful, but it is **not** a transport for the internal workflow/mutation tools.
@@ -44,7 +44,7 @@ The Claude Code CLI provider uses the Anthropic Agent SDK through `src/resources
 As a result:
 
-- prompts tell the model to call tools like `gsd_complete_task`
+- prompts tell the model to call tools like `sf_complete_task`
 - the tools exist in SF
 - but Claude Code sessions do not actually receive those tools
@@ -92,19 +92,19 @@ SF will expose the workflow tools required for discuss, planning, execution, and
 Initial minimum set:
 
-- `gsd_summary_save`
-- `gsd_decision_save`
-- `gsd_plan_milestone`
-- `gsd_plan_slice`
-- `gsd_plan_task`
-- `gsd_task_complete`
-- `gsd_slice_complete`
-- `gsd_complete_milestone`
-- `gsd_validate_milestone`
-- `gsd_replan_slice`
-- `gsd_reassess_roadmap`
-- `gsd_save_gate_result`
-- selected read/query tools such as `gsd_milestone_status`
+- `sf_summary_save`
+- `sf_decision_save`
+- `sf_plan_milestone`
+- `sf_plan_slice`
+- `sf_plan_task`
+- `sf_task_complete`
+- `sf_slice_complete`
+- `sf_complete_milestone`
+- `sf_validate_milestone`
+- `sf_replan_slice`
+- `sf_reassess_roadmap`
+- `sf_save_gate_result`
+- selected read/query tools such as `sf_milestone_status`
 
 Aliases should be treated conservatively. MCP should prefer canonical names unless compatibility requires exposing aliases.
@@ -197,11 +197,11 @@ Refactor workflow tools so MCP and native registration can call the same transpo
 Priority targets:
 
-- `gsd_summary_save`
-- `gsd_task_complete`
-- `gsd_plan_milestone`
-- `gsd_plan_slice`
-- `gsd_plan_task`
+- `sf_summary_save`
+- `sf_task_complete`
+- `sf_plan_milestone`
+- `sf_plan_slice`
+- `sf_plan_task`
 
 ### Phase 2: Stand up the workflow-tool MCP server

@@ -865,7 +865,7 @@
 | native/crates/engine/src/ps.rs | Native/Rust Tools | Cross-platform process tree management |
 | native/crates/engine/src/clipboard.rs | Native/Rust Tools | Clipboard read/write for text and images |
 | native/crates/engine/src/json_parse.rs | Text Processing, Native/Rust Tools | Streaming JSON parser with partial recovery |
-| native/crates/engine/src/gsd_parser.rs | SF Workflow, Native/Rust Tools | .sf/ directory file parser (markdown, frontmatter) |
+| native/crates/engine/src/sf_parser.rs | SF Workflow, Native/Rust Tools | .sf/ directory file parser (markdown, frontmatter) |
 | native/crates/engine/src/ttsr.rs | TTSR, Native/Rust Tools | TTSR regex engine with compiled RegexSet |
 | native/crates/engine/src/stream_process.rs | Text Processing, Native/Rust Tools | Bash stream processor (UTF-8, ANSI strip, binary) |
 | native/crates/engine/src/xxhash.rs | Native/Rust Tools | xxHash32 for hashline edit tool |

@@ -130,7 +130,7 @@ You can also add this to `~/.claude/settings.json` under `mcpServers` to make SF
 **What's exposed:**
 
-The MCP server provides SF's full workflow tool surface — milestone planning, task completion, slice management, roadmap reassessment, journal queries, and more. Session management tools (`gsd_execute`, `gsd_status`, `gsd_result`, `gsd_cancel`) let Claude Code start and monitor SF auto-mode sessions. See [Commands → MCP Server Mode](./commands.md#mcp-server-mode) for the full tool list.
+The MCP server provides SF's full workflow tool surface — milestone planning, task completion, slice management, roadmap reassessment, journal queries, and more. Session management tools (`sf_execute`, `sf_status`, `sf_result`, `sf_cancel`) let Claude Code start and monitor SF auto-mode sessions. See [Commands → MCP Server Mode](./commands.md#mcp-server-mode) for the full tool list.
 
 **Verify the connection:**
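For reference, a minimal sketch of what such a `~/.claude/settings.json` entry might look like. The server key, command, and arguments are assumptions for illustration; the actual launch command may differ:

```json
{
  "mcpServers": {
    "sf": {
      "command": "sf",
      "args": ["mcp"]
    }
  }
}
```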
@@ -344,7 +344,7 @@ Doctor rebuilds `STATE.md` from plan and roadmap files on disk and fixes detecte
 ### "SF database is not available"
 
-**Symptoms:** `gsd_decision_save` (or its alias `gsd_save_decision`), `gsd_requirement_update` (or `gsd_update_requirement`), or `gsd_summary_save` (or `gsd_save_summary`) fail with this error.
+**Symptoms:** `sf_decision_save` (or its alias `sf_save_decision`), `sf_requirement_update` (or `sf_update_requirement`), or `sf_summary_save` (or `sf_save_summary`) fail with this error.
 
 **Cause:** The SQLite database wasn't initialized. This happens in manual `/sf` sessions (non-auto mode) on versions before v2.29.

@@ -133,7 +133,7 @@ SF detects your local Claude Code installation and uses it as an authenticated Anthro
 **What's exposed**
 
-The MCP server exposes SF's full workflow tool surface: milestone planning, task completion, slice management, roadmap reassessment, journal queries, and more. Session management tools (`gsd_execute`, `gsd_status`, `gsd_result`, `gsd_cancel`) let Claude Code start and monitor SF auto-mode sessions. See [Commands → MCP Server Mode](./commands.md#mcp-server-mode) for the full tool list.
+The MCP server exposes SF's full workflow tool surface: milestone planning, task completion, slice management, roadmap reassessment, journal queries, and more. Session management tools (`sf_execute`, `sf_status`, `sf_result`, `sf_cancel`) let Claude Code start and monitor SF auto-mode sessions. See [Commands → MCP Server Mode](./commands.md#mcp-server-mode) for the full tool list.
 
 **Verify the connection**

@@ -361,7 +361,7 @@ Doctor rebuilds `STATE.md` from the plan and roadmap files on disk and fixes
 ### "SF database is not available"
 
-**Symptoms:** `gsd_decision_save` (and its alias `gsd_save_decision`), `gsd_requirement_update` (and its alias `gsd_update_requirement`), or `gsd_summary_save` (and its alias `gsd_save_summary`) fail with this error.
+**Symptoms:** `sf_decision_save` (and its alias `sf_save_decision`), `sf_requirement_update` (and its alias `sf_update_requirement`), or `sf_summary_save` (and its alias `sf_save_summary`) fail with this error.
 
 **Cause:** The SQLite database was not initialized. This occurs in manual `/sf` sessions (non-auto mode) on versions before v2.29.

@@ -79,35 +79,35 @@ Add to `.cursor/mcp.json`:
 The workflow MCP surface includes:
 
-- `gsd_decision_save`
-- `gsd_save_decision`
-- `gsd_requirement_update`
-- `gsd_update_requirement`
-- `gsd_requirement_save`
-- `gsd_save_requirement`
-- `gsd_milestone_generate_id`
-- `gsd_generate_milestone_id`
-- `gsd_plan_milestone`
-- `gsd_plan_slice`
-- `gsd_plan_task`
-- `gsd_task_plan`
-- `gsd_replan_slice`
-- `gsd_slice_replan`
-- `gsd_task_complete`
-- `gsd_complete_task`
-- `gsd_slice_complete`
-- `gsd_complete_slice`
-- `gsd_skip_slice`
-- `gsd_validate_milestone`
-- `gsd_milestone_validate`
-- `gsd_complete_milestone`
-- `gsd_milestone_complete`
-- `gsd_reassess_roadmap`
-- `gsd_roadmap_reassess`
-- `gsd_save_gate_result`
-- `gsd_summary_save`
-- `gsd_milestone_status`
-- `gsd_journal_query`
+- `sf_decision_save`
+- `sf_save_decision`
+- `sf_requirement_update`
+- `sf_update_requirement`
+- `sf_requirement_save`
+- `sf_save_requirement`
+- `sf_milestone_generate_id`
+- `sf_generate_milestone_id`
+- `sf_plan_milestone`
+- `sf_plan_slice`
+- `sf_plan_task`
+- `sf_task_plan`
+- `sf_replan_slice`
+- `sf_slice_replan`
+- `sf_task_complete`
+- `sf_complete_task`
+- `sf_slice_complete`
+- `sf_complete_slice`
+- `sf_skip_slice`
+- `sf_validate_milestone`
+- `sf_milestone_validate`
+- `sf_complete_milestone`
+- `sf_milestone_complete`
+- `sf_reassess_roadmap`
+- `sf_roadmap_reassess`
+- `sf_save_gate_result`
+- `sf_summary_save`
+- `sf_milestone_status`
+- `sf_journal_query`
 
 These tools use the same SF workflow handlers as the native in-process tool path wherever a shared handler exists.
@@ -126,7 +126,7 @@ Current support boundary:
 If the executor bridge cannot be loaded, workflow mutation calls will fail with a precise configuration error instead of silently degrading.
 
-### `gsd_execute`
+### `sf_execute`
 
 Start a SF auto-mode session for a project directory.
@@ -139,13 +139,13 @@ Start a SF auto-mode session for a project directory.
 **Returns:** `{ sessionId, status: "started" }`
 
-### `gsd_status`
+### `sf_status`
 
 Poll the current status of a running SF session.
 
 | Parameter | Type | Required | Description |
 |-----------|------|----------|-------------|
-| `sessionId` | `string` | ✅ | Session ID from `gsd_execute` |
+| `sessionId` | `string` | ✅ | Session ID from `sf_execute` |
 
 **Returns:**
@@ -160,13 +160,13 @@ Poll the current status of a running SF session.
 }
 ```
 
-### `gsd_result`
+### `sf_result`
 
 Get the accumulated result of a session. Works for both running (partial) and completed sessions.
 
 | Parameter | Type | Required | Description |
 |-----------|------|----------|-------------|
-| `sessionId` | `string` | ✅ | Session ID from `gsd_execute` |
+| `sessionId` | `string` | ✅ | Session ID from `sf_execute` |
 
 **Returns:**
@@ -183,17 +183,17 @@ Get the accumulated result of a session. Works for both running (partial) and co
 }
 ```
 
-### `gsd_cancel`
+### `sf_cancel`
 
 Cancel a running session. Aborts the current operation and stops the agent process.
 
 | Parameter | Type | Required | Description |
 |-----------|------|----------|-------------|
-| `sessionId` | `string` | ✅ | Session ID from `gsd_execute` |
+| `sessionId` | `string` | ✅ | Session ID from `sf_execute` |
 
 **Returns:** `{ cancelled: true }`
 
-### `gsd_query`
+### `sf_query`
 
 Query SF project state from the filesystem without an active session. Returns STATE.md, PROJECT.md, requirements, and milestone listing.
@@ -216,13 +216,13 @@ Query SF project state from the filesystem without an active session. Returns ST
 }
 ```
 
-### `gsd_resolve_blocker`
+### `sf_resolve_blocker`
 
 Resolve a pending blocker in a session by sending a response to the blocked UI request.
 
 | Parameter | Type | Required | Description |
 |-----------|------|----------|-------------|
-| `sessionId` | `string` | ✅ | Session ID from `gsd_execute` |
+| `sessionId` | `string` | ✅ | Session ID from `sf_execute` |
 | `response` | `string` | ✅ | Response to send for the pending blocker |
 
 **Returns:** `{ resolved: true }`

@@ -584,13 +584,13 @@ describe('createMcpServer tool registration', () => {
     assert.ok(typeof server.close === 'function');
   });
 
-  it('gsd_execute flow returns sessionId on success', async () => {
+  it('sf_execute flow returns sessionId on success', async () => {
     const sessionId = await sm.startSession('/tmp/tool-exec', { cliPath: '/usr/bin/sf' });
     assert.equal(typeof sessionId, 'string');
     assert.ok(sessionId.length > 0);
   });
 
-  it('gsd_status flow returns correct shape', async () => {
+  it('sf_status flow returns correct shape', async () => {
     const sessionId = await sm.startSession('/tmp/tool-status', { cliPath: '/usr/bin/sf' });
     const session = sm.getSession(sessionId)!;
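A mechanical rename like this is easy to leave half-done across 51 files. A small sketch of a consistency check that flags leftover `gsd_`-prefixed identifiers in source text; this helper is hypothetical and not part of the commit:

```typescript
// Hypothetical helper (not from this commit): report any leftover
// gsd_-prefixed identifiers in a chunk of source text, so a rename
// commit like this one can be verified as complete.
function findLeftoverGsdNames(source: string): string[] {
  const matches = source.match(/\bgsd_[a-z_]+/g) ?? [];
  return [...new Set(matches)]; // de-duplicate, preserve first-seen order
}

const sample = "it('sf_execute flow', ...) // was: it('gsd_execute flow', ...)";
console.log(findLeftoverGsdNames(sample)); // → [ 'gsd_execute' ]
```

Running a check like this over the tree (or wiring it into CI) catches prompts, comments, and test titles that a type-checker never sees.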
@@ -600,7 +600,7 @@ describe('createMcpServer tool registration', () => {
     assert.equal(typeof session.startTime, 'number');
   });
 
-  it('gsd_resolve_blocker flow returns error when no blocker', async () => {
+  it('sf_resolve_blocker flow returns error when no blocker', async () => {
     const sessionId = await sm.startSession('/tmp/tool-resolve', { cliPath: '/usr/bin/sf' });
     await assert.rejects(
       () => sm.resolveBlocker(sessionId, 'fix'),
@@ -611,7 +611,7 @@ describe('createMcpServer tool registration', () => {
     );
   });
 
-  it('gsd_result flow returns HeadlessJsonResult shape', async () => {
+  it('sf_result flow returns HeadlessJsonResult shape', async () => {
     const sessionId = await sm.startSession('/tmp/tool-result', { cliPath: '/usr/bin/sf' });
     const result = sm.getResult(sessionId);
@@ -625,7 +625,7 @@ describe('createMcpServer tool registration', () => {
     assert.ok('error' in result);
   });
 
-  it('gsd_cancel flow marks session as cancelled', async () => {
+  it('sf_cancel flow marks session as cancelled', async () => {
     const sessionId = await sm.startSession('/tmp/tool-cancel', { cliPath: '/usr/bin/sf' });
     await sm.cancelSession(sessionId);
     const session = sm.getSession(sessionId)!;

@@ -189,7 +189,7 @@ export function runDoctorLite(projectDir: string, scope?: string): DoctorResult
     ok: true,
     issues: [{
       severity: 'info',
-      code: 'no_gsd_directory',
+      code: 'no_sf_directory',
       scope: 'project',
       unitId: '',
       message: 'No .sf/ directory found — project not initialized',

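The renamed issue code has to stay in sync between the doctor implementation and its tests. A sketch of the issue object as it appears after the rename; the `DoctorIssue` shape is inferred from the visible diff context, and any field constraint beyond that is an assumption:

```typescript
// Sketch only: DoctorIssue shape inferred from the diff context above;
// anything not visible in the diff is an assumption.
interface DoctorIssue {
  severity: 'info' | 'warn' | 'error';
  code: string;
  scope: string;
  unitId: string;
  message: string;
}

function uninitializedProjectIssue(): DoctorIssue {
  return {
    severity: 'info',
    code: 'no_sf_directory', // renamed from 'no_gsd_directory'
    scope: 'project',
    unitId: '',
    message: 'No .sf/ directory found - project not initialized',
  };
}

console.log(uninitializedProjectIssue().code); // → no_sf_directory
```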
@@ -503,7 +503,7 @@ describe('runDoctorLite', () => {
     const empty = tmpProject();
     const result = runDoctorLite(empty);
     assert.equal(result.ok, true);
-    assert.equal(result.issues[0].code, 'no_gsd_directory');
+    assert.equal(result.issues[0].code, 'no_sf_directory');
     rmSync(empty, { recursive: true, force: true });
   });
 });

@@ -1,9 +1,9 @@
 /**
  * MCP Server — registers SF orchestration, project-state, and workflow tools.
  *
- * Session tools (6): gsd_execute, gsd_status, gsd_result, gsd_cancel, gsd_query, gsd_resolve_blocker
+ * Session tools (6): sf_execute, sf_status, sf_result, sf_cancel, sf_query, sf_resolve_blocker
  * Interactive tools (2): ask_user_questions, secure_env_collect via MCP form elicitation
- * Read-only tools (6): gsd_progress, gsd_roadmap, gsd_history, gsd_doctor, gsd_captures, gsd_knowledge
+ * Read-only tools (6): sf_progress, sf_roadmap, sf_history, sf_doctor, sf_captures, sf_knowledge
  * Workflow tools (29): headless-safe planning, metadata persistence, replanning, completion, validation, reassessment, gate result, status, and journal tools
  *
  * Uses dynamic imports for @modelcontextprotocol/sdk because TS Node16
@@ -54,7 +54,7 @@ function textContent(text: string): { content: Array<{ type: 'text'; text: strin
 }
 
 // ---------------------------------------------------------------------------
-// gsd_query filesystem reader
+// sf_query filesystem reader
 // ---------------------------------------------------------------------------
 
 /**
@@ -355,15 +355,15 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_execute — start a new SF auto-mode session.
+  // sf_execute — start a new SF auto-mode session.
   //
   // If the JSON-RPC request is aborted while the session is starting (or
   // immediately after), we cancel the session so we don't leak a background
   // RpcClient process. Once the session is running the caller should use
-  // `gsd_cancel` to stop it via sessionId.
+  // `sf_cancel` to stop it via sessionId.
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_execute',
+    'sf_execute',
     'Start a SF auto-mode session for a project directory. Returns a sessionId for tracking.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@@ -382,7 +382,7 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
       // newly-created session rather than leaving an orphaned process.
       if (extra?.signal?.aborted) {
         await sessionManager.cancelSession(sessionId).catch(() => { /* swallow */ });
-        return errorContent('gsd_execute aborted by client before returning');
+        return errorContent('sf_execute aborted by client before returning');
       }
 
       return jsonContent({ sessionId, status: 'started' });
@@ -393,13 +393,13 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_status — poll session status
+  // sf_status — poll session status
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_status',
+    'sf_status',
     'Get the current status of a SF session including progress, recent events, and pending blockers.',
     {
-      sessionId: z.string().describe('Session ID returned from gsd_execute'),
+      sessionId: z.string().describe('Session ID returned from sf_execute'),
     },
     async (args: Record<string, unknown>) => {
       const { sessionId } = args as { sessionId: string };
@@ -437,13 +437,13 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_result — get accumulated session result
+  // sf_result — get accumulated session result
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_result',
+    'sf_result',
     'Get the result of a SF session. Returns partial results if the session is still running.',
     {
-      sessionId: z.string().describe('Session ID returned from gsd_execute'),
+      sessionId: z.string().describe('Session ID returned from sf_execute'),
     },
     async (args: Record<string, unknown>) => {
       const { sessionId } = args as { sessionId: string };
@@ -457,13 +457,13 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_cancel — cancel a running session
+  // sf_cancel — cancel a running session
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_cancel',
+    'sf_cancel',
     'Cancel a running SF session. Aborts the current operation and stops the process.',
     {
-      sessionId: z.string().describe('Session ID returned from gsd_execute'),
+      sessionId: z.string().describe('Session ID returned from sf_execute'),
     },
     async (args: Record<string, unknown>) => {
       const { sessionId } = args as { sessionId: string };
@@ -477,7 +477,7 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_query — read project state from filesystem (no session needed).
+  // sf_query — read project state from filesystem (no session needed).
   //
   // `query` is optional: when omitted the tool returns all fields (STATE.md,
   // PROJECT.md, requirements, milestone listing). Accepted narrow values:
@@ -485,7 +485,7 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   // Unknown values fall back to "all" for forward-compatibility.
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_query',
+    'sf_query',
     'Query SF project state from the filesystem. By default returns STATE.md, PROJECT.md, requirements, and milestone listing. Pass `query` to narrow the response (accepted: "state"/"status", "project", "requirements", "milestones", "all"). Does not require an active session.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@@ -506,13 +506,13 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_resolve_blocker — resolve a pending blocker
+  // sf_resolve_blocker — resolve a pending blocker
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_resolve_blocker',
+    'sf_resolve_blocker',
     'Resolve a pending blocker in a SF session by sending a response to the UI request.',
     {
-      sessionId: z.string().describe('Session ID returned from gsd_execute'),
+      sessionId: z.string().describe('Session ID returned from sf_execute'),
       response: z.string().describe('Response to send for the pending blocker'),
     },
     async (args: Record<string, unknown>) => {
@@ -685,10 +685,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   // =======================================================================
 
   // -----------------------------------------------------------------------
-  // gsd_progress — structured project progress metrics
+  // sf_progress — structured project progress metrics
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_progress',
+    'sf_progress',
     'Get structured project progress: active milestone/slice/task, phase, completion counts, blockers, and next action. No session required — reads directly from .sf/ on disk.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@@ -704,10 +704,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_roadmap — milestone/slice/task structure with status
+  // sf_roadmap — milestone/slice/task structure with status
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_roadmap',
+    'sf_roadmap',
     'Get the full project roadmap structure: milestones with their slices, tasks, status, risk, and dependencies. Optionally filter to a single milestone. No session required.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@@ -724,10 +724,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_history — execution history with cost/token metrics
+  // sf_history — execution history with cost/token metrics
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_history',
+    'sf_history',
     'Get execution history with cost, token usage, model, and duration per unit. Returns totals across all units. No session required.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@@ -744,10 +744,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
   );
 
   // -----------------------------------------------------------------------
-  // gsd_doctor — lightweight structural health check
+  // sf_doctor — lightweight structural health check
   // -----------------------------------------------------------------------
   server.tool(
-    'gsd_doctor',
+    'sf_doctor',
     'Run a lightweight structural health check on the .sf/ directory. Checks for missing files, status inconsistencies, and orphaned state. No session required.',
     {
       projectDir: z.string().describe('Absolute path to the project directory'),
@ -764,10 +764,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
|
|||
);
|
||||
|
||||
// -----------------------------------------------------------------------
|
||||
// gsd_captures — pending captures and ideas
|
||||
// sf_captures — pending captures and ideas
|
||||
// -----------------------------------------------------------------------
|
||||
server.tool(
|
||||
'gsd_captures',
|
||||
'sf_captures',
|
||||
'Get captured ideas and thoughts from CAPTURES.md with triage status. Filter by pending, actionable, or all. No session required.',
|
||||
{
|
||||
projectDir: z.string().describe('Absolute path to the project directory'),
|
||||
|
|
@ -784,10 +784,10 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
|
|||
);
|
||||
|
||||
// -----------------------------------------------------------------------
|
||||
// gsd_knowledge — project knowledge base
|
||||
// sf_knowledge — project knowledge base
|
||||
// -----------------------------------------------------------------------
|
||||
server.tool(
|
||||
'gsd_knowledge',
|
||||
'sf_knowledge',
|
||||
'Get the project knowledge base: rules, patterns, and lessons learned accumulated during development. No session required.',
|
||||
{
|
||||
projectDir: z.string().describe('Absolute path to the project directory'),
|
||||
|
|
@ -803,7 +803,7 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
|
|||
);
|
||||
|
||||
// -----------------------------------------------------------------------
|
||||
// gsd_graph — knowledge graph for SF projects
|
||||
// sf_graph — knowledge graph for SF projects
|
||||
//
|
||||
// Modes:
|
||||
// build Parse .sf/ artifacts and write graph.json atomically.
|
||||
|
|
@ -812,7 +812,7 @@ export async function createMcpServer(sessionManager: SessionManager): Promise<{
|
|||
// diff Compare graph.json with the last build snapshot.
|
||||
// -----------------------------------------------------------------------
|
||||
server.tool(
|
||||
'gsd_graph',
|
||||
'sf_graph',
|
||||
[
|
||||
'Manage the SF project knowledge graph. No session required.',
|
||||
'',
|
||||
|
|
|
|||
|
|
@@ -76,12 +76,12 @@ describe("workflow MCP tools", () => {
     assert.deepEqual(server.tools.map((t) => t.name), [...WORKFLOW_TOOL_NAMES]);
   });

-  it("gsd_summary_save writes artifact through the shared executor", async () => {
+  it("sf_summary_save writes artifact through the shared executor", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const tool = server.tools.find((t) => t.name === "gsd_summary_save");
+      const tool = server.tools.find((t) => t.name === "sf_summary_save");
       assert.ok(tool, "summary tool should be registered");
       const originalCwd = process.cwd();

@@ -113,7 +113,7 @@ describe("workflow MCP tools", () => {
       process.env.SF_WORKFLOW_PROJECT_ROOT = base;
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const tool = server.tools.find((t) => t.name === "gsd_summary_save");
+      const tool = server.tools.find((t) => t.name === "sf_summary_save");
       assert.ok(tool, "summary tool should be registered");

       await assert.rejects(
@@ -147,7 +147,7 @@ describe("workflow MCP tools", () => {
       const { registerWorkflowTools: freshRegisterWorkflowTools } = await import(`./workflow-tools.ts?bad-module=${randomUUID()}`);
       const server = makeMockServer();
       freshRegisterWorkflowTools(server as any);
-      const tool = server.tools.find((t) => t.name === "gsd_summary_save");
+      const tool = server.tools.find((t) => t.name === "sf_summary_save");
       assert.ok(tool, "summary tool should be registered");

       await assert.rejects(
@@ -187,7 +187,7 @@ describe("workflow MCP tools", () => {

       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
       assert.ok(taskTool, "task tool should be registered");

       await assert.rejects(
@@ -220,7 +220,7 @@ describe("workflow MCP tools", () => {

       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
       assert.ok(taskTool, "task tool should be registered");

       await assert.rejects(
@@ -234,14 +234,14 @@ describe("workflow MCP tools", () => {
           narrative: "Did the work",
           verification: "npm test",
         }),
-        /planning tool .* not executes work|Cannot gsd_task_complete|Unknown tools are not permitted during queue mode/,
+        /planning tool .* not executes work|Cannot sf_task_complete|Unknown tools are not permitted during queue mode/,
       );
     } finally {
       cleanup(base);
     }
   });

-  it("gsd_task_complete and gsd_milestone_status work end-to-end", async () => {
+  it("sf_task_complete and sf_milestone_status work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       mkdirSync(join(base, ".sf", "milestones", "M001", "slices", "S01"), { recursive: true });
@@ -252,8 +252,8 @@ describe("workflow MCP tools", () => {

       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
-      const statusTool = server.tools.find((t) => t.name === "gsd_milestone_status");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
+      const statusTool = server.tools.find((t) => t.name === "sf_milestone_status");
       assert.ok(taskTool, "task tool should be registered");
       assert.ok(statusTool, "status tool should be registered");

@@ -286,7 +286,7 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_complete_task alias delegates to gsd_task_complete behavior", async () => {
+  it("sf_complete_task alias delegates to sf_task_complete behavior", async () => {
     const base = makeTmpBase();
     try {
       mkdirSync(join(base, ".sf", "milestones", "M002", "slices", "S02"), { recursive: true });
@@ -297,7 +297,7 @@ describe("workflow MCP tools", () => {

       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const aliasTool = server.tools.find((t) => t.name === "gsd_complete_task");
+      const aliasTool = server.tools.find((t) => t.name === "sf_complete_task");
       assert.ok(aliasTool, "task completion alias should be registered");

       const result = await aliasTool!.handler({
@@ -320,13 +320,13 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_plan_milestone and gsd_plan_slice work end-to-end", async () => {
+  it("sf_plan_milestone and sf_plan_slice work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");

@@ -384,12 +384,12 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_requirement_save opens the DB before inline requirement writes", async () => {
+  it("sf_requirement_save opens the DB before inline requirement writes", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const requirementTool = server.tools.find((t) => t.name === "gsd_requirement_save");
+      const requirementTool = server.tools.find((t) => t.name === "sf_requirement_save");
       assert.ok(requirementTool, "requirement tool should be registered");

       closeDatabase();
@@ -417,14 +417,14 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_plan_task reopens the DB before inline task planning writes", async () => {
+  it("sf_plan_task reopens the DB before inline task planning writes", async () => {
    const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
-      const taskTool = server.tools.find((t) => t.name === "gsd_plan_task");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
+      const taskTool = server.tools.find((t) => t.name === "sf_plan_task");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");
       assert.ok(taskTool, "task planning tool should be registered");
@@ -440,7 +440,7 @@ describe("workflow MCP tools", () => {
         title: "Inline task planning",
         risk: "medium",
         depends: [],
-        demo: "Inline gsd_plan_task reopens the DB after it was closed.",
+        demo: "Inline sf_plan_task reopens the DB after it was closed.",
         goal: "Preserve MCP task planning after the DB adapter is closed.",
         successCriteria: "The second task plan persists after a closed DB is reopened.",
         proofLevel: "integration",
@@ -494,16 +494,16 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_replan_slice and gsd_slice_replan work end-to-end", async () => {
+  it("sf_replan_slice and sf_slice_replan work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
-      const canonicalTool = server.tools.find((t) => t.name === "gsd_replan_slice");
-      const aliasTool = server.tools.find((t) => t.name === "gsd_slice_replan");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
+      const canonicalTool = server.tools.find((t) => t.name === "sf_replan_slice");
+      const aliasTool = server.tools.find((t) => t.name === "sf_slice_replan");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");
       assert.ok(taskTool, "task completion tool should be registered");
@@ -640,16 +640,16 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_slice_complete and gsd_complete_slice work end-to-end", async () => {
+  it("sf_slice_complete and sf_complete_slice work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
-      const canonicalTool = server.tools.find((t) => t.name === "gsd_slice_complete");
-      const aliasTool = server.tools.find((t) => t.name === "gsd_complete_slice");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
+      const canonicalTool = server.tools.find((t) => t.name === "sf_slice_complete");
+      const aliasTool = server.tools.find((t) => t.name === "sf_complete_slice");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");
       assert.ok(taskTool, "task completion tool should be registered");
@@ -788,17 +788,17 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_validate_milestone and gsd_milestone_complete work end-to-end", async () => {
+  it("sf_validate_milestone and sf_milestone_complete work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
-      const completeSliceTool = server.tools.find((t) => t.name === "gsd_slice_complete");
-      const validateTool = server.tools.find((t) => t.name === "gsd_validate_milestone");
-      const completeMilestoneAlias = server.tools.find((t) => t.name === "gsd_milestone_complete");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
+      const completeSliceTool = server.tools.find((t) => t.name === "sf_slice_complete");
+      const validateTool = server.tools.find((t) => t.name === "sf_validate_milestone");
+      const completeMilestoneAlias = server.tools.find((t) => t.name === "sf_milestone_complete");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");
       assert.ok(taskTool, "task completion tool should be registered");
@@ -899,18 +899,18 @@ describe("workflow MCP tools", () => {
     }
   });

-  it("gsd_reassess_roadmap, gsd_roadmap_reassess, and gsd_save_gate_result work end-to-end", async () => {
+  it("sf_reassess_roadmap, sf_roadmap_reassess, and sf_save_gate_result work end-to-end", async () => {
     const base = makeTmpBase();
     try {
       const server = makeMockServer();
       registerWorkflowTools(server as any);
-      const milestoneTool = server.tools.find((t) => t.name === "gsd_plan_milestone");
-      const sliceTool = server.tools.find((t) => t.name === "gsd_plan_slice");
-      const taskTool = server.tools.find((t) => t.name === "gsd_task_complete");
-      const completeSliceTool = server.tools.find((t) => t.name === "gsd_slice_complete");
-      const reassessTool = server.tools.find((t) => t.name === "gsd_reassess_roadmap");
-      const reassessAlias = server.tools.find((t) => t.name === "gsd_roadmap_reassess");
-      const gateTool = server.tools.find((t) => t.name === "gsd_save_gate_result");
+      const milestoneTool = server.tools.find((t) => t.name === "sf_plan_milestone");
+      const sliceTool = server.tools.find((t) => t.name === "sf_plan_slice");
+      const taskTool = server.tools.find((t) => t.name === "sf_task_complete");
+      const completeSliceTool = server.tools.find((t) => t.name === "sf_slice_complete");
+      const reassessTool = server.tools.find((t) => t.name === "sf_reassess_roadmap");
+      const reassessAlias = server.tools.find((t) => t.name === "sf_roadmap_reassess");
+      const gateTool = server.tools.find((t) => t.name === "sf_save_gate_result");
       assert.ok(milestoneTool, "milestone planning tool should be registered");
       assert.ok(sliceTool, "slice planning tool should be registered");
       assert.ok(taskTool, "task completion tool should be registered");
@@ -462,35 +462,35 @@ interface McpToolServer {
 }

 export const WORKFLOW_TOOL_NAMES = [
-  "gsd_decision_save",
-  "gsd_save_decision",
-  "gsd_requirement_update",
-  "gsd_update_requirement",
-  "gsd_requirement_save",
-  "gsd_save_requirement",
-  "gsd_milestone_generate_id",
-  "gsd_generate_milestone_id",
-  "gsd_plan_milestone",
-  "gsd_plan_slice",
-  "gsd_plan_task",
-  "gsd_task_plan",
-  "gsd_replan_slice",
-  "gsd_slice_replan",
-  "gsd_slice_complete",
-  "gsd_complete_slice",
-  "gsd_skip_slice",
-  "gsd_complete_milestone",
-  "gsd_milestone_complete",
-  "gsd_validate_milestone",
-  "gsd_milestone_validate",
-  "gsd_reassess_roadmap",
-  "gsd_roadmap_reassess",
-  "gsd_save_gate_result",
-  "gsd_summary_save",
-  "gsd_task_complete",
-  "gsd_complete_task",
-  "gsd_milestone_status",
-  "gsd_journal_query",
+  "sf_decision_save",
+  "sf_save_decision",
+  "sf_requirement_update",
+  "sf_update_requirement",
+  "sf_requirement_save",
+  "sf_save_requirement",
+  "sf_milestone_generate_id",
+  "sf_generate_milestone_id",
+  "sf_plan_milestone",
+  "sf_plan_slice",
+  "sf_plan_task",
+  "sf_task_plan",
+  "sf_replan_slice",
+  "sf_slice_replan",
+  "sf_slice_complete",
+  "sf_complete_slice",
+  "sf_skip_slice",
+  "sf_complete_milestone",
+  "sf_milestone_complete",
+  "sf_validate_milestone",
+  "sf_milestone_validate",
+  "sf_reassess_roadmap",
+  "sf_roadmap_reassess",
+  "sf_save_gate_result",
+  "sf_summary_save",
+  "sf_task_complete",
+  "sf_complete_task",
+  "sf_milestone_status",
+  "sf_journal_query",
 ] as const;

 async function runSerializedWorkflowOperation<T>(fn: () => Promise<T>): Promise<T> {
@@ -558,7 +558,7 @@ async function handleTaskComplete(
   projectDir: string,
   args: Omit<z.infer<typeof taskCompleteSchema>, "projectDir">,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_task_complete", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_task_complete", projectDir, args.milestoneId);
   const {
     taskId,
     sliceId,
@@ -599,7 +599,7 @@ async function handleSliceComplete(
   projectDir: string,
   args: z.infer<typeof sliceCompleteSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_slice_complete", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_slice_complete", projectDir, args.milestoneId);
   const { executeSliceComplete } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeSliceComplete(params, projectDir));
@@ -609,7 +609,7 @@ async function handleReplanSlice(
   projectDir: string,
   args: z.infer<typeof replanSliceSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_replan_slice", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_replan_slice", projectDir, args.milestoneId);
   const { executeReplanSlice } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeReplanSlice(params, projectDir));
@@ -619,7 +619,7 @@ async function handleCompleteMilestone(
   projectDir: string,
   args: z.infer<typeof completeMilestoneSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_complete_milestone", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_complete_milestone", projectDir, args.milestoneId);
   const { executeCompleteMilestone } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeCompleteMilestone(params, projectDir));
@@ -629,7 +629,7 @@ async function handleValidateMilestone(
   projectDir: string,
   args: z.infer<typeof validateMilestoneSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_validate_milestone", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_validate_milestone", projectDir, args.milestoneId);
   const { executeValidateMilestone } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeValidateMilestone(params, projectDir));
@@ -639,7 +639,7 @@ async function handleReassessRoadmap(
   projectDir: string,
   args: z.infer<typeof reassessRoadmapSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_reassess_roadmap", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_reassess_roadmap", projectDir, args.milestoneId);
   const { executeReassessRoadmap } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeReassessRoadmap(params, projectDir));
@@ -649,7 +649,7 @@ async function handleSaveGateResult(
   projectDir: string,
   args: z.infer<typeof saveGateResultSchema>,
 ): Promise<unknown> {
-  await enforceWorkflowWriteGate("gsd_save_gate_result", projectDir, args.milestoneId);
+  await enforceWorkflowWriteGate("sf_save_gate_result", projectDir, args.milestoneId);
   const { executeSaveGateResult } = await getWorkflowToolExecutors();
   const { projectDir: _projectDir, ...params } = args;
   return runSerializedWorkflowOperation(() => executeSaveGateResult(params, projectDir));
@@ -982,13 +982,13 @@ const journalQuerySchema = z.object(journalQueryParams);

 export function registerWorkflowTools(server: McpToolServer): void {
   server.tool(
-    "gsd_decision_save",
+    "sf_decision_save",
     "Record a project decision to the SF database and regenerate DECISIONS.md.",
     decisionSaveParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(decisionSaveSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_decision_save", projectDir);
+      await enforceWorkflowWriteGate("sf_decision_save", projectDir);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { saveDecisionToDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return saveDecisionToDb(params, projectDir);
@@ -998,13 +998,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_save_decision",
-    "Alias for gsd_decision_save. Record a project decision to the SF database and regenerate DECISIONS.md.",
+    "sf_save_decision",
+    "Alias for sf_decision_save. Record a project decision to the SF database and regenerate DECISIONS.md.",
     decisionSaveParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(decisionSaveSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_decision_save", projectDir);
+      await enforceWorkflowWriteGate("sf_decision_save", projectDir);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { saveDecisionToDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return saveDecisionToDb(params, projectDir);
@@ -1014,13 +1014,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_requirement_update",
+    "sf_requirement_update",
     "Update an existing requirement in the SF database and regenerate REQUIREMENTS.md.",
     requirementUpdateParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(requirementUpdateSchema, args);
       const { projectDir, id, ...updates } = parsed;
-      await enforceWorkflowWriteGate("gsd_requirement_update", projectDir);
+      await enforceWorkflowWriteGate("sf_requirement_update", projectDir);
       await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { updateRequirementInDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return updateRequirementInDb(id, updates, projectDir);
@@ -1030,13 +1030,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_update_requirement",
-    "Alias for gsd_requirement_update. Update an existing requirement in the SF database and regenerate REQUIREMENTS.md.",
+    "sf_update_requirement",
+    "Alias for sf_requirement_update. Update an existing requirement in the SF database and regenerate REQUIREMENTS.md.",
     requirementUpdateParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(requirementUpdateSchema, args);
       const { projectDir, id, ...updates } = parsed;
-      await enforceWorkflowWriteGate("gsd_requirement_update", projectDir);
+      await enforceWorkflowWriteGate("sf_requirement_update", projectDir);
       await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { updateRequirementInDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return updateRequirementInDb(id, updates, projectDir);
@@ -1046,13 +1046,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_requirement_save",
+    "sf_requirement_save",
     "Record a new requirement to the SF database and regenerate REQUIREMENTS.md.",
     requirementSaveParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(requirementSaveSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_requirement_save", projectDir);
+      await enforceWorkflowWriteGate("sf_requirement_save", projectDir);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { saveRequirementToDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return saveRequirementToDb(params, projectDir);
@@ -1062,13 +1062,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_save_requirement",
-    "Alias for gsd_requirement_save. Record a new requirement to the SF database and regenerate REQUIREMENTS.md.",
+    "sf_save_requirement",
+    "Alias for sf_requirement_save. Record a new requirement to the SF database and regenerate REQUIREMENTS.md.",
     requirementSaveParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(requirementSaveSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_requirement_save", projectDir);
+      await enforceWorkflowWriteGate("sf_requirement_save", projectDir);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { saveRequirementToDb } = await importLocalModule<any>("../../../src/resources/extensions/sf/db-writer.js");
         return saveRequirementToDb(params, projectDir);
@@ -1078,12 +1078,12 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_milestone_generate_id",
+    "sf_milestone_generate_id",
     "Generate the next milestone ID for a new SF milestone.",
     milestoneGenerateIdParams,
     async (args: Record<string, unknown>) => {
       const { projectDir } = parseWorkflowArgs(milestoneGenerateIdSchema, args);
-      await enforceWorkflowWriteGate("gsd_milestone_generate_id", projectDir);
+      await enforceWorkflowWriteGate("sf_milestone_generate_id", projectDir);
       const id = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const {
           claimReservedId,
@@ -1106,12 +1106,12 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_generate_milestone_id",
-    "Alias for gsd_milestone_generate_id. Generate the next milestone ID for a new SF milestone.",
+    "sf_generate_milestone_id",
+    "Alias for sf_milestone_generate_id. Generate the next milestone ID for a new SF milestone.",
     milestoneGenerateIdParams,
     async (args: Record<string, unknown>) => {
       const { projectDir } = parseWorkflowArgs(milestoneGenerateIdSchema, args);
-      await enforceWorkflowWriteGate("gsd_milestone_generate_id", projectDir);
+      await enforceWorkflowWriteGate("sf_milestone_generate_id", projectDir);
       const id = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const {
           claimReservedId,
@@ -1134,39 +1134,39 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_plan_milestone",
+    "sf_plan_milestone",
     "Write milestone planning state to the SF database and render ROADMAP.md from DB.",
     planMilestoneParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(planMilestoneSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_plan_milestone", projectDir, params.milestoneId);
+      await enforceWorkflowWriteGate("sf_plan_milestone", projectDir, params.milestoneId);
       const { executePlanMilestone } = await getWorkflowToolExecutors();
       return runSerializedWorkflowOperation(() => executePlanMilestone(params, projectDir));
     },
   );

   server.tool(
-    "gsd_plan_slice",
+    "sf_plan_slice",
     "Write slice/task planning state to the SF database and render plan artifacts from DB.",
     planSliceParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(planSliceSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_plan_slice", projectDir, params.milestoneId);
+      await enforceWorkflowWriteGate("sf_plan_slice", projectDir, params.milestoneId);
       const { executePlanSlice } = await getWorkflowToolExecutors();
       return runSerializedWorkflowOperation(() => executePlanSlice(params, projectDir));
     },
   );

   server.tool(
-    "gsd_plan_task",
+    "sf_plan_task",
     "Write task planning state to the SF database and render tasks/T##-PLAN.md from DB.",
     planTaskParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(planTaskSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_plan_task", projectDir, params.milestoneId);
+      await enforceWorkflowWriteGate("sf_plan_task", projectDir, params.milestoneId);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { handlePlanTask } = await importLocalModule<any>("../../../src/resources/extensions/sf/tools/plan-task.js");
         return handlePlanTask(params, projectDir);
@@ -1181,13 +1181,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
   );

   server.tool(
-    "gsd_task_plan",
-    "Alias for gsd_plan_task. Write task planning state to the SF database and render tasks/T##-PLAN.md from DB.",
+    "sf_task_plan",
+    "Alias for sf_plan_task. Write task planning state to the SF database and render tasks/T##-PLAN.md from DB.",
     planTaskParams,
     async (args: Record<string, unknown>) => {
       const parsed = parseWorkflowArgs(planTaskSchema, args);
       const { projectDir, ...params } = parsed;
-      await enforceWorkflowWriteGate("gsd_plan_task", projectDir, params.milestoneId);
+      await enforceWorkflowWriteGate("sf_plan_task", projectDir, params.milestoneId);
       const result = await runSerializedWorkflowDbOperation(projectDir, async () => {
         const { handlePlanTask } = await importLocalModule<any>("../../../src/resources/extensions/sf/tools/plan-task.js");
|
||||
return handlePlanTask(params, projectDir);
|
||||
|
|
@ -1202,7 +1202,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_replan_slice",
|
||||
"sf_replan_slice",
|
||||
"Replan a slice after a blocker is discovered, preserving completed tasks and re-rendering PLAN.md + REPLAN.md.",
|
||||
replanSliceParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1212,8 +1212,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_slice_replan",
|
||||
"Alias for gsd_replan_slice. Replan a slice after a blocker is discovered.",
|
||||
"sf_slice_replan",
|
||||
"Alias for sf_replan_slice. Replan a slice after a blocker is discovered.",
|
||||
replanSliceParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(replanSliceSchema, args);
|
||||
|
|
@ -1222,7 +1222,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_slice_complete",
|
||||
"sf_slice_complete",
|
||||
"Record a completed slice to the SF database, render SUMMARY.md + UAT.md, and update roadmap projection.",
|
||||
sliceCompleteParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1232,8 +1232,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_complete_slice",
|
||||
"Alias for gsd_slice_complete. Record a completed slice to the SF database and render summary/UAT artifacts.",
|
||||
"sf_complete_slice",
|
||||
"Alias for sf_slice_complete. Record a completed slice to the SF database and render summary/UAT artifacts.",
|
||||
sliceCompleteParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(sliceCompleteSchema, args);
|
||||
|
|
@ -1242,12 +1242,12 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_skip_slice",
|
||||
"sf_skip_slice",
|
||||
"Mark a slice as skipped so auto-mode advances past it without executing.",
|
||||
skipSliceParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const { projectDir, milestoneId, sliceId, reason } = parseWorkflowArgs(skipSliceSchema, args);
|
||||
await enforceWorkflowWriteGate("gsd_skip_slice", projectDir, milestoneId);
|
||||
await enforceWorkflowWriteGate("sf_skip_slice", projectDir, milestoneId);
|
||||
await runSerializedWorkflowDbOperation(projectDir, async () => {
|
||||
const { getSlice, updateSliceStatus } = await importLocalModule<any>("../../../src/resources/extensions/sf/sf-db.js");
|
||||
const { invalidateStateCache } = await importLocalModule<any>("../../../src/resources/extensions/sf/state.js");
|
||||
|
|
@ -1272,7 +1272,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_complete_milestone",
|
||||
"sf_complete_milestone",
|
||||
"Record a completed milestone to the SF database and render its SUMMARY.md.",
|
||||
completeMilestoneParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1282,8 +1282,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_milestone_complete",
|
||||
"Alias for gsd_complete_milestone. Record a completed milestone to the SF database and render its SUMMARY.md.",
|
||||
"sf_milestone_complete",
|
||||
"Alias for sf_complete_milestone. Record a completed milestone to the SF database and render its SUMMARY.md.",
|
||||
completeMilestoneParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(completeMilestoneSchema, args);
|
||||
|
|
@ -1292,7 +1292,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_validate_milestone",
|
||||
"sf_validate_milestone",
|
||||
"Validate a milestone, persist validation results to the SF database, and render VALIDATION.md.",
|
||||
validateMilestoneParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1302,8 +1302,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_milestone_validate",
|
||||
"Alias for gsd_validate_milestone. Validate a milestone and render VALIDATION.md.",
|
||||
"sf_milestone_validate",
|
||||
"Alias for sf_validate_milestone. Validate a milestone and render VALIDATION.md.",
|
||||
validateMilestoneParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(validateMilestoneSchema, args);
|
||||
|
|
@ -1312,7 +1312,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_reassess_roadmap",
|
||||
"sf_reassess_roadmap",
|
||||
"Reassess a milestone roadmap after a slice completes, writing ASSESSMENT.md and re-rendering ROADMAP.md.",
|
||||
reassessRoadmapParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1322,8 +1322,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_roadmap_reassess",
|
||||
"Alias for gsd_reassess_roadmap. Reassess a roadmap after slice completion.",
|
||||
"sf_roadmap_reassess",
|
||||
"Alias for sf_reassess_roadmap. Reassess a roadmap after slice completion.",
|
||||
reassessRoadmapParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(reassessRoadmapSchema, args);
|
||||
|
|
@ -1332,7 +1332,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_save_gate_result",
|
||||
"sf_save_gate_result",
|
||||
"Save a quality gate result to the SF database.",
|
||||
saveGateResultParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1342,13 +1342,13 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_summary_save",
|
||||
"sf_summary_save",
|
||||
"Save a SF summary/research/context/assessment artifact to the database and disk.",
|
||||
summarySaveParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(summarySaveSchema, args);
|
||||
const { projectDir, milestone_id, slice_id, task_id, artifact_type, content } = parsed;
|
||||
await enforceWorkflowWriteGate("gsd_summary_save", projectDir, milestone_id);
|
||||
await enforceWorkflowWriteGate("sf_summary_save", projectDir, milestone_id);
|
||||
const executors = await getWorkflowToolExecutors();
|
||||
const supportedArtifactTypes = getSupportedSummaryArtifactTypes(executors);
|
||||
if (!supportedArtifactTypes.includes(artifact_type)) {
|
||||
|
|
@ -1363,7 +1363,7 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_task_complete",
|
||||
"sf_task_complete",
|
||||
"Record a completed task to the SF database and render its SUMMARY.md.",
|
||||
taskCompleteParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
@ -1374,8 +1374,8 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_complete_task",
|
||||
"Alias for gsd_task_complete. Record a completed task to the SF database and render its SUMMARY.md.",
|
||||
"sf_complete_task",
|
||||
"Alias for sf_task_complete. Record a completed task to the SF database and render its SUMMARY.md.",
|
||||
taskCompleteParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const parsed = parseWorkflowArgs(taskCompleteSchema, args);
|
||||
|
|
@ -1385,19 +1385,19 @@ export function registerWorkflowTools(server: McpToolServer): void {
|
|||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_milestone_status",
|
||||
"sf_milestone_status",
|
||||
"Read the current status of a milestone and all its slices from the SF database.",
|
||||
milestoneStatusParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
const { projectDir, milestoneId } = parseWorkflowArgs(milestoneStatusSchema, args);
|
||||
await enforceWorkflowWriteGate("gsd_milestone_status", projectDir, milestoneId);
|
||||
await enforceWorkflowWriteGate("sf_milestone_status", projectDir, milestoneId);
|
||||
const { executeMilestoneStatus } = await getWorkflowToolExecutors();
|
||||
return runSerializedWorkflowOperation(() => executeMilestoneStatus({ milestoneId }, projectDir));
|
||||
},
|
||||
);
|
||||
|
||||
server.tool(
|
||||
"gsd_journal_query",
|
||||
"sf_journal_query",
|
||||
"Query the structured event journal for auto-mode iterations.",
|
||||
journalQueryParams,
|
||||
async (args: Record<string, unknown>) => {
|
||||
|
|
|
|||
|
|
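Every registration above follows one shape: name, description, params schema, then an async handler that gates the write and defers to a serialized executor. A condensed sketch of that pattern follows; the `McpToolServer` stub and the echo handler are placeholders for illustration, not the project's real interfaces.

```typescript
// Hypothetical stand-in types so the registration shape is self-contained.
type Handler = (args: Record<string, unknown>) => Promise<string>;

interface McpToolServer {
  tool(name: string, description: string, params: object, handler: Handler): void;
}

function registerExampleTool(server: McpToolServer): void {
  server.tool(
    "sf_task_complete",
    "Record a completed task to the SF database and render its SUMMARY.md.",
    {}, // the real taskCompleteParams schema would go here
    async (args) => {
      // The real handler parses args, enforces the write gate, then
      // runs a serialized executor; this stub just echoes the task ID.
      return `completed: ${String(args.taskId ?? "unknown")}`;
    },
  );
}

// Capture the registration with a minimal fake server and invoke it.
const registered: { name: string; handler: Handler }[] = [];
registerExampleTool({
  tool: (name, _desc, _params, handler) => registered.push({ name, handler }),
});
registered[0].handler({ taskId: "T01" }).then((r) => console.log(r)); // "completed: T01"
```

The fake-server trick above is also how the tests later in this diff exercise dispatch without a live MCP transport.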
@@ -1601,7 +1601,7 @@ export class AgentSession {
     // Extensions (e.g., discuss flows) may narrow the active tool list
     // via setActiveTools() during a session. Without this refresh, the
     // narrowed set persists into the next session — causing tools like
-    // gsd_plan_slice to be missing from auto-mode subagent sessions.
+    // sf_plan_slice to be missing from auto-mode subagent sessions.
     this._refreshToolRegistry({
       activeToolNames: this.getActiveToolNames(),
       includeAllExtensionTools: true,

@@ -179,7 +179,7 @@ test("chat-controller renders serverToolUse before trailing text matching conten
   const serverToolUse = {
     type: "serverToolUse",
     id: toolId,
-    name: "mcp__gsd-workflow__secure_env_collect",
+    name: "mcp__sf-workflow__secure_env_collect",
     input: { projectDir: "/tmp/project", keys: [{ key: "SECURE_PASSWORD" }], destination: "dotenv" },
   };
@@ -337,9 +337,6 @@ function respawnWorker(mid) {
       SF_MILESTONE_LOCK: mid,
       SF_PROJECT_ROOT: PROJECT_ROOT,
       SF_PARALLEL_WORKER: '1',
-      GSD_MILESTONE_LOCK: mid,
-      GSD_PROJECT_ROOT: PROJECT_ROOT,
-      GSD_PARALLEL_WORKER: '1',
     },
     stdio: ['ignore', stdoutFd, stderrFd],
     windowsHide: true,
@@ -20,15 +20,11 @@ const RTK_SKIP =
   process.env.SF_SKIP_RTK_INSTALL === 'true' ||
   process.env.SF_RTK_DISABLED === '1' ||
-  process.env.SF_RTK_DISABLED === 'true' ||
-  process.env.GSD_SKIP_RTK_INSTALL === '1' ||
-  process.env.GSD_SKIP_RTK_INSTALL === 'true' ||
-  process.env.GSD_RTK_DISABLED === '1' ||
-  process.env.GSD_RTK_DISABLED === 'true'
+  process.env.SF_RTK_DISABLED === 'true'

 const RTK_VERSION = '0.33.1'
 const RTK_REPO = 'rtk-ai/rtk'
 const RTK_ENV = { ...process.env, RTK_TELEMETRY_DISABLED: '1' }
-const managedBinDir = join(process.env.SF_HOME || process.env.GSD_HOME || join(homedir(), '.sf'), 'agent', 'bin')
+const managedBinDir = join(process.env.SF_HOME || join(homedir(), '.sf'), 'agent', 'bin')
 const managedBinaryPath = join(managedBinDir, platform() === 'win32' ? 'rtk.exe' : 'rtk')

 function run(cmd) {
@@ -108,7 +108,7 @@ function renderMarkdown({ summary, history, binaryPath }) {
 function main() {
   const outputIndex = process.argv.indexOf('--output')
   const outputPath = outputIndex !== -1 ? process.argv[outputIndex + 1] : null
-  const binaryPath = process.env.SF_RTK_PATH || process.env.GSD_RTK_PATH || getManagedRtkPath()
+  const binaryPath = process.env.SF_RTK_PATH || getManagedRtkPath()

   if (!binaryPath) {
     throw new Error('RTK binary path not resolved')
@@ -132,7 +132,6 @@ tmp6=$(mktemp)
 env -i HOME="$HOME" PATH="$PATH" \
   ANTHROPIC_API_KEY="${ANTHROPIC_API_KEY:-}" \
   SF_TEST_AUTH_PATH="$tmp_auth" \
-  GSD_TEST_AUTH_PATH="$tmp_auth" \
   node -e "
 import('./dist/app-paths.js').then(async (paths) => {
   // Override authFilePath for test
@@ -146,7 +146,7 @@ export function summarizeToolArgs(toolName: unknown, toolInput: unknown): string
       return String(input.url ?? '')
     default: {
       // SF tools: show milestone/slice/task IDs when present
-      if (name.startsWith('gsd_')) {
+      if (name.startsWith('sf_')) {
         return summarizeGsdTool(name, input)
       }
       // Fallback: show first string-valued key up to 60 chars

@@ -175,7 +175,7 @@ function summarizeGsdTool(name: string, input: Record<string, unknown>): string
   }
   return id
 }
-  // Fallback for SF tools without IDs (e.g. gsd_decision_save)
+  // Fallback for SF tools without IDs (e.g. sf_decision_save)
   if (input.decision) {
     const d = String(input.decision)
     return d.length > 60 ? d.slice(0, 57) + '...' : d
@@ -13,20 +13,20 @@ const args = process.argv.slice(2)
 const firstArg = args[0]

 // Read package.json once — reused for version, banner, and SF_VERSION below
-let gsdVersion = '0.0.0'
+let sfVersion = '0.0.0'
 try {
   const pkg = JSON.parse(readFileSync(join(gsdRoot, 'package.json'), 'utf-8'))
-  gsdVersion = pkg.version || '0.0.0'
+  sfVersion = pkg.version || '0.0.0'
 } catch { /* ignore */ }

 if (firstArg === '--version' || firstArg === '-v') {
-  process.stdout.write(gsdVersion + '\n')
+  process.stdout.write(sfVersion + '\n')
   process.exit(0)
 }

 if (firstArg === '--help' || firstArg === '-h') {
   const { printHelp } = await import('./help-text.js')
-  printHelp(gsdVersion)
+  printHelp(sfVersion)
   process.exit(0)
 }

@@ -101,7 +101,7 @@ if (!existsSync(appRoot)) {
   process.stderr.write(
     renderLogo(colorCyan) +
     '\n' +
-    ` Singularity Forge ${dim}v${gsdVersion}${reset}\n` +
+    ` Singularity Forge ${dim}v${sfVersion}${reset}\n` +
     ` ${green}Welcome.${reset} Setting up your environment...\n\n`
   )
   process.env.SF_FIRST_RUN_BANNER = '1'

@@ -134,7 +134,7 @@ const { Module } = await import('module');
 (Module as any)._initPaths?.()

 // SF_VERSION — expose package version so extensions can display it
-process.env.SF_VERSION = gsdVersion
+process.env.SF_VERSION = sfVersion

 // SF_BIN_PATH — absolute path to this loader (dist/loader.js), used by patched subagent
 // to spawn sf instead of pi when dispatching workflow tasks.
@@ -30,7 +30,7 @@ const bundledExtensionsDir = join(resourcesDir, 'extensions')
 const resourceVersionManifestName = 'managed-resources.json'

 interface ManagedResourceManifest {
-  gsdVersion: string
+  sfVersion: string
   syncedAt?: number
   /** Content fingerprint of bundled resources — detects same-version content changes. */
   contentHash?: string

@@ -101,7 +101,7 @@ function writeManagedResourceManifest(agentDir: string): void {
   } catch { /* non-fatal */ }

   const manifest: ManagedResourceManifest = {
-    gsdVersion: getBundledGsdVersion(),
+    sfVersion: getBundledGsdVersion(),
     syncedAt: Date.now(),
     contentHash: computeResourceFingerprint(),
     installedExtensionRootFiles,

@@ -113,7 +113,7 @@ function writeManagedResourceManifest(agentDir: string): void {
 export function readManagedResourceVersion(agentDir: string): string | null {
   try {
     const manifest = JSON.parse(readFileSync(getManagedResourceManifestPath(agentDir), 'utf-8')) as ManagedResourceManifest
-    return typeof manifest?.gsdVersion === 'string' ? manifest.gsdVersion : null
+    return typeof manifest?.sfVersion === 'string' ? manifest.sfVersion : null
   } catch {
     return null
   }

@@ -549,7 +549,7 @@ export function initResources(agentDir: string): void {
   // Skip the full copy when both version AND content fingerprint match.
   // Version-only checks miss same-version content changes (npm link dev workflow,
   // hotfixes within a release). The content hash catches those at ~1ms cost.
-  if (manifest && manifest.gsdVersion === currentVersion) {
+  if (manifest && manifest.sfVersion === currentVersion) {
     // Version matches — check content fingerprint for same-version staleness.
     const currentHash = computeResourceFingerprint()
     const hasStaleExtensionFiles = hasStaleCompiledExtensionSiblings(extensionsDir, bundledExtensionsDir)
@@ -9,7 +9,7 @@ Work autonomously to complete the assigned task. Use all available tools as need

 - Do **not** spawn subagents or act as an orchestrator unless the parent task explicitly instructs you to do so.
 - If the task looks like SF orchestration, planning, scouting, parallel dispatch, or review routing, stop and report that the caller should use the appropriate specialist agent instead (for example: `sf-worker`, `sf-scout`, `sf-reviewer`, or the top-level orchestrator).
-- In particular, do **not** call `gsd_scout`, `subagent`, `launch_parallel_view`, or `gsd_execute_parallel` on your own initiative.
+- In particular, do **not** call `sf_scout`, `subagent`, `launch_parallel_view`, or `sf_execute_parallel` on your own initiative.

 Output format when finished:
@@ -13,7 +13,7 @@ export async function queryShellEnv(
   timeout: number,
   signal?: AbortSignal,
 ): Promise<{ cwd: string; env: Record<string, string>; shell: string } | null> {
-  const sentinel = `__GSD_ENV_${randomUUID().slice(0, 8)}__`;
+  const sentinel = `__SF_ENV_${randomUUID().slice(0, 8)}__`;
   const startIndex = bg.output.length;

   const cmd = [

@@ -121,9 +121,9 @@ export async function runOnSession(
   signal?: AbortSignal,
 ): Promise<{ exitCode: number; output: string; timedOut: boolean }> {
   const sentinel = randomUUID().slice(0, 8);
-  const startMarker = `__GSD_SENTINEL_${sentinel}_START__`;
-  const endMarker = `__GSD_SENTINEL_${sentinel}_END__`;
-  const exitVar = `__GSD_EXIT_${sentinel}__`;
+  const startMarker = `__SF_SENTINEL_${sentinel}_START__`;
+  const endMarker = `__SF_SENTINEL_${sentinel}_END__`;
+  const exitVar = `__SF_EXIT_${sentinel}__`;

   // Snapshot current output buffer position
   const startIndex = bg.output.length;
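The sentinel markers renamed above exist so a command's output can be sliced out of a shared shell-session buffer. A minimal sketch of that technique, with an illustrative helper name (`extractBetween` is not the project's real API):

```typescript
import { randomUUID } from "node:crypto";

// Slice the text between unique start/end markers out of a session
// buffer. Returns null if the end marker has not appeared yet, which
// is how a caller can tell the command is still running.
function extractBetween(buffer: string, startMarker: string, endMarker: string): string | null {
  const start = buffer.indexOf(startMarker);
  if (start === -1) return null;
  const end = buffer.indexOf(endMarker, start);
  if (end === -1) return null;
  return buffer.slice(start + startMarker.length, end).trim();
}

const sentinel = randomUUID().slice(0, 8);
const startMarker = `__SF_SENTINEL_${sentinel}_START__`;
const endMarker = `__SF_SENTINEL_${sentinel}_END__`;

// Simulated session buffer: unrelated output surrounds the command's output.
const buffer = `old output\n${startMarker}\nhello\n${endMarker}\n`;
console.log(extractBetween(buffer, startMarker, endMarker)); // "hello"
```

The random sentinel is what makes this safe on a long-lived session: a fixed marker could collide with earlier output still sitting in the buffer.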
@@ -703,7 +703,7 @@ export function buildSdkOptions(
   // Pre-authorize the safe built-ins and every registered workflow MCP
   // server's tools. `acceptEdits` mode (the interactive default) only
   // auto-approves file edits — Read/Glob/Grep, basic shell inspection, and
-  // every `mcp__gsd-workflow__*` call still surface as "This command
+  // every `mcp__sf-workflow__*` call still surface as "This command
   // requires approval" and block SF actions (#4099).
   const allowedTools = [
     "Read",
@@ -16,7 +16,7 @@ describe("PartialMessageBuilder — malformed tool arguments (#2574)", () => {
   builder.handleEvent({
     type: "content_block_start",
     index: 0,
-    content_block: { type: "tool_use", id: "tool_1", name: "gsd_plan_slice", input: {} },
+    content_block: { type: "tool_use", id: "tool_1", name: "sf_plan_slice", input: {} },
   } as BetaRawMessageStreamEvent);

   // Feed JSON fragments as input_json_delta

@@ -152,8 +152,8 @@ describe("PartialMessageBuilder — malformed tool arguments (#2574)", () => {
 describe("parseMcpToolName", () => {
   test("splits mcp__<server>__<tool> into parts", () => {
     assert.deepEqual(
-      parseMcpToolName("mcp__gsd-workflow__gsd_plan_milestone"),
-      { server: "sf-workflow", tool: "gsd_plan_milestone" },
+      parseMcpToolName("mcp__sf-workflow__sf_plan_milestone"),
+      { server: "sf-workflow", tool: "sf_plan_milestone" },
     );
   });

@@ -173,7 +173,7 @@ describe("parseMcpToolName", () => {

   test("returns null for non-prefixed names", () => {
     assert.equal(parseMcpToolName("Bash"), null);
-    assert.equal(parseMcpToolName("gsd_plan_milestone"), null);
+    assert.equal(parseMcpToolName("sf_plan_milestone"), null);
   });

   test("returns null for malformed prefixes", () => {

@@ -193,7 +193,7 @@ describe("PartialMessageBuilder — MCP tool name normalization", () => {
     content_block: {
       type: "tool_use",
       id: "tool_1",
-      name: "mcp__gsd-workflow__gsd_plan_milestone",
+      name: "mcp__sf-workflow__sf_plan_milestone",
       input: {},
     },
   } as BetaRawMessageStreamEvent);

@@ -202,7 +202,7 @@ describe("PartialMessageBuilder — MCP tool name normalization", () => {
   assert.equal(event!.type, "toolcall_start");
   if (event!.type === "toolcall_start") {
     const toolCall = (event.partial.content[event.contentIndex] as any);
-    assert.equal(toolCall.name, "gsd_plan_milestone");
+    assert.equal(toolCall.name, "sf_plan_milestone");
     assert.equal(toolCall.mcpServer, "sf-workflow");
   }
 });

@@ -227,12 +227,12 @@ describe("PartialMessageBuilder — MCP tool name normalization", () => {
   const block: BetaContentBlock = {
     type: "tool_use",
     id: "tool_2",
-    name: "mcp__gsd-workflow__gsd_task_complete",
+    name: "mcp__sf-workflow__sf_task_complete",
     input: { taskId: "T001" },
   };
   const mapped = mapContentBlock(block) as any;
   assert.equal(mapped.type, "toolCall");
-  assert.equal(mapped.name, "gsd_task_complete");
+  assert.equal(mapped.name, "sf_task_complete");
   assert.equal(mapped.mcpServer, "sf-workflow");
   assert.deepEqual(mapped.arguments, { taskId: "T001" });
 });
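The naming contract these tests pin down can be sketched as a small standalone helper. This is a hypothetical re-implementation of the behavior the assertions describe, not the project's actual `parseMcpToolName`:

```typescript
// "mcp__<server>__<tool>" splits into { server, tool }; anything else
// (no prefix, empty server, empty tool) yields null.
function parseMcpToolName(name: string): { server: string; tool: string } | null {
  if (!name.startsWith("mcp__")) return null;
  const rest = name.slice("mcp__".length);
  const sep = rest.indexOf("__");
  if (sep <= 0 || sep + 2 >= rest.length) return null; // malformed
  return { server: rest.slice(0, sep), tool: rest.slice(sep + 2) };
}

console.log(parseMcpToolName("mcp__sf-workflow__sf_plan_milestone"));
// { server: "sf-workflow", tool: "sf_plan_milestone" }
console.log(parseMcpToolName("sf_plan_milestone")); // null
```

Splitting on the first `__` after the prefix is what lets tool names themselves contain single underscores (`sf_plan_milestone`) without ambiguity.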
@@ -485,7 +485,7 @@ describe("stream-adapter — session persistence (#2859)", () => {
       "Grep",
       "Bash(ls:*)",
       "Bash(pwd)",
-      "mcp__gsd-workflow__*",
+      "mcp__sf-workflow__*",
     ]);
   } finally {
     process.env.SF_WORKFLOW_MCP_COMMAND = prev.SF_WORKFLOW_MCP_COMMAND;
@@ -6,7 +6,6 @@ import { assertSafeDirectory } from "./validate-directory.js";
 import { detectWorkflowMcpLaunchConfig } from "./workflow-mcp.js";

 export const SF_WORKFLOW_MCP_SERVER_NAME = "sf-workflow";
-export const GSD_WORKFLOW_MCP_SERVER_NAME = SF_WORKFLOW_MCP_SERVER_NAME;

 export interface ProjectMcpServerConfig {
   command?: string;
@@ -267,7 +267,6 @@ export const SF_ROOT_FILES = {
   CODEBASE: "CODEBASE.md",
 } as const;

-export const GSD_ROOT_FILES = SF_ROOT_FILES;

 export type SFRootFileKey = keyof typeof SF_ROOT_FILES;
@@ -19,7 +19,7 @@ export default function createExtension(pi: ExtensionAPI) {
   options: [
     {
       label: "Add a custom tool",
-      description: "Register a new tool the LLM can call (like gsd_plan, plan_clarify).",
+      description: "Register a new tool the LLM can call (like sf_plan, plan_clarify).",
     },
     {
       label: "Add a slash command",
src/rtk.ts
@@ -12,9 +12,6 @@ export const RTK_VERSION = "0.33.1";
 export const SF_RTK_DISABLED_ENV = "SF_RTK_DISABLED";
 export const SF_SKIP_RTK_INSTALL_ENV = "SF_SKIP_RTK_INSTALL";
 export const SF_RTK_PATH_ENV = "SF_RTK_PATH";
-export const GSD_RTK_DISABLED_ENV = "GSD_RTK_DISABLED";
-export const GSD_SKIP_RTK_INSTALL_ENV = "GSD_SKIP_RTK_INSTALL";
-export const GSD_RTK_PATH_ENV = "GSD_RTK_PATH";
 export const RTK_TELEMETRY_DISABLED_ENV = "RTK_TELEMETRY_DISABLED";

 const RTK_REPO = "rtk-ai/rtk";

@@ -45,11 +45,11 @@ function isTruthy(value: string | undefined): boolean {
 }

 export function isRtkEnabled(env: NodeJS.ProcessEnv = process.env): boolean {
-  return !isTruthy(env[SF_RTK_DISABLED_ENV]) && !isTruthy(env[GSD_RTK_DISABLED_ENV]);
+  return !isTruthy(env[SF_RTK_DISABLED_ENV]);
 }

 function resolveAppRoot(env: NodeJS.ProcessEnv = process.env): string {
-  return env.SF_HOME || env.GSD_HOME || join(osHomedir(), ".sf");
+  return env.SF_HOME || join(osHomedir(), ".sf");
 }

 export function getManagedRtkDir(env: NodeJS.ProcessEnv = process.env): string {
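A minimal sketch of the env gating in `isRtkEnabled`, assuming (based on the `=== '1' || === 'true'` checks in the install script earlier in this diff) that `isTruthy` accepts exactly those two values; the real body is not shown in the hunk:

```typescript
// Assumed semantics: only "1" and "true" count as truthy, so an
// unset or empty variable leaves RTK enabled.
function isTruthy(value: string | undefined): boolean {
  return value === "1" || value === "true";
}

function isRtkEnabled(env: Record<string, string | undefined>): boolean {
  return !isTruthy(env["SF_RTK_DISABLED"]);
}

console.log(isRtkEnabled({}));                       // true
console.log(isRtkEnabled({ SF_RTK_DISABLED: "1" })); // false
```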
@@ -253,7 +253,7 @@ test("initResources skips copy when managed version matches current version", as

   // Simulate version mismatch by writing older version to manifest
   const manifestPath = join(fakeAgentDir, "managed-resources.json");
-  writeFileSync(manifestPath, JSON.stringify({ gsdVersion: "0.0.1", syncedAt: Date.now() }));
+  writeFileSync(manifestPath, JSON.stringify({ sfVersion: "0.0.1", syncedAt: Date.now() }));

   // Third run: version mismatch — full sync, marker removed
   initResources(fakeAgentDir);

@@ -369,7 +369,7 @@ test("deriveState returns pre-planning phase for empty .sf/ directory", async (t
 test("deriveState returns pre-planning phase when no .sf/ directory exists", async (t) => {
   const { deriveState } = await import("../resources/extensions/sf/state.ts");
   // Use a temp dir with no .sf/ subdirectory at all
-  const tmp = mkdtempSync(join(tmpdir(), "sf-state-nogsd-"));
+  const tmp = mkdtempSync(join(tmpdir(), "sf-state-nosf-"));

   t.after(() => rmSync(tmp, { recursive: true, force: true }));
   // Should not throw — missing .sf/ is a valid "no project" state
@@ -233,17 +233,17 @@ describe('summarizeToolArgs', () => {
   })

   it('summarizes sf tool with milestone/slice/task IDs', () => {
-    assert.equal(summarizeToolArgs('gsd_task_complete', {
+    assert.equal(summarizeToolArgs('sf_task_complete', {
       milestoneId: 'M001', sliceId: 'S01', taskId: 'T01', oneLiner: 'Built the thing',
     }), 'M001/S01/T01 Built the thing')
   })

-  it('summarizes gsd_plan_milestone with milestone ID', () => {
-    assert.equal(summarizeToolArgs('gsd_plan_milestone', { milestoneId: 'M002' }), 'M002')
+  it('summarizes sf_plan_milestone with milestone ID', () => {
+    assert.equal(summarizeToolArgs('sf_plan_milestone', { milestoneId: 'M002' }), 'M002')
   })

-  it('summarizes gsd_decision_save with decision text', () => {
-    const result = summarizeToolArgs('gsd_decision_save', { decision: 'Use SQLite for persistence' })
+  it('summarizes sf_decision_save with decision text', () => {
+    const result = summarizeToolArgs('sf_decision_save', { decision: 'Use SQLite for persistence' })
     assert.equal(result, 'Use SQLite for persistence')
   })
@@ -250,7 +250,7 @@ test("sf exits early with a clear message when synced resources are newer than t
   mkdirSync(fakeAgentDir, { recursive: true });
   writeFileSync(
     join(fakeAgentDir, "managed-resources.json"),
-    JSON.stringify({ gsdVersion: "999.0.0" }),
+    JSON.stringify({ sfVersion: "999.0.0" }),
   );

   t.after(() => { rmSync(fakeHome, { recursive: true, force: true }); });
@@ -191,7 +191,7 @@ test("current SF command family samples dispatch to correct outcomes after S02",
   })
 })

-const EXPECTED_GSD_OUTCOMES = new Map<string, "surface" | "prompt" | "local" | "view-navigate">([
+const EXPECTED_SF_OUTCOMES = new Map<string, "surface" | "prompt" | "local" | "view-navigate">([
   // Surface commands (19)
   ["status", "surface"],
   ["visualize", "view-navigate"],

@@ -229,12 +229,12 @@ const EXPECTED_SF_OUTCOMES = new Map<string, "surface" | "prompt" | "local" | "

 test("every registered /sf subcommand has an explicit browser dispatch outcome", async (t) => {
   assert.equal(
-    EXPECTED_GSD_OUTCOMES.size,
+    EXPECTED_SF_OUTCOMES.size,
     30,
-    "EXPECTED_GSD_OUTCOMES must cover all 30 SF subcommands (19 surface + 1 view-navigate + 9 passthrough + 1 help)",
+    "EXPECTED_SF_OUTCOMES must cover all 30 SF subcommands (19 surface + 1 view-navigate + 9 passthrough + 1 help)",
   )

-  for (const [subcommand, expectedKind] of EXPECTED_GSD_OUTCOMES) {
+  for (const [subcommand, expectedKind] of EXPECTED_SF_OUTCOMES) {
     await t.test(`/sf ${subcommand} -> ${expectedKind}`, () => {
       const outcome = dispatchBrowserSlashCommand(`/sf ${subcommand}`)
       assert.equal(

@@ -259,9 +259,9 @@ test("every registered /sf subcommand has an explicit browser dispatch outcome",
   }

   if (expectedKind === "local") {
-    await t.test(`/sf ${subcommand} dispatches to gsd_help action`, () => {
+    await t.test(`/sf ${subcommand} dispatches to sf_help action`, () => {
       const outcome = dispatchBrowserSlashCommand(`/sf ${subcommand}`) as any
-      assert.equal(outcome.action, "gsd_help", `/sf ${subcommand} should dispatch to gsd_help action`)
+      assert.equal(outcome.action, "sf_help", `/sf ${subcommand} should dispatch to sf_help action`)
     })
   }

@@ -281,10 +281,10 @@ test("SF dispatch edge cases", async (t) => {
     assert.equal(outcome.command.message, "/sf")
   })

-  await t.test("/sf help dispatches to local gsd_help action", () => {
+  await t.test("/sf help dispatches to local sf_help action", () => {
     const outcome = dispatchBrowserSlashCommand("/sf help")
     assert.equal(outcome.kind, "local")
-    assert.equal(outcome.action, "gsd_help")
+    assert.equal(outcome.action, "sf_help")
   })

   await t.test("/sf unknown-xyz passes through to bridge", () => {

@@ -328,7 +328,7 @@ test("SF dispatch edge cases", async (t) => {
 })

 test("every SF surface dispatches through the contract wiring end-to-end", async (t) => {
-  const gsdSurfaces = [...EXPECTED_GSD_OUTCOMES.entries()].filter(([, kind]) => kind === "surface")
+  const gsdSurfaces = [...EXPECTED_SF_OUTCOMES.entries()].filter(([, kind]) => kind === "surface")

   assert.equal(gsdSurfaces.length, 19, "should have exactly 19 SF surface subcommands")
@@ -56,7 +56,7 @@ describe("diagnostics type exports", () => {

 it("ForensicReport has all required fields", () => {
 const report: ForensicReport = {
-gsdVersion: "1.0.0",
+sfVersion: "1.0.0",
 timestamp: new Date().toISOString(),
 basePath: "/tmp/test",
 activeMilestone: "M001",
@@ -72,7 +72,7 @@ describe("diagnostics type exports", () => {
 journalSummary: null,
 activityLogMeta: null,
 }
-assert.equal(typeof report.gsdVersion, "string")
+assert.equal(typeof report.sfVersion, "string")
 assert.equal(typeof report.timestamp, "string")
 assert.deepEqual(report.anomalies, [])
 assert.deepEqual(report.recentUnits, [])

@@ -210,7 +210,7 @@ test("pruneRemovedBundledExtensions removes stale subdirectory extensions not in
 );

 // Bump the manifest version to force a re-sync (simulates upgrading SF).
-manifest.gsdVersion = "0.0.0-force-resync";
+manifest.sfVersion = "0.0.0-force-resync";
 manifest.contentHash = "0000000000000000";
 writeFileSync(manifestPath, JSON.stringify(manifest));

@@ -17,7 +17,7 @@ test("resource manifest includes contentHash", async (t) => {
 // module-level resolved paths. Instead, verify the manifest schema
 // by simulating what writeManagedResourceManifest produces.
 const manifest = {
-gsdVersion: "2.28.0",
+sfVersion: "2.28.0",
 syncedAt: Date.now(),
 contentHash: "abc123def456",
 };
@@ -29,7 +29,7 @@ test("resource manifest includes contentHash", async (t) => {

 writeFileSync(manifestPath, JSON.stringify(manifest));
 const read = JSON.parse(readFileSync(manifestPath, "utf-8"));
-assert.equal(read.gsdVersion, "2.28.0");
+assert.equal(read.sfVersion, "2.28.0");
 assert.equal(read.contentHash, "abc123def456");
 assert.equal(typeof read.syncedAt, "number");
 });
@@ -38,14 +38,14 @@ test("missing contentHash in manifest triggers re-sync (upgrade path)", () => {
 // Old manifests won't have contentHash. The new logic should treat
 // a missing contentHash as "stale" and re-sync.
 const oldManifest = {
-gsdVersion: "2.28.0",
+sfVersion: "2.28.0",
 syncedAt: Date.now(),
 };

 // Simulate the check in initResources:
 // if (manifest.contentHash && manifest.contentHash === currentHash)
 const currentHash = "somehash";
-const shouldSkip = oldManifest.gsdVersion === "2.28.0"
+const shouldSkip = oldManifest.sfVersion === "2.28.0"
 && ("contentHash" in oldManifest)
 && (oldManifest as any).contentHash === currentHash;
@@ -54,13 +54,13 @@ test("missing contentHash in manifest triggers re-sync (upgrade path)", () => {

 test("matching contentHash skips re-sync", () => {
 const manifest = {
-gsdVersion: "2.28.0",
+sfVersion: "2.28.0",
 syncedAt: Date.now(),
 contentHash: "abc123",
 };

 const currentHash = "abc123";
-const shouldSkip = manifest.gsdVersion === "2.28.0"
+const shouldSkip = manifest.sfVersion === "2.28.0"
 && manifest.contentHash != null
 && manifest.contentHash === currentHash;
@@ -69,13 +69,13 @@ test("matching contentHash skips re-sync", () => {

 test("different contentHash triggers re-sync", () => {
 const manifest = {
-gsdVersion: "2.28.0",
+sfVersion: "2.28.0",
 syncedAt: Date.now(),
 contentHash: "old_hash",
 };

 const currentHash = "new_hash";
-const shouldSkip = manifest.gsdVersion === "2.28.0"
+const shouldSkip = manifest.sfVersion === "2.28.0"
 && manifest.contentHash != null
 && manifest.contentHash === currentHash;

@@ -41,14 +41,14 @@ function withFakeRtk<T>(mapping: Record<string, string | { status?: number; stdo
 const previousPath = process.env.SF_RTK_PATH;
 const previousDisabled = process.env.SF_RTK_DISABLED;
 const previousTimeout = process.env.SF_RTK_REWRITE_TIMEOUT_MS;
-const previousGsdPath = process.env.GSD_RTK_PATH;
-const previousGsdDisabled = process.env.GSD_RTK_DISABLED;
-const previousGsdTimeout = process.env.GSD_RTK_REWRITE_TIMEOUT_MS;
+const previousGsdPath = process.env.SF_RTK_PATH;
+const previousGsdDisabled = process.env.SF_RTK_DISABLED;
+const previousGsdTimeout = process.env.SF_RTK_REWRITE_TIMEOUT_MS;
 process.env.SF_RTK_PATH = fake.path;
 process.env.SF_RTK_REWRITE_TIMEOUT_MS = "20000";
 delete process.env.SF_RTK_DISABLED;
-delete process.env.GSD_RTK_PATH;
-delete process.env.GSD_RTK_DISABLED;
+delete process.env.SF_RTK_PATH;
+delete process.env.SF_RTK_DISABLED;

 const finalize = () => {
 if (previousPath === undefined) delete process.env.SF_RTK_PATH;
@@ -57,12 +57,12 @@ function withFakeRtk<T>(mapping: Record<string, string | { status?: number; stdo
 else process.env.SF_RTK_DISABLED = previousDisabled;
 if (previousTimeout === undefined) delete process.env.SF_RTK_REWRITE_TIMEOUT_MS;
 else process.env.SF_RTK_REWRITE_TIMEOUT_MS = previousTimeout;
-if (previousGsdPath === undefined) delete process.env.GSD_RTK_PATH;
-else process.env.GSD_RTK_PATH = previousGsdPath;
-if (previousGsdDisabled === undefined) delete process.env.GSD_RTK_DISABLED;
-else process.env.GSD_RTK_DISABLED = previousGsdDisabled;
-if (previousGsdTimeout === undefined) delete process.env.GSD_RTK_REWRITE_TIMEOUT_MS;
-else process.env.GSD_RTK_REWRITE_TIMEOUT_MS = previousGsdTimeout;
+if (previousGsdPath === undefined) delete process.env.SF_RTK_PATH;
+else process.env.SF_RTK_PATH = previousGsdPath;
+if (previousGsdDisabled === undefined) delete process.env.SF_RTK_DISABLED;
+else process.env.SF_RTK_DISABLED = previousGsdDisabled;
+if (previousGsdTimeout === undefined) delete process.env.SF_RTK_REWRITE_TIMEOUT_MS;
+else process.env.SF_RTK_REWRITE_TIMEOUT_MS = previousGsdTimeout;
 fake.cleanup();
 };

@@ -94,16 +94,16 @@ function withManagedFakeRtk<T>(mapping: Record<string, string | { status?: numbe
 const previousPath = process.env.SF_RTK_PATH;
 const previousDisabled = process.env.SF_RTK_DISABLED;
 const previousTimeout = process.env.SF_RTK_REWRITE_TIMEOUT_MS;
-const previousGsdHome = process.env.GSD_HOME;
-const previousGsdPath = process.env.GSD_RTK_PATH;
-const previousGsdDisabled = process.env.GSD_RTK_DISABLED;
-const previousGsdTimeout = process.env.GSD_RTK_REWRITE_TIMEOUT_MS;
+const previousGsdHome = process.env.SF_HOME;
+const previousGsdPath = process.env.SF_RTK_PATH;
+const previousGsdDisabled = process.env.SF_RTK_DISABLED;
+const previousGsdTimeout = process.env.SF_RTK_REWRITE_TIMEOUT_MS;
 process.env.SF_HOME = managedHome;
 process.env.SF_RTK_REWRITE_TIMEOUT_MS = "20000";
 delete process.env.SF_RTK_PATH;
 delete process.env.SF_RTK_DISABLED;
-delete process.env.GSD_RTK_PATH;
-delete process.env.GSD_RTK_DISABLED;
+delete process.env.SF_RTK_PATH;
+delete process.env.SF_RTK_DISABLED;

 const env: NodeJS.ProcessEnv = {
 ...process.env,
@@ -121,14 +121,14 @@ function withManagedFakeRtk<T>(mapping: Record<string, string | { status?: numbe
 else process.env.SF_RTK_DISABLED = previousDisabled;
 if (previousTimeout === undefined) delete process.env.SF_RTK_REWRITE_TIMEOUT_MS;
 else process.env.SF_RTK_REWRITE_TIMEOUT_MS = previousTimeout;
-if (previousGsdHome === undefined) delete process.env.GSD_HOME;
-else process.env.GSD_HOME = previousGsdHome;
-if (previousGsdPath === undefined) delete process.env.GSD_RTK_PATH;
-else process.env.GSD_RTK_PATH = previousGsdPath;
-if (previousGsdDisabled === undefined) delete process.env.GSD_RTK_DISABLED;
-else process.env.GSD_RTK_DISABLED = previousGsdDisabled;
-if (previousGsdTimeout === undefined) delete process.env.GSD_RTK_REWRITE_TIMEOUT_MS;
-else process.env.GSD_RTK_REWRITE_TIMEOUT_MS = previousGsdTimeout;
+if (previousGsdHome === undefined) delete process.env.SF_HOME;
+else process.env.SF_HOME = previousGsdHome;
+if (previousGsdPath === undefined) delete process.env.SF_RTK_PATH;
+else process.env.SF_RTK_PATH = previousGsdPath;
+if (previousGsdDisabled === undefined) delete process.env.SF_RTK_DISABLED;
+else process.env.SF_RTK_DISABLED = previousGsdDisabled;
+if (previousGsdTimeout === undefined) delete process.env.SF_RTK_REWRITE_TIMEOUT_MS;
+else process.env.SF_RTK_REWRITE_TIMEOUT_MS = previousGsdTimeout;
 fake.cleanup();
 rmSync(managedHome, { recursive: true, force: true });
 };

@@ -16,19 +16,19 @@ import { createFakeRtk } from "./rtk-test-utils.ts";
 let originalRtkDisabled: string | undefined;

 beforeEach(() => {
-// Save and clear SF_RTK_DISABLED (and GSD_RTK_DISABLED) so tests can use fake RTK binaries
-originalRtkDisabled = process.env.SF_RTK_DISABLED ?? process.env.GSD_RTK_DISABLED;
+// Save and clear SF_RTK_DISABLED (and SF_RTK_DISABLED) so tests can use fake RTK binaries
+originalRtkDisabled = process.env.SF_RTK_DISABLED ?? process.env.SF_RTK_DISABLED;
 delete process.env.SF_RTK_DISABLED;
-delete process.env.GSD_RTK_DISABLED;
+delete process.env.SF_RTK_DISABLED;
 });

 afterEach(() => {
 // Restore original env
 delete process.env.SF_RTK_DISABLED;
 if (originalRtkDisabled !== undefined) {
-process.env.GSD_RTK_DISABLED = originalRtkDisabled;
+process.env.SF_RTK_DISABLED = originalRtkDisabled;
 } else {
-delete process.env.GSD_RTK_DISABLED;
+delete process.env.SF_RTK_DISABLED;
 }
 });

@@ -595,7 +595,7 @@ export async function launchWebMode(
 SF_WEB_PROJECT_SESSIONS_DIR: options.projectSessionsDir,
 SF_WEB_PACKAGE_ROOT: resolution.packageRoot,
 SF_WEB_HOST_KIND: resolution.kind,
-...(resolution.kind === 'source-dev' ? { NEXT_PUBLIC_GSD_DEV: '1' } : {}),
+...(resolution.kind === 'source-dev' ? { NEXT_PUBLIC_SF_DEV: '1' } : {}),
 ...(options.allowedOrigins?.length ? { SF_WEB_ALLOWED_ORIGINS: options.allowedOrigins.join(',') } : {}),
 }

@@ -57,7 +57,7 @@ export async function collectForensicsData(projectCwdOverride?: string): Promise
 ' metrics = { totalUnits: units.length, totalCost, totalDuration };',
 '}',
 'const result = {',
-' gsdVersion: report.gsdVersion,',
+' sfVersion: report.sfVersion,',
 ' timestamp: report.timestamp,',
 ' basePath: report.basePath,',
 ' activeMilestone: report.activeMilestone,',

@@ -207,7 +207,7 @@ async function doMerge(ext: ExtensionModules, basePath: string, name: string): P
 }

 const commitType = ext.inferCommitType(name)
-const commitMessage = `${commitType}: merge worktree ${name}\n\nGSD-Worktree: ${name}`
+const commitMessage = `${commitType}: merge worktree ${name}\n\nSF-Worktree: ${name}`

 process.stderr.write(`\nMerging ${chalk.bold.cyan(name)} → ${chalk.magenta(ext.nativeDetectMainBranch(basePath))}\n`)
 process.stderr.write(chalk.dim(` ${status.filesChanged} files, ${chalk.green(`+${status.linesAdded}`)} ${chalk.red(`-${status.linesRemoved}`)}\n\n`))

@@ -19,9 +19,9 @@ export const dynamic = "force-dynamic";

 // Persist counter across HMR re-evaluations in dev
 const g = globalThis as Record<string, unknown>;
-if (!g.__gsd_pty_next_index__) g.__gsd_pty_next_index__ = 1;
+if (!g.__sf_pty_next_index__) g.__sf_pty_next_index__ = 1;
 function getNextIndex(): number {
-return (g.__gsd_pty_next_index__ as number)++;
+return (g.__sf_pty_next_index__ as number)++;
 }

 export async function GET(): Promise<Response> {

@@ -35,7 +35,7 @@ export type BrowserSlashCommandSurface =
 | "sf-cleanup"
 | "sf-queue"

-export type BrowserSlashCommandLocalAction = "clear_terminal" | "refresh_workspace" | "gsd_help"
+export type BrowserSlashCommandLocalAction = "clear_terminal" | "refresh_workspace" | "sf_help"

 export type BrowserSlashPromptCommandType = "prompt" | "follow_up"

@@ -188,7 +188,7 @@ function dispatchGSDSubcommand(
 kind: "local",
 input,
 commandName: "sf",
-action: "gsd_help",
+action: "sf_help",
 }
 }

@@ -9,7 +9,7 @@ import { authFetch } from "@/lib/auth"
 * Build-time hint — may be `false` even in source-dev if the build happened
 * without the env var. The runtime check via `/api/dev-mode` is authoritative.
 */
-const BUILD_TIME_HINT = process.env.NEXT_PUBLIC_GSD_DEV === "1"
+const BUILD_TIME_HINT = process.env.NEXT_PUBLIC_SF_DEV === "1"

 /**
 * Exported for static guards that run before the runtime check resolves.

@@ -78,7 +78,7 @@ export interface ForensicJournalSummary {
 }

 export interface ForensicReport {
-gsdVersion: string
+sfVersion: string
 timestamp: string
 basePath: string
 activeMilestone: string | null

@@ -29,8 +29,8 @@ interface LoadedNodePty {
 }

 // Use globalThis to persist across Turbopack/HMR module re-evaluations in dev
-const GLOBAL_KEY = "__gsd_pty_sessions__" as const;
-const CLEANUP_GUARD_KEY = "__gsd_pty_cleanup_installed__" as const;
+const GLOBAL_KEY = "__sf_pty_sessions__" as const;
+const CLEANUP_GUARD_KEY = "__sf_pty_cleanup_installed__" as const;
 const MAX_SESSION_BUFFER_BYTES = 1024 * 1024;

 function getSessions(): Map<string, PtySession> {

@@ -4074,7 +4074,7 @@ export class SFWorkspaceStore {
 await this.refreshBoot()
 return outcome
 }
-if (outcome.action === "gsd_help") {
+if (outcome.action === "sf_help") {
 this.patchState({
 terminalLines: withTerminalLine(
 withTerminalLine(this.state.terminalLines, createTerminalLine("input", trimmed)),