fix: clean provider surfaces and core build

This commit is contained in:
Mikael Hugo 2026-05-05 16:31:53 +02:00
parent 4c98cb8c33
commit ab6cad4c84
67 changed files with 655 additions and 1538 deletions

View file

@ -2,7 +2,7 @@
<!--
PRs without a linked issue will be closed.
Open or find an issue first: https://github.com/singularity-forge/sf-run/issues
Open or find an issue first: https://github.com/singularity-ng/singularity-forge/issues
-->
Closes #<!-- issue number — required -->

View file

@ -4,8 +4,8 @@
**The evolution of [Singularity Forge](https://github.com/sf-build/get-shit-done) — now a real coding agent.**
[![npm version](https://img.shields.io/npm/v/sf-run?style=for-the-badge&logo=npm&logoColor=white&color=CB3837)](https://www.npmjs.com/package/sf-run)
[![npm downloads](https://img.shields.io/npm/dm/sf-run?style=for-the-badge&logo=npm&logoColor=white&color=CB3837)](https://www.npmjs.com/package/sf-run)
[![npm version](https://img.shields.io/npm/v/singularity-forge?style=for-the-badge&logo=npm&logoColor=white&color=CB3837)](https://www.npmjs.com/package/singularity-forge)
[![npm downloads](https://img.shields.io/npm/dm/singularity-forge?style=for-the-badge&logo=npm&logoColor=white&color=CB3837)](https://www.npmjs.com/package/singularity-forge)
[![GitHub stars](https://img.shields.io/github/stars/sf-build/SF?style=for-the-badge&logo=github&color=181717)](https://github.com/sf-build/SF)
[![Discord](https://img.shields.io/badge/Discord-Join%20us-5865F2?style=for-the-badge&logo=discord&logoColor=white)](https://discord.com/invite/nKXTsAcmbT)
[![License](https://img.shields.io/badge/license-MIT-blue?style=for-the-badge)](LICENSE)
@ -17,7 +17,7 @@ This version is different. SF is now a standalone CLI built on the [Pi SDK](http
One command. Walk away. Come back to a built project with clean git history.
<pre><code>npm install -g sf-run@latest</code></pre>
<pre><code>npm install -g singularity-forge@latest</code></pre>
> SF now provisions a managed [RTK](https://github.com/rtk-ai/rtk) binary on supported macOS, Linux, and Windows installs to compress shell-command output in `bash`, `async_bash`, `bg_shell`, and verification flows. SF forces `RTK_TELEMETRY_DISABLED=1` for all managed invocations. Set `SF_RTK_DISABLED=1` to disable the integration.
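For reference, opting out via the environment looks like this — a minimal sketch assuming both variables are read from the environment at startup, as the note above describes:

```shell
# Opt out of SF's managed RTK integration (read by SF at startup).
export SF_RTK_DISABLED=1

# SF already forces this for its own managed invocations; exporting it
# yourself only matters if you run the rtk binary by hand.
export RTK_TELEMETRY_DISABLED=1
```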
@ -279,7 +279,7 @@ Step mode is the on-ramp. Auto mode is the highway.
### Install
```bash
npm install -g sf-run
npm install -g singularity-forge
```
### Log in to a provider
@ -771,8 +771,8 @@ Use expensive models where quality matters (planning, complex execution) and che
## Star History
<a href="https://star-history.com/#singularity-forge/sf-run&Date">
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=singularity-forge/sf-run&type=Date" />
<a href="https://star-history.com/#singularity-ng/singularity-forge&Date">
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=singularity-ng/singularity-forge&type=Date" />
</a>
---
@ -787,6 +787,6 @@ Use expensive models where quality matters (planning, complex execution) and che
**The original SF showed what was possible. This version delivers it.**
**`npm install -g sf-run && sf`**
**`npm install -g singularity-forge && sf`**
</div>

View file

@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
# Install SF globally — version controlled via build arg
ARG SF_VERSION=latest
RUN npm install -g sf-run@${SF_VERSION}
RUN npm install -g singularity-forge@${SF_VERSION}
# Create non-root user for sandbox isolation
RUN groupadd --gid 1000 sf \

View file

@ -44,7 +44,7 @@ git --version # should print 2.20+
**Step 4 — Install SF:**
```bash
npm install -g sf-run
npm install -g singularity-forge
```
**Step 5 — Set up your LLM provider:**
@ -116,7 +116,7 @@ git --version # should print 2.20+
**Step 4 — Install SF:**
```powershell
npm install -g sf-run
npm install -g singularity-forge
```
**Step 5 — Set up your LLM provider:**
@ -220,7 +220,7 @@ git --version # should print 2.20+
**Step 3 — Install SF:**
```bash
npm install -g sf-run
npm install -g singularity-forge
```
**Step 4 — Set up your LLM provider:**
@ -263,7 +263,7 @@ Inside the session, type `/model` to confirm your LLM is connected.
> npm config set prefix '~/.npm-global'
> echo 'export PATH="$HOME/.npm-global/bin:$PATH"' >> ~/.bashrc
> source ~/.bashrc
> npm install -g sf-run
> npm install -g singularity-forge
> ```
---
@ -279,8 +279,8 @@ Run SF in an isolated sandbox without installing Node.js on your host.
**Step 2 — Clone the SF repo:**
```bash
git clone https://github.com/singularity-forge/sf-run.git
cd sf-run/docker
git clone https://github.com/singularity-ng/singularity-forge.git
cd singularity-forge/docker
```
**Step 3 — Create and enter a sandbox:**
@ -402,7 +402,7 @@ SF is also available as a VS Code extension. Install from the marketplace (publi
- **Sidebar dashboard** — connection status, model info, token usage
- **Full command palette** — start/stop agent, switch models, export sessions
The CLI (`sf-run`) must be installed first — the extension connects to it via RPC.
The CLI (`singularity-forge`) must be installed first — the extension connects to it via RPC.
---
@ -439,7 +439,7 @@ sf sessions
SF checks for updates every 24 hours and prompts at startup. You can also update manually:
```bash
npm update -g sf-run
npm update -g singularity-forge
```
Or from within a session:

View file

@ -70,6 +70,6 @@ After pinning:
```bash
node --version # v24.x.x
npm install -g sf-run
npm install -g singularity-forge
sf --version
```

View file

@ -43,7 +43,7 @@ It checks:
### `command not found: sf` after install
**Symptoms:** `npm install -g sf-run` succeeds but `sf` isn't found.
**Symptoms:** `npm install -g singularity-forge` succeeds but `sf` isn't found.
**Cause:** npm's global bin directory isn't in your shell's `$PATH`.
@ -59,14 +59,14 @@ echo 'export PATH="$(npm prefix -g)/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```
**Workaround:** Run `npx sf-run` or `$(npm prefix -g)/bin/sf` directly.
**Workaround:** Run `npx singularity-forge` or `$(npm prefix -g)/bin/sf` directly.
**Common causes:**
- **Homebrew Node** — `/opt/homebrew/bin` should be in PATH but sometimes isn't if Homebrew init is missing from your shell profile
- **Version manager (nvm, fnm, mise)** — global bin is version-specific; ensure your version manager initializes in your shell config
- **oh-my-zsh** — the `gitfast` plugin aliases `sf` to `git svn dcommit`. Check with `alias sf` and unalias if needed
### `npm install -g sf-run` fails
### `npm install -g singularity-forge` fails
**Common causes:**
- Missing workspace packages — fixed in v2.10.4+
@ -307,7 +307,7 @@ Doctor rebuilds `STATE.md` from plan and roadmap files on disk and fixes detecte
## Getting Help
- **GitHub Issues:** [github.com/singularity-forge/sf-run/issues](https://github.com/singularity-forge/sf-run/issues)
- **GitHub Issues:** [github.com/singularity-ng/singularity-forge/issues](https://github.com/singularity-ng/singularity-forge/issues)
- **Dashboard:** `Ctrl+Alt+G` or `/sf status` for real-time diagnostics
- **Forensics:** `/sf forensics` for structured post-mortem analysis of autonomous mode failures
- **Session logs:** `.sf/activity/` contains JSONL session dumps for crash forensics

View file

@ -44,7 +44,7 @@ git --version # should print 2.20+
**Step 4 — Install SF:**
```bash
npm install -g sf-run
npm install -g singularity-forge
```
**Step 5 — Set up your LLM provider:**
@ -116,7 +116,7 @@ git --version # should print 2.20+
**Step 4 — Install SF:**
```powershell
npm install -g sf-run
npm install -g singularity-forge
```
**Step 5 — Set up your LLM provider:**
@ -220,7 +220,7 @@ git --version # should print 2.20+
**Step 3 — Install SF:**
```bash
npm install -g sf-run
npm install -g singularity-forge
```
**Step 4 — Set up your LLM provider:**
@ -263,7 +263,7 @@ sf --version # prints the installed version
> npm config set prefix '~/.npm-global'
> echo 'export PATH="$HOME/.npm-global/bin:$PATH"' >> ~/.bashrc
> source ~/.bashrc
> npm install -g sf-run
> npm install -g singularity-forge
> ```
---
@ -279,8 +279,8 @@ sf --version # prints the installed version
**Step 2 — Clone the SF repo:**
```bash
git clone https://github.com/singularity-forge/sf-run.git
cd sf-run/docker
git clone https://github.com/singularity-ng/singularity-forge.git
cd singularity-forge/docker
```
**Step 3 — Create and enter a sandbox:**
@ -402,7 +402,7 @@ SF is also available as a VS Code extension. Install it from the marketplace (publisher: FluxLa
- **Sidebar dashboard** — connection status, model info, token usage
- **Full command palette** — start/stop agent, switch models, export sessions
The CLI (`sf-run`) must be installed first — the extension connects to it via RPC.
The CLI (`singularity-forge`) must be installed first — the extension connects to it via RPC.
---
@ -439,7 +439,7 @@ sf sessions
SF checks for updates every 24 hours and prompts at startup. You can also update manually:
```bash
npm update -g sf-run
npm update -g singularity-forge
```
Or from within a session:
@ -456,7 +456,7 @@ npm update -g sf-run
|------|----------|
| `command not found: sf` | Add npm's global bin directory to PATH (see the per-OS notes above) |
| `sf` actually runs `git svn dcommit` | oh-my-zsh conflict — run `unalias sf` or use `sf-cli` instead |
| Permission errors from `npm install -g sf-run` | Fix the npm prefix (see the Linux notes) or switch to nvm |
| Permission errors from `npm install -g singularity-forge` | Fix the npm prefix (see the Linux notes) or switch to nvm |
| Can't connect to the LLM | Check the API key with `sf config` and confirm the network is reachable |
| `sf` hangs at startup | Check your Node.js version: `node --version` (requires 22+) |

View file

@ -70,6 +70,6 @@ brew unpin node@24
```bash
node --version # v24.x.x
npm install -g sf-run
npm install -g singularity-forge
sf --version
```

View file

@ -45,7 +45,7 @@
### `command not found: sf` after install
**Symptoms:** `npm install -g sf-run` succeeds but `sf` isn't found.
**Symptoms:** `npm install -g singularity-forge` succeeds but `sf` isn't found.
**Cause:** npm's global bin directory isn't in your shell's `$PATH`.
@ -61,7 +61,7 @@ echo 'export PATH="$(npm prefix -g)/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```
**Workaround:** Run `npx sf-run` or `$(npm prefix -g)/bin/sf` directly.
**Workaround:** Run `npx singularity-forge` or `$(npm prefix -g)/bin/sf` directly.
**Common causes:**
@ -69,7 +69,7 @@ source ~/.zshrc
- **Version manager (nvm, fnm, mise)** — the global bin path is version-specific; make sure your version manager initializes in your shell config
- **oh-my-zsh** — the `gitfast` plugin aliases `sf` to `git svn dcommit`. Check with `alias sf` and unalias if needed
### `npm install -g sf-run` fails
### `npm install -g singularity-forge` fails
**Common causes:**
@ -324,7 +324,7 @@ Doctor rebuilds `STATE.md` from the plan and roadmap files on disk and fixes
## Getting Help
- **GitHub Issues** [github.com/singularity-forge/sf-run/issues](https://github.com/singularity-forge/sf-run/issues)
- **GitHub Issues** [github.com/singularity-ng/singularity-forge/issues](https://github.com/singularity-ng/singularity-forge/issues)
- **Dashboard:** `Ctrl+Alt+G` or `/sf status` for real-time diagnostics
- **Forensics:** `/sf forensics` for structured post-mortem analysis of autonomous-mode failures
- **Session logs:** `.sf/activity/` contains JSONL session dumps for crash forensics

View file

@ -38,7 +38,7 @@ You can stay hands-on with **step mode** (reviewing each step) or let SF run aut
```bash
# Install
npm install -g sf-run
npm install -g singularity-forge
# Launch
sf

View file

@ -3,7 +3,7 @@
## Install SF
```bash
npm install -g sf-run
npm install -g singularity-forge
```
Requires **Node.js 24.0.0 or later** (24 LTS recommended) and **Git**.
@ -57,7 +57,7 @@ The extension provides:
- **Sidebar dashboard** — connection status, model info, token usage, quick actions
- **Full command palette** — start/stop agent, switch models, export sessions
The CLI (`sf-run`) must be installed first — the extension connects to it via RPC.
The CLI (`singularity-forge`) must be installed first — the extension connects to it via RPC.
## Web Interface

View file

@ -33,7 +33,7 @@
"links": [
{
"label": "GitHub",
"href": "https://github.com/singularity-forge/sf-run"
"href": "https://github.com/singularity-ng/singularity-forge"
}
],
"primary": {
@ -44,7 +44,7 @@
},
"footer": {
"socials": {
"github": "https://github.com/singularity-forge/sf-run"
"github": "https://github.com/singularity-ng/singularity-forge"
}
},
"navigation": {

View file

@ -6,7 +6,7 @@ description: "Install SF, configure your LLM provider, and run your first autono
## Install
```bash
npm install -g sf-run
npm install -g singularity-forge
```
Requires Node.js 22+ and Git.
@ -156,7 +156,7 @@ SF is also available as a VS Code extension (publisher: FluxLabs). It provides:
- **Sidebar dashboard** — connection status, model info, token usage, quick actions
- **Full command palette** — start/stop agent, switch models, export sessions
The CLI (`sf-run`) must be installed first — the extension connects to it via RPC.
The CLI (`singularity-forge`) must be installed first — the extension connects to it via RPC.
## Web interface

View file

@ -38,7 +38,7 @@ It checks file structure, referential integrity, completion state consistency, g
source ~/.zshrc
```
**Workaround:** `npx sf-run` or `$(npm prefix -g)/bin/sf`
**Workaround:** `npx singularity-forge` or `$(npm prefix -g)/bin/sf`
</Accordion>
<Accordion title="Provider errors during auto mode">

package-lock.json (generated) — 899 changes

File diff suppressed because it is too large

View file

@ -1,15 +1,15 @@
{
"name": "singularity-forge",
"version": "2.75.2",
"version": "2.75.3",
"description": "Singularity Forge runtime core",
"license": "MIT",
"repository": {
"type": "git",
"url": "https://github.com/singularity-ng/singularity-foundry.git"
"url": "https://github.com/singularity-ng/singularity-forge.git"
},
"homepage": "https://github.com/singularity-ng/singularity-foundry#readme",
"homepage": "https://github.com/singularity-ng/singularity-forge#readme",
"bugs": {
"url": "https://github.com/singularity-ng/singularity-foundry/issues"
"url": "https://github.com/singularity-ng/singularity-forge/issues"
},
"type": "module",
"workspaces": [
@ -105,20 +105,21 @@
"release:changelog": "node scripts/generate-changelog.mjs",
"release:bump": "node scripts/bump-version.mjs",
"release:update-changelog": "node scripts/update-changelog.mjs",
"docker:build-runtime": "docker build --target runtime -t ghcr.io/singularity-forge/sf-run .",
"docker:build-runtime": "docker build --target runtime -t ghcr.io/singularity-ng/singularity-forge .",
"docker:build-builder": "docker build --target builder -t ghcr.io/singularity-forge/sf-ci-builder .",
"prepublishOnly": "npm run sync-pkg-version && npm run sync-platform-versions && node scripts/prepublish-check.mjs && npm run build && npm run typecheck:extensions && npm run validate-pack",
"test:live-regression": "node --experimental-strip-types tests/live-regression/run.ts"
},
"dependencies": {
"@anthropic-ai/sdk": "^0.92.0",
"@anthropic-ai/sdk": "^0.93.0",
"@anthropic-ai/vertex-sdk": "^0.14.4",
"@aws-sdk/client-bedrock-runtime": "^3.983.0",
"@clack/prompts": "^1.1.0",
"@google/gemini-cli-core": "^0.40.1",
"@google/genai": "^1.40.0",
"@mariozechner/jiti": "^2.6.2",
"@mistralai/mistralai": "^2.2.1",
"@modelcontextprotocol/sdk": "^1.27.1",
"@modelcontextprotocol/sdk": "^1.29.0",
"@octokit/rest": "^22.0.1",
"@silvia-odwyer/photon-node": "^0.3.4",
"@sinclair/typebox": "^0.34.49",
@ -169,7 +170,7 @@
"vitest": "^4.1.5"
},
"optionalDependencies": {
"@anthropic-ai/claude-agent-sdk": "^0.2.83",
"@anthropic-ai/claude-agent-sdk": "^0.2.128",
"@singularity-forge/engine-darwin-arm64": ">=2.10.2",
"@singularity-forge/engine-darwin-x64": ">=2.10.2",
"@singularity-forge/engine-linux-arm64-gnu": ">=2.10.2",

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/daemon",
"version": "2.75.0",
"version": "2.75.3",
"description": "sf-run daemon — background process for project monitoring and Discord integration",
"license": "MIT",
"repository": {
@ -29,8 +29,8 @@
"test": "vitest run packages/daemon/src --root ../.. --config vitest.config.ts"
},
"dependencies": {
"@anthropic-ai/sdk": "^0.92.0",
"@singularity-forge/rpc-client": "^2.75.0",
"@anthropic-ai/sdk": "^0.93.0",
"@singularity-forge/rpc-client": "^2.75.3",
"discord.js": "^14.25.1",
"yaml": "^2.8.0",
"zod": "^3.24.0"

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/native",
"version": "2.75.0",
"version": "2.75.3",
"description": "Native Rust bindings for sf-run — high-performance native modules via N-API",
"type": "commonjs",
"main": "./dist/index.js",

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/pi-agent-core",
"version": "2.75.0",
"version": "2.75.3",
"description": "General-purpose agent core (vendored from pi-mono)",
"type": "module",
"main": "./dist/index.js",

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/pi-ai",
"version": "2.75.0",
"version": "2.75.3",
"description": "Unified LLM API (vendored from pi-mono)",
"type": "module",
"main": "./dist/index.js",
@ -23,10 +23,10 @@
"build": "tsc -p tsconfig.json"
},
"dependencies": {
"@anthropic-ai/sdk": "^0.92.0",
"@anthropic-ai/sdk": "^0.93.0",
"@anthropic-ai/vertex-sdk": "^0.14.4",
"@aws-sdk/client-bedrock-runtime": "^3.983.0",
"@google/gemini-cli-core": "0.38.2",
"@google/gemini-cli-core": "0.40.1",
"@google/genai": "^1.40.0",
"@mistralai/mistralai": "^2.2.1",
"@sinclair/typebox": "^0.34.41",

View file

@ -1117,10 +1117,10 @@ async function generateModels() {
});
}
// Google Cloud Code Assist models (Gemini CLI) — sourced from
// Google Cloud Code Assist models — sourced from
// @google/gemini-cli-core's VALID_GEMINI_MODELS so new models ship
// automatically on `npm update @google/gemini-cli-core`. cli-core is
// the authoritative list (what the real `gemini` CLI binary supports).
// the authoritative list for Code Assist-backed Gemini models.
//
// We filter out `*-customtools` preview variants — they require a
// specific tool protocol that SF's generic adapter doesn't speak.
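The filter described in the comment above boils down to a suffix check. A minimal sketch — the model IDs here are illustrative, not the actual list sourced from cli-core:

```typescript
// Illustrative model IDs; the real list comes from
// @google/gemini-cli-core's VALID_GEMINI_MODELS, not this array.
const modelIds: string[] = [
  "gemini-2.5-pro",
  "gemini-2.5-flash",
  "gemini-2.5-flash-customtools", // preview variant SF's generic adapter can't drive
];

// Drop the *-customtools preview variants, keep everything else.
const usable = modelIds.filter((id) => !id.endsWith("-customtools"));
```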

View file

@ -80,7 +80,7 @@ const server = new CodeAssistServer(authClient, projectId);
cli-core reads `~/.gemini/oauth_creds.json` (migrated to keychain on newer
installs), refreshes tokens, writes back. SF's `/login` flow for this provider
becomes "run the real `gemini` binary first" — exactly what the user asked for.
becomes "let cli-core own the login flow" instead of reimplementing Google OAuth in SF.
Pros: full integration benefit, SF drops ~80 lines of auth management.
Cons: breaks existing SF auth storage path for this provider; users must

View file

@ -2,8 +2,8 @@
* Google Gemini CLI provider.
*
* Delegates auth, project discovery, and the Code Assist transport to
* @google/gemini-cli-core the same library the real `gemini` CLI uses.
* cli-core reads ~/.gemini/oauth_creds.json itself, refreshes tokens,
* @google/gemini-cli-core the library behind Google's Gemini tooling.
* cli-core reads ~/.gemini/oauth_creds.json itself when present, refreshes tokens,
* discovers the project (free-tier or whatever's onboarded server-side)
* via setupUser(), and handles all the User-Agent / quota-classification details.
* Request retry/fallback stays in the caller so SF can move to the next model.
@ -101,9 +101,9 @@ let toolCallCounter = 0;
/**
* Build a CodeAssistServer using cli-core's own auth + project discovery.
*
* - getOauthClient() reads ~/.gemini/oauth_creds.json, refreshes if expired,
* and returns an authenticated AuthClient. Triggers the browser OAuth flow
* on cache miss.
* - getOauthClient() reads ~/.gemini/oauth_creds.json when present, refreshes if
* expired, and returns an authenticated AuthClient. cli-core owns any
* interactive login flow it needs.
* - setupUser() asks the Code Assist API for the project + tier tied to this
* identity (free-tier auto-provisioned if needed; otherwise whatever the
* user has been onboarded to server-side).
@ -211,8 +211,8 @@ export const streamGoogleGeminiCli: StreamFunction<
};
try {
// cli-core handles auth + project discovery. If ~/.gemini/oauth_creds.json
// is missing the user needs to run the real `gemini` CLI to authenticate.
// cli-core handles auth + project discovery. SF uses cli-core directly
// and does not spawn a separate provider CLI process.
const server = await getCodeAssistServer();
let req = buildRequest(model, context, options);
const nextReq = await options?.onPayload?.(req, model);

View file

@ -10,8 +10,8 @@
*
* Note: Google Cloud Code Assist (google-gemini-cli) is not handled here.
* The provider delegates to @google/gemini-cli-core, which reads
* ~/.gemini/oauth_creds.json directly. Users authenticate via the real
* `gemini` CLI; we just consume the credentials.
* ~/.gemini/oauth_creds.json when present and owns any login flow it needs.
* SF uses cli-core directly and does not spawn a separate provider CLI process.
*
* Note: OpenAI Codex (ChatGPT) is not handled here via OAuth flows.
* The real `codex` CLI writes auth state to ~/.codex/auth.json after login.

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/pi-coding-agent",
"version": "2.75.0",
"version": "2.75.3",
"description": "Coding agent CLI (vendored from pi-mono)",
"type": "module",
"piConfig": {

View file

@ -5,20 +5,23 @@ import { parseArgs } from "./args.js";
describe("parseArgs", () => {
it("parses optional-value extension flags with implicit and explicit values", () => {
const extensionFlags = new Map([
["genai-proxy", { type: "string" as const, allowNoValue: true }],
["gemini-cli-proxy", { type: "string" as const, allowNoValue: true }],
]);
const defaultFlagArgs = parseArgs(["--gemini-cli-proxy"], extensionFlags);
const explicitFlagArgs = parseArgs(
["--gemini-cli-proxy=8080"],
const defaultFlagArgs = parseArgs(["--genai-proxy"], extensionFlags);
const explicitFlagArgs = parseArgs(["--genai-proxy=8080"], extensionFlags);
const legacyFlagArgs = parseArgs(
["--gemini-cli-proxy=3001"],
extensionFlags,
);
assert.deepEqual(
[
defaultFlagArgs.unknownFlags.get("gemini-cli-proxy"),
explicitFlagArgs.unknownFlags.get("gemini-cli-proxy"),
defaultFlagArgs.unknownFlags.get("genai-proxy"),
explicitFlagArgs.unknownFlags.get("genai-proxy"),
legacyFlagArgs.unknownFlags.get("gemini-cli-proxy"),
],
[true, "8080"],
[true, "8080", "3001"],
);
});
});

View file

@ -320,7 +320,7 @@ ${chalk.bold("Examples:")}
${APP_NAME} --models claude-sonnet,claude-haiku,gpt-4o
# Limit to a specific provider with glob pattern
${APP_NAME} --models "github-copilot/*"
${APP_NAME} --models "openrouter/*"
# Cycle models with fixed thinking levels
${APP_NAME} --models sonnet:high,haiku:low
@ -344,10 +344,8 @@ ${chalk.bold("Environment Variables:")}
AZURE_OPENAI_RESOURCE_NAME - Azure OpenAI resource name (alternative to base URL)
AZURE_OPENAI_API_VERSION - Azure OpenAI API version (default: v1)
AZURE_OPENAI_DEPLOYMENT_NAME_MAP - Azure OpenAI model=deployment map (comma-separated)
GEMINI_API_KEY - Google Gemini API key
GROQ_API_KEY - Groq API key
CEREBRAS_API_KEY - Cerebras API key
XAI_API_KEY - xAI Grok API key
OPENROUTER_API_KEY - OpenRouter API key
AI_GATEWAY_API_KEY - Vercel AI Gateway API key
ZAI_API_KEY - ZAI API key

View file

@ -79,8 +79,8 @@ function validateNotGoogleOAuthToken(provider: string, key: string): void {
`\n\nIf you're using Google's Gemini CLI, its OAuth tokens are not compatible. ` +
`Either:\n` +
` 1. Get an API key from https://aistudio.google.com/apikey and set GEMINI_API_KEY\n` +
` 2. Authenticate with the real \`gemini\` CLI; the google-gemini-cli ` +
`provider reads ~/.gemini/oauth_creds.json automatically`,
` 2. Use the google-gemini-cli provider, which delegates OAuth handling ` +
`to @google/gemini-cli-core`,
);
}
}
@ -956,7 +956,7 @@ export class AuthStorage {
new Error(
`Blocked Google OAuth access token (ya29.*) for provider "${providerId}". ` +
`Use an API key from https://aistudio.google.com/apikey, or authenticate with ` +
`the real \`gemini\` CLI to use the google-gemini-cli provider.`,
`the google-gemini-cli provider, which delegates OAuth handling to @google/gemini-cli-core.`,
),
);
return undefined;
@ -996,7 +996,7 @@ export class AuthStorage {
new Error(
`GEMINI_API_KEY contains a Google OAuth access token (ya29.*), not an API key. ` +
`Get an API key from https://aistudio.google.com/apikey, or authenticate with ` +
`the real \`gemini\` CLI to use the google-gemini-cli provider.`,
`the google-gemini-cli provider, which delegates OAuth handling to @google/gemini-cli-core.`,
),
);
return undefined;
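Both error paths above hinge on the same credential-type test. A hedged sketch of that check — the helper name is ours, not SF's:

```typescript
// Google OAuth access tokens start with "ya29."; Gemini API keys (AIza…) do
// not, so a prefix test is enough to reject the wrong credential type early.
function looksLikeGoogleOAuthToken(key: string): boolean {
  return key.startsWith("ya29.");
}
```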

View file

@ -347,22 +347,14 @@ describe("ModelRegistry.getModelsForProxy — family priority ordering", () => {
);
});
it("Gemini family: google-gemini-cli before google before google-vertex", () => {
it("Gemini family: google-gemini-cli only for bare model routing", () => {
const registry = createRegistry();
registerNone(registry, "google-vertex", "gemini-2.5-pro");
registerNone(registry, "google", "gemini-2.5-pro");
registerNone(registry, "google-gemini-cli", "gemini-2.5-pro");
const result = registry.getModelsForProxy("gemini-2.5-pro");
const providers = result.map((m) => m.provider);
assert.equal(
providers[0],
"google-gemini-cli",
"free CLI first for Gemini",
);
assert.ok(
providers.indexOf("google") < providers.indexOf("google-vertex"),
"paid API before vertex",
);
assert.deepEqual(providers, ["google-gemini-cli"]);
});
it("provider not in any family rule falls back to end of list", () => {

View file

@ -71,6 +71,8 @@ export const PROXY_FAMILY_PRIORITY: ReadonlyArray<{
* honest about which endpoints are "native" vs "via intermediary".
*/
family_failover?: string[];
/** Disable generic fallback for families that must stay on one provider. */
global_fallback?: boolean;
}> = [
// MiniMax direct (api.minimax.io) → CN endpoint as its direct pair
{
@ -86,36 +88,28 @@ export const PROXY_FAMILY_PRIORITY: ReadonlyArray<{
// or the Token Plan endpoint (token-plan-sgp.xiaomimimo.com). Both served
// under the `xiaomi` provider namespace.
{ match: /^mimo-|^XiaomiMiMo\//i, prefix: "mimo-", providers: ["xiaomi"] },
// Gemini/Gemma: google-gemini-cli (CLI OAuth via ~/.gemini), google
// (API key), google-vertex are all FIRST-PARTY Google endpoints.
// github-copilot re-serves and is failover only.
// Gemini/Gemma: route bare model IDs through google-gemini-cli only.
// Direct GenAI and Vertex providers stay explicit provider-qualified routes,
// but they are hidden from normal SF/TUI selection and default fallback.
{
match: /^gemini-|^gemma-/i,
prefix: "gemini-",
providers: ["google-gemini-cli", "google", "google-vertex"],
family_failover: ["github-copilot"],
providers: ["google-gemini-cli"],
global_fallback: false,
},
// Claude: Anthropic is the ONLY direct provider. github-copilot re-serves
// Claude via GitHub's platform as failover.
// Claude: Anthropic is the default provider. Copilot is disabled.
{
match: /^claude-/i,
prefix: "claude-",
providers: ["anthropic"],
family_failover: ["github-copilot"],
},
// GPT / o-series / codex: OpenAI is direct. azure-openai-responses is
// Microsoft's re-serving of OpenAI weights — treated as failover (it is
// the same weights via a different legal/contractual relationship).
// github-copilot likewise re-serves.
// Microsoft's re-serving of OpenAI weights — treated as failover. Copilot is disabled.
{
match: /^gpt-|^o\d|^codex-/i,
prefix: "gpt-",
providers: ["openai"],
family_failover: [
"azure-openai-responses",
"openai-codex",
"github-copilot",
],
family_failover: ["azure-openai-responses", "openai-codex"],
},
];
@ -251,6 +245,10 @@ const OPENCODE_FREE_MODEL_IDS = new Set([
const HIDDEN_MODEL_PROVIDERS = new Set([
"claude-code",
"google",
"google-vertex",
"github-copilot",
"xai",
"xiaomi-token-plan-ams",
"xiaomi-token-plan-cn",
"xiaomi-token-plan-sgp",
@ -1159,7 +1157,10 @@ export class ModelRegistry {
const familyProviders = overrideEntry?.[1] ?? familyEntry?.providers ?? [];
const familyFailover = familyEntry?.family_failover ?? [];
const seen = new Set([...familyProviders, ...familyFailover]);
const globalFallback = GLOBAL_PROVIDER_FALLBACK.filter((p) => !seen.has(p));
const globalFallback =
familyEntry?.global_fallback === false
? []
: GLOBAL_PROVIDER_FALLBACK.filter((p) => !seen.has(p));
return [...familyProviders, ...familyFailover, ...globalFallback];
}
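Taken together, the changes above mean a bare `gemini-2.5-pro` resolves to `google-gemini-cli` alone, while families without `global_fallback: false` still append the generic fallback chain. A self-contained sketch of that resolution order — the rule and fallback lists are illustrative, not SF's actual tables:

```typescript
// Hypothetical reconstruction of the resolution logic described above; the
// rule shape mirrors PROXY_FAMILY_PRIORITY but the entries are examples only.
type FamilyRule = {
  match: RegExp;
  providers: string[];
  family_failover?: string[];
  global_fallback?: boolean;
};

const GLOBAL_PROVIDER_FALLBACK = ["openrouter", "opencode"]; // assumed ordering

const RULES: FamilyRule[] = [
  // Bare Gemini IDs stay pinned to cli-core: no failover, no generic fallback.
  { match: /^gemini-|^gemma-/i, providers: ["google-gemini-cli"], global_fallback: false },
  // Claude keeps the generic fallback chain after its direct provider.
  { match: /^claude-/i, providers: ["anthropic"] },
];

function resolveProviders(modelId: string): string[] {
  const rule = RULES.find((r) => r.match.test(modelId));
  if (!rule) return [...GLOBAL_PROVIDER_FALLBACK];
  const failover = rule.family_failover ?? [];
  const seen = new Set([...rule.providers, ...failover]);
  const globalFallback =
    rule.global_fallback === false
      ? []
      : GLOBAL_PROVIDER_FALLBACK.filter((p) => !seen.has(p));
  return [...rule.providers, ...failover, ...globalFallback];
}
```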

View file

@ -29,25 +29,20 @@ const PROXY_FAMILY_PRIORITY: Array<{ match: RegExp; providers: string[] }> = [
{ match: /^glm-/i, providers: ["zai", "opencode", "opencode-go"] },
// Kimi: kimi-coding direct > opencode aggregators
{ match: /^kimi-/i, providers: ["kimi-coding", "opencode", "opencode-go"] },
// Gemini/Gemma: google direct > vertex (enterprise) > CLI (OAuth) > copilot
// Gemini/Gemma: proxy bare model IDs through cli-core only.
{
match: /^gemini-|^gemma-/i,
providers: [
"google",
"google-vertex",
"google-gemini-cli",
"github-copilot",
],
providers: ["google-gemini-cli"],
},
// Claude: anthropic direct > opencode > copilot
// Claude: anthropic direct > opencode. Copilot is disabled.
{
match: /^claude-/i,
providers: ["anthropic", "opencode", "github-copilot"],
providers: ["anthropic", "opencode"],
},
// GPT/OpenAI: openai direct > azure > copilot
// GPT/OpenAI: openai direct > azure. Copilot is disabled.
{
match: /^gpt-|^o[0-9]|^codex-/i,
providers: ["openai", "azure-openai-responses", "github-copilot"],
providers: ["openai", "azure-openai-responses"],
},
];
@ -83,13 +78,7 @@ export class ProxyServer {
// 1. Model Listing
app.get(["/v1/models", "/v1beta/models"], async (req, res) => {
const providers = [
"google",
"google-gemini-cli",
"google-vertex",
"anthropic",
"openai",
];
const providers = ["google-gemini-cli", "anthropic", "openai"];
const allModels = providers.flatMap((p) => getModels(p as any));
const formatted = allModels.map((m) => ({

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/pi-tui",
"version": "2.75.0",
"version": "2.75.3",
"description": "Terminal User Interface library (vendored from pi-mono)",
"type": "module",
"main": "./dist/index.js",

View file

@ -1,6 +1,6 @@
{
"name": "@singularity-forge/rpc-client",
"version": "2.75.0",
"version": "2.75.3",
"description": "Standalone RPC client SDK for sf-run — zero internal dependencies",
"license": "MIT",
"repository": {

View file

@ -1,6 +1,6 @@
{
"name": "sf",
"version": "2.75.0",
"version": "2.75.3",
"engines": {
"node": ">=24.15.0"
},

View file

@ -333,7 +333,7 @@ function extractCostFromNdjson(mid) {
// Auto-detect the SF loader path — works across npm global, homebrew, and local installs
function findSfLoader() {
// 1. Check if we're running from inside the sf-run repo itself
// 1. Check if we're running from inside the singularity-forge repo itself
const repoLoader = path.resolve(
import.meta.dirname,
"..",
@ -349,7 +349,7 @@ function findSfLoader() {
timeout: 3000,
}).trim();
const candidates = [
path.join(globalRoot, "sf-run", "dist", "loader.js"),
path.join(globalRoot, "singularity-forge", "dist", "loader.js"),
path.join(globalRoot, "@sf", "pi", "dist", "loader.js"),
];
for (const c of candidates) {

View file

@ -180,7 +180,7 @@ try {
console.log(
"==> Verifying @singularity-forge/* workspace package resolution...",
);
const installedRoot = join(installDir, "node_modules", "sf-run");
const installedRoot = join(installDir, "node_modules", "singularity-forge");
const criticalPackages = [
{ scope: "@singularity-forge", name: "pi-coding-agent" },
{ scope: "@singularity-forge", name: "rpc-client" },

View file

@ -64,7 +64,7 @@ function exitIfManagedResourcesAreNewer(currentAgentDir: string): void {
process.stderr.write(
`[sf] ${chalk.yellow("Version mismatch detected")}\n` +
`[sf] Synced resources are from ${chalk.bold(`v${managedVersion}`)}, but this \`sf\` binary is ${chalk.dim(`v${currentVersion}`)}.\n` +
`[sf] Run ${chalk.bold("npm install -g sf-run@latest")} or ${chalk.bold("sf update")}, then try again.\n`,
`[sf] Run ${chalk.bold("npm install -g singularity-forge@latest")} or ${chalk.bold("sf update")}, then try again.\n`,
);
process.exit(1);
}

View file

@ -1570,7 +1570,7 @@ async function runHeadlessOnce(
error("Failed to start RPC session", {
operation: "RpcClient.start",
file: cliPath,
guidance: "Verify SF_BIN_PATH is set or reinstall sf-run",
guidance: "Verify SF_BIN_PATH is set or reinstall singularity-forge",
cause: err,
}),
"[headless]",

View file

@ -19,7 +19,7 @@ const SUBCOMMAND_HELP: Record<string, string> = {
"",
"Update SF to the latest version.",
"",
"Equivalent to: npm install -g sf-run@latest",
"Equivalent to: npm install -g singularity-forge@latest",
].join("\n"),
sessions: [

View file

@ -360,13 +360,13 @@ if (missingPackages.length > 0) {
process.stderr.write(
`\nError: SF installation is broken — missing packages: ${missing}\n\n` +
`This is usually caused by one of:\n` +
` • An outdated version installed from npm (run: npm install -g singularity-foundry@latest)\n` +
` • An outdated version installed from npm (run: npm install -g singularity-forge@latest)\n` +
` • The packages/ directory was excluded from the installed tarball\n` +
` • A filesystem error prevented linking or copying the workspace packages\n\n` +
`Fix it by reinstalling:\n\n` +
` npm install -g singularity-foundry@latest\n\n` +
` npm install -g singularity-forge@latest\n\n` +
`If the issue persists, please open an issue at:\n` +
` https://github.com/singularity-ng/singularity-foundry/issues\n`,
` https://github.com/singularity-ng/singularity-forge/issues\n`,
);
process.exit(1);
}


@@ -67,11 +67,9 @@ const LLM_PROVIDER_IDS = [
"anthropic-vertex",
"claude-code",
"openai",
"github-copilot",
"openai-codex",
"google-gemini-cli",
"groq",
"xai",
"openrouter",
"mistral",
"xiaomi",
@@ -88,7 +86,6 @@ const API_KEY_PREFIXES: Record<string, string[]> = {
const OTHER_PROVIDERS = [
{ value: "groq", label: "Groq", hint: "console.groq.com/keys" },
{ value: "xai", label: "xAI (Grok)", hint: "console.x.ai" },
{
value: "openrouter",
label: "OpenRouter",
@@ -382,7 +379,7 @@ async function runLlmStep(
{
value: "browser",
label: "Sign in with your browser",
hint: "GitHub Copilot, ChatGPT, Google, etc.",
hint: "ChatGPT, Google Code Assist, ZAI, etc.",
},
{
value: "api-key",
@@ -440,9 +437,8 @@ async function runLlmStep(
const provider = await p.select({
message: "Choose provider",
options: [
{ value: "github-copilot", label: "GitHub Copilot" },
{ value: "openai-codex", label: "ChatGPT Plus/Pro (Codex)" },
{ value: "google-gemini-cli", label: "Google Gemini CLI" },
{ value: "google-gemini-cli", label: "Google Code Assist" },
],
});
if (p.isCancel(provider)) return false;


@@ -18,11 +18,9 @@ const PI_SETTINGS_PATH = join(homedir(), ".pi", "agent", "settings.json");
const LLM_PROVIDER_IDS = [
"anthropic",
"openai",
"github-copilot",
"openai-codex",
"google-gemini-cli",
"groq",
"xai",
"openrouter",
"mistral",
];


@@ -2,7 +2,7 @@
"id": "genai-proxy",
"name": "GenAI Proxy",
"version": "1.0.0",
"description": "OpenAI-compatible proxy for Gemini CLI and GenAI clients",
"description": "OpenAI-compatible proxy for GenAI clients",
"tier": "bundled",
"requires": {
"platform": ">=2.29.0"


@@ -1,7 +1,8 @@
import { createProxyServer } from "./proxy-server.js";
const PROXY_COMMAND_NAME = "genai-proxy";
const PROXY_FLAG_NAME = "gemini-cli-proxy";
const PROXY_FLAG_NAME = "genai-proxy";
const LEGACY_PROXY_FLAG_NAME = "gemini-cli-proxy";
const DEFAULT_PROXY_PORT = 3000;
export function installGenaiProxyExtension(api, dependencies) {
let proxyServer = null;
@@ -20,17 +21,24 @@ export function installGenaiProxyExtension(api, dependencies) {
});
return proxyServer;
};
const startProxyFromFlag = async (value, context) => {
const server = ensureProxyServer(context, resolveProxyPort(value));
await server.start();
};
api.registerFlag(PROXY_FLAG_NAME, {
description: "Start the Gemini CLI proxy server",
description: "Start the GenAI proxy server",
type: "string",
allowNoValue: true,
onStartup: async (value, context) => {
const server = ensureProxyServer(context, resolveProxyPort(value));
await server.start();
},
onStartup: startProxyFromFlag,
});
api.registerFlag(LEGACY_PROXY_FLAG_NAME, {
description: "Legacy alias for --genai-proxy",
type: "string",
allowNoValue: true,
onStartup: startProxyFromFlag,
});
api.registerCommand(PROXY_COMMAND_NAME, {
description: "Manage the Gemini CLI proxy server",
description: "Manage the GenAI proxy server",
handler: async (args, context) => {
await handleProxyCommand(
args ?? "",

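The alias registration above keeps a single startup path for both flag spellings. A minimal sketch of that pattern, using a hypothetical stand-in for the extension API's flag registry (the real `api.registerFlag` also carries `description`, `type`, and `allowNoValue`):

```javascript
// Hypothetical stand-in for the extension API's flag registry.
function createFlagRegistry() {
  const flags = new Map();
  return {
    registerFlag(name, spec) {
      flags.set(name, spec);
    },
    startup(name, value) {
      const spec = flags.get(name);
      if (!spec) throw new Error(`unknown flag: --${name}`);
      spec.onStartup(value);
    },
  };
}

const started = [];
// One shared handler, mirroring startProxyFromFlag above; 3000 stands in
// for the default proxy port when the flag is given without a value.
const startProxyFromFlag = (value) => started.push(value ?? 3000);

const api = createFlagRegistry();
api.registerFlag("genai-proxy", { onStartup: startProxyFromFlag });
// Legacy alias dispatches to the exact same handler.
api.registerFlag("gemini-cli-proxy", { onStartup: startProxyFromFlag });

api.startup("genai-proxy", 4100);
api.startup("gemini-cli-proxy", undefined);
```

Because both spellings share one code path, retiring `--gemini-cli-proxy` later only means deleting a single `registerFlag` call.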

@@ -2,8 +2,7 @@
* Google Search Extension
*
* Provides a `google_search` tool that performs web searches via Gemini's
* Google Search grounding feature. Uses the user's existing GEMINI_API_KEY or
* GOOGLE_GENERATIVE_AI_API_KEY and Google Cloud GenAI credits.
* Google Search grounding feature through @google/gemini-cli-core.
*
* The tool sends queries to Gemini Flash with `googleSearch: {}` enabled.
* Gemini internally performs Google searches, synthesizes an answer, and
@@ -23,85 +22,54 @@ import {
resolveSearchProvider,
} from "../search-the-web/provider.js";
let client = null;
function getGeminiApiKey() {
return process.env.GEMINI_API_KEY || process.env.GOOGLE_GENERATIVE_AI_API_KEY;
}
async function getClient() {
if (!client) {
const { GoogleGenAI } = await import("@google/genai");
client = new GoogleGenAI({ apiKey: getGeminiApiKey() });
}
return client;
/**
* Build a Code Assist server through @google/gemini-cli-core.
*
* The OAuth fallback uses cli-core directly instead of carrying SF's own
* Cloud Code Assist wire client.
*/
async function buildCodeAssistServer(accessToken, projectId) {
const [{ CodeAssistServer }, { OAuth2Client }] = await Promise.all([
import("@google/gemini-cli-core"),
import("google-auth-library"),
]);
const authClient = new OAuth2Client();
authClient.setCredentials({ access_token: accessToken });
return new CodeAssistServer(authClient, projectId, { headers: {} });
}
/**
* Perform a search using OAuth credentials via the Cloud Code Assist API.
* This is used as a fallback when a Gemini API key env var is not set.
* Perform a search using OAuth credentials via @google/gemini-cli-core.
*/
async function searchWithOAuth(query, accessToken, projectId, signal) {
const model = process.env.GEMINI_SEARCH_MODEL || "gemini-2.5-flash";
const url = `https://cloudcode-pa.googleapis.com/v1internal:streamGenerateContent?alt=sse`;
const GEMINI_CLI_HEADERS = {
ideType: "IDE_UNSPECIFIED",
platform: "PLATFORM_UNSPECIFIED",
pluginType: "GEMINI",
};
const executeFetch = async (retries = 3) => {
const response = await fetch(url, {
method: "POST",
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json",
"User-Agent": "google-cloud-sdk vscode_cloudshelleditor/0.1",
"X-Goog-Api-Client": "gl-node/22.17.0",
"Client-Metadata": JSON.stringify(GEMINI_CLI_HEADERS),
const server = await buildCodeAssistServer(accessToken, projectId);
const promptId = `sf-google-search-${Date.now()}-${Math.random().toString(36).slice(2, 11)}`;
const chunks = server.generateContentStream(
{
model,
contents: [{ role: "user", parts: [{ text: query }] }],
config: {
tools: [{ googleSearch: {} }],
},
body: JSON.stringify({
project: projectId,
model,
request: {
contents: [{ parts: [{ text: query }] }],
tools: [{ googleSearch: {} }],
},
userAgent: "pi-coding-agent",
}),
signal,
});
if (
!response.ok &&
retries > 0 &&
(response.status === 429 || response.status >= 500)
) {
await new Promise((resolve) => setTimeout(resolve, 1000 * (4 - retries)));
return executeFetch(retries - 1);
},
promptId,
"USER",
);
let answer = "";
let grounding;
for await (const chunk of chunks) {
if (signal?.aborted) {
throw new Error("Request was aborted");
}
const candidate = chunk?.candidates?.[0];
const text = candidate?.content?.parts?.find((p) => p.text)?.text;
if (text) {
answer += text;
}
if (candidate?.groundingMetadata) {
grounding = candidate.groundingMetadata;
}
return response;
};
const response = await executeFetch();
if (!response.ok) {
const errorText = await response.text();
throw new Error(
`Cloud Code Assist API error (${response.status}): ${errorText}`,
);
}
// Note: streamGenerateContent returns SSE; for now, we consume all chunks.
// For simplicity and to match the previous structure, we'll read to end.
const text = await response.text();
const jsonLines = text
.split("\n")
.filter((l) => l.startsWith("data:"))
.map((l) => l.slice(5).trim())
.filter((l) => l.length > 0);
let data;
if (jsonLines.length > 0) {
// Aggregate chunks if needed, but for now we take the last chunk or assume it's one
data = JSON.parse(jsonLines[jsonLines.length - 1]);
} else {
data = JSON.parse(text);
}
const candidate = data.response?.candidates?.[0];
const answer = candidate?.content?.parts?.find((p) => p.text)?.text ?? "";
const grounding = candidate?.groundingMetadata;
const sources = [];
const seenTitles = new Set();
if (grounding?.groundingChunks) {
@@ -230,7 +198,7 @@ export default function (pi) {
"Returns an AI-synthesized answer grounded in Google Search results, plus source URLs. " +
"Use this when you need current information from the web: recent events, documentation, " +
"product details, technical references, news, etc. " +
"Requires GEMINI_API_KEY, GOOGLE_GENERATIVE_AI_API_KEY, or Google login. Alternative to Brave-based search tools.",
"Requires Google Code Assist OAuth via the google-gemini-cli provider. Alternative to Brave-based search tools.",
promptSnippet:
"Search the web via Google Search to get current information with sources",
promptGuidelines: [
@@ -257,24 +225,20 @@ export default function (pi) {
async execute(_toolCallId, params, signal, _onUpdate, ctx) {
const startTime = Date.now();
const maxSources = Math.min(Math.max(params.maxSources ?? 5, 1), 10);
// Check for credentials
let oauthToken;
let projectId;
const geminiApiKey = getGeminiApiKey();
if (!geminiApiKey) {
const oauthRaw =
await ctx.modelRegistry.getApiKeyForProvider("google-gemini-cli");
if (oauthRaw) {
try {
const parsed = JSON.parse(oauthRaw);
oauthToken = parsed.token;
projectId = parsed.projectId;
} catch {
// Fall through to error
}
const oauthRaw =
await ctx.modelRegistry.getApiKeyForProvider("google-gemini-cli");
if (oauthRaw) {
try {
const parsed = JSON.parse(oauthRaw);
oauthToken = parsed.token;
projectId = parsed.projectId;
} catch {
// Fall through to error
}
}
if (!geminiApiKey && (!oauthToken || !projectId)) {
if (!oauthToken || !projectId) {
// No Google Code Assist OAuth — try fallback through search-the-web providers
try {
const fallbackResult = await executeFallbackSearch(
@@ -308,7 +272,7 @@ export default function (pi) {
content: [
{
type: "text",
text: "Error: No authentication found for Google Search. Please set GEMINI_API_KEY, GOOGLE_GENERATIVE_AI_API_KEY, or log in via Google.\n\nExample: export GEMINI_API_KEY=your_key or use /login google",
text: "Error: No Google Code Assist OAuth found for google_search. Configure the google-gemini-cli provider, or use a non-Google fallback search provider.",
},
],
isError: true,
@@ -340,64 +304,12 @@ export default function (pi) {
// Call Gemini with Google Search grounding
let result;
try {
if (geminiApiKey) {
const ai = await getClient();
// Add a 30-second timeout to prevent hanging (#1100)
const timeoutController = new AbortController();
const timeoutId = setTimeout(() => timeoutController.abort(), 30_000);
const combinedSignal = signal
? AbortSignal.any([signal, timeoutController.signal])
: timeoutController.signal;
let response;
try {
response = await ai.models.generateContent({
model: process.env.GEMINI_SEARCH_MODEL || "gemini-2.5-flash",
contents: params.query,
config: {
tools: [{ googleSearch: {} }],
abortSignal: combinedSignal,
},
});
} finally {
clearTimeout(timeoutId);
}
// Extract answer text
const answer = response.text ?? "";
// Extract grounding metadata
const candidate = response.candidates?.[0];
const grounding = candidate?.groundingMetadata;
// Parse sources from grounding chunks
const sources = [];
const seenTitles = new Set();
if (grounding?.groundingChunks) {
for (const chunk of grounding.groundingChunks) {
if (chunk.web) {
const title = chunk.web.title ?? "Untitled";
// Dedupe by title since URIs are redirect URLs that differ per call
if (seenTitles.has(title)) continue;
seenTitles.add(title);
// domain field is not available via Gemini API, use title as fallback
// (title is typically the domain name, e.g. "wikipedia.org")
const domain = chunk.web.domain ?? title;
sources.push({
title,
uri: chunk.web.uri ?? "",
domain,
});
}
}
}
// Extract search queries Gemini actually performed
const searchQueries = grounding?.webSearchQueries ?? [];
result = { answer, sources, searchQueries, cached: false };
} else {
result = await searchWithOAuth(
params.query,
oauthToken,
projectId,
signal,
);
}
result = await searchWithOAuth(
params.query,
oauthToken,
projectId,
signal,
);
} catch (err) {
const msg = err instanceof Error ? err.message : String(err);
let errorType = "api_error";
@@ -486,16 +398,14 @@ export default function (pi) {
// ── Session cleanup ─────────────────────────────────────────────────────
pi.on("session_shutdown", async () => {
resultCache.clear();
client = null;
});
// ── Startup notification ─────────────────────────────────────────────────
pi.on("session_start", async (_event, ctx) => {
if (getGeminiApiKey()) return;
const hasOAuth =
await ctx.modelRegistry.authStorage.hasAuth("google-gemini-cli");
if (!hasOAuth) {
ctx.ui.notify(
"Google Search: No authentication set. Log in via Google or set GEMINI_API_KEY / GOOGLE_GENERATIVE_AI_API_KEY to use google_search.",
"Google Search: No google-gemini-cli OAuth set. Configure Google Code Assist OAuth to use google_search.",
"warning",
);
}

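The chunk-consumption loop in `searchWithOAuth` can be exercised in isolation against a stub async generator that yields chunks in the shape cli-core streams (a sketch; the stub payloads are invented):

```javascript
// Stub stream yielding chunks in the shape the extension consumes.
async function* fakeChunks() {
  yield { candidates: [{ content: { parts: [{ text: "Sunny, " }] } }] };
  yield {
    candidates: [
      {
        content: { parts: [{ text: "85 F in Austin." }] },
        groundingMetadata: {
          groundingChunks: [
            {
              web: {
                title: "weather.com",
                uri: "https://weather.com/austin",
                domain: "weather.com",
              },
            },
          ],
        },
      },
    ],
  };
}

// Accumulate answer text and keep the last grounding metadata seen,
// mirroring the for-await loop in searchWithOAuth.
async function collect(chunks) {
  let answer = "";
  let grounding;
  for await (const chunk of chunks) {
    const candidate = chunk?.candidates?.[0];
    const text = candidate?.content?.parts?.find((p) => p.text)?.text;
    if (text) answer += text;
    if (candidate?.groundingMetadata) grounding = candidate.groundingMetadata;
  }
  const sources = (grounding?.groundingChunks ?? [])
    .filter((c) => c.web)
    .map((c) => ({ title: c.web.title, uri: c.web.uri, domain: c.web.domain }));
  return { answer, sources };
}
```

Keeping only the last grounding metadata matches how the extension treats the stream: grounding arrives on the final chunks, while text accumulates across all of them.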

@@ -43,7 +43,6 @@ function loadAuthJson() {
const STATUS_URLS = {
anthropic: "https://status.anthropic.com/api/v2/status.json",
codex: "https://status.openai.com/api/v2/status.json",
copilot: "https://www.githubstatus.com/api/v2/status.json",
};
async function fetchProviderStatus(provider) {
const url = STATUS_URLS[provider];
@@ -200,112 +199,6 @@ async function fetchClaudeUsage() {
}
}
// ============================================================================
// Copilot Usage
// ============================================================================
function loadCopilotRefreshToken() {
// The copilot_internal/user endpoint needs the GitHub OAuth token (ghu_*),
// NOT the Copilot session token (tid=*). The refresh token IS the GitHub OAuth token.
const data = loadAuthJson();
// Use refresh token (GitHub OAuth token ghu_*) for the usage API
if (data?.["github-copilot"]?.refresh) return data["github-copilot"].refresh;
return undefined;
}
async function fetchCopilotUsage(_modelRegistry) {
const token = loadCopilotRefreshToken();
if (!token) {
return {
provider: "copilot",
displayName: "Copilot",
windows: [],
error: "No token",
};
}
const headersBase = {
"Editor-Version": "vscode/1.96.2",
"User-Agent": "GitHubCopilotChat/0.26.7",
"X-Github-Api-Version": "2025-04-01",
Accept: "application/json",
};
const tryFetch = async (authHeader) => {
const controller = new AbortController();
setTimeout(() => controller.abort(), 5000);
const res = await fetch("https://api.github.com/copilot_internal/user", {
headers: {
...headersBase,
Authorization: authHeader,
},
signal: controller.signal,
});
return res;
};
try {
// Copilot access tokens (from /login github-copilot) expect Bearer. PATs accept "token".
// GitHub OAuth token (ghu_*) requires "token" prefix, not Bearer
const attempts = [`token ${token}`];
let lastStatus;
let res;
for (const auth of attempts) {
res = await tryFetch(auth);
lastStatus = res.status;
if (res.ok) break;
if (res.status === 401 || res.status === 403) continue; // try next scheme
break;
}
if (!res || !res.ok) {
const status = lastStatus ?? 0;
return {
provider: "copilot",
displayName: "Copilot",
windows: [],
error: `HTTP ${status}`,
};
}
const data = await res.json();
const windows = [];
// Parse reset date for display
const resetDate = data.quota_reset_date_utc
? new Date(data.quota_reset_date_utc)
: undefined;
const resetDesc = resetDate ? formatReset(resetDate) : undefined;
// Premium interactions (e.g., Claude, o1 models) - has a cap
if (data.quota_snapshots?.premium_interactions) {
const pi = data.quota_snapshots.premium_interactions;
const remaining = pi.remaining ?? 0;
const entitlement = pi.entitlement ?? 0;
const usedPercent = Math.max(0, 100 - (pi.percent_remaining || 0));
windows.push({
label: `Premium`,
usedPercent,
resetDescription: resetDesc
? `${resetDesc} (${remaining}/${entitlement})`
: `${remaining}/${entitlement}`,
});
}
// Chat quota - often unlimited, only show if limited
if (data.quota_snapshots?.chat && !data.quota_snapshots.chat.unlimited) {
const chat = data.quota_snapshots.chat;
windows.push({
label: "Chat",
usedPercent: Math.max(0, 100 - (chat.percent_remaining || 0)),
resetDescription: resetDesc,
});
}
return {
provider: "copilot",
displayName: "Copilot",
windows,
plan: data.copilot_plan,
};
} catch (e) {
return {
provider: "copilot",
displayName: "Copilot",
windows: [],
error: String(e),
};
}
}
// ============================================================================
// Gemini Usage
// ============================================================================
async function fetchGeminiUsage(_modelRegistry) {
@@ -740,13 +633,11 @@ class UsageComponent {
// Fetch usage and status in parallel
const [
claude,
copilot,
gemini,
codex,
kiro,
zai,
claudeStatus,
copilotStatus,
geminiStatus,
codexStatus,
] = await Promise.all([
@@ -756,12 +647,6 @@
windows: [],
error: "Timeout",
}),
timeout(fetchCopilotUsage(this.modelRegistry), 6000, {
provider: "copilot",
displayName: "Copilot",
windows: [],
error: "Timeout",
}),
timeout(fetchGeminiUsage(this.modelRegistry), 6000, {
provider: "gemini",
displayName: "Gemini",
@@ -789,9 +674,6 @@
timeout(fetchProviderStatus("anthropic"), 3000, {
indicator: "unknown",
}),
timeout(fetchProviderStatus("copilot"), 3000, {
indicator: "unknown",
}),
timeout(fetchGeminiStatus(), 3000, { indicator: "unknown" }),
timeout(fetchProviderStatus("codex"), 3000, {
indicator: "unknown",
@@ -799,11 +681,10 @@
]);
// Attach status to usage
claude.status = claudeStatus;
copilot.status = copilotStatus;
gemini.status = geminiStatus;
codex.status = codexStatus;
// Filter out providers with no data and no error (not configured)
const allUsages = [claude, copilot, gemini, codex, kiro, zai];
const allUsages = [claude, gemini, codex, kiro, zai];
this.usages = allUsages.filter(
(u) =>
u.windows.length > 0 ||

View file

@@ -889,9 +889,9 @@ export function resolveModelId(modelId, availableModels, currentProvider) {
* Flat-rate providers charge the same per request regardless of model.
* Dynamic routing provides no cost benefit; it only degrades quality (#3453).
* Uses case-insensitive matching with alias support to prevent fail-open on
* provider naming variations (e.g. "copilot" vs "github-copilot").
* provider naming variations.
*/
const BUILTIN_FLAT_RATE = new Set(["github-copilot", "copilot", "claude-code"]);
const BUILTIN_FLAT_RATE = new Set(["claude-code"]);
/**
* Check if a provider has flat-rate pricing where model selection provides no cost benefit.
* Consults built-in list, auth mode, and user preference list.

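The case-insensitive membership check described above can be sketched minimally; `userFlatRate` here is a hypothetical stand-in for the user preference list, and the real helper also consults auth mode:

```javascript
// Only claude-code remains in the built-in flat-rate set after this commit.
const BUILTIN_FLAT_RATE = new Set(["claude-code"]);

// Case-insensitive membership so provider-name casing cannot fail open.
function isFlatRateProvider(providerId, userFlatRate = []) {
  const normalized = providerId.trim().toLowerCase();
  if (BUILTIN_FLAT_RATE.has(normalized)) return true;
  return userFlatRate.some((p) => p.trim().toLowerCase() === normalized);
}
```

Normalizing before the `Set` lookup is what prevents variants like `Claude-Code` from slipping past a flat-rate check that would otherwise enable pointless dynamic routing.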

@@ -29,7 +29,8 @@ import { sfRoot } from "./paths.js";
import { loadPrompt } from "./prompt-loader.js";
import { deriveState } from "./state.js";
const UPDATE_REGISTRY_URL = "https://registry.npmjs.org/sf-run/latest";
const UPDATE_REGISTRY_URL =
"https://registry.npmjs.org/singularity-forge/latest";
const UPDATE_FETCH_TIMEOUT_MS = 5000;
function resolveInstallCommand(pkg) {
if ("bun" in process.versions) return `bun add -g ${pkg}`;
@@ -598,7 +599,7 @@ function compareSemverLocal(a, b) {
}
export async function handleUpdate(ctx, deps = {}) {
const { execSync } = await import("node:child_process");
const NPM_PACKAGE = "sf-run";
const NPM_PACKAGE = "singularity-forge";
const current = deps.currentVersion ?? process.env.SF_VERSION ?? "0.0.0";
ctx.ui.notify(
`Current version: v${current}\nChecking npm registry...`,

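The update flow needs an ordering over version strings to decide whether the registry's latest release is newer than the installed build. A numeric-only sketch of that comparison (the real `compareSemverLocal` in this file may cover more, e.g. prerelease tags):

```javascript
// Compare dotted numeric versions: -1 if a < b, 0 if equal, 1 if a > b.
// Prerelease suffixes are ignored in this sketch.
function compareVersions(a, b) {
  const parse = (v) => v.split(".").map((part) => Number.parseInt(part, 10) || 0);
  const pa = parse(a);
  const pb = parse(b);
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const x = pa[i] ?? 0;
    const y = pb[i] ?? 0;
    if (x !== y) return x < y ? -1 : 1;
  }
  return 0;
}
```

Component-wise numeric comparison matters here: a plain string compare would wrongly rank `1.2.3` above `1.10.0`.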

@@ -36,7 +36,6 @@ function modelToProviderId(model) {
"google-vertex": "google-vertex",
anthropic: "anthropic",
openai: "openai",
"github-copilot": "github-copilot",
};
if (prefixMap[prefix]) return prefixMap[prefix];
}
@@ -50,7 +49,6 @@ function modelToProviderId(model) {
return "openai";
if (lower.startsWith("gemini")) return "google";
if (lower.startsWith("llama") || lower.startsWith("mixtral")) return "groq";
if (lower.startsWith("grok")) return "xai";
if (lower.startsWith("mistral") || lower.startsWith("codestral"))
return "mistral";
return null;
@@ -144,11 +142,11 @@ function resolveKey(providerId) {
/**
* Providers that can serve models normally associated with another provider.
* Key = the provider whose models can be served, Value = alternative providers to check.
* e.g. GitHub Copilot subscriptions can access Claude and GPT models.
* Copilot has been removed from SF's default routing.
*/
const PROVIDER_ROUTES = {
anthropic: ["github-copilot"],
openai: ["github-copilot", "openai-codex"],
anthropic: [],
openai: ["openai-codex"],
google: ["google-gemini-cli"],
};
/**

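Assembled from the hunks above, model-to-provider resolution is an explicit `provider/model` prefix map followed by model-name heuristics; with the `github-copilot` prefix and the `grok` heuristic removed, those models now fall through to `null`. A sketch (the slash-prefix parsing and the heuristics not visible in the hunks are assumptions):

```javascript
// Sketch of the resolution order: explicit "provider/model" prefixes first,
// then model-name heuristics. grok and github-copilot no longer resolve.
function modelToProviderId(model) {
  const prefixMap = {
    "google-vertex": "google-vertex",
    anthropic: "anthropic",
    openai: "openai",
  };
  const slash = model.indexOf("/");
  if (slash > 0) {
    const prefix = model.slice(0, slash);
    if (prefixMap[prefix]) return prefixMap[prefix];
  }
  const lower = model.toLowerCase();
  if (lower.startsWith("gemini")) return "google";
  if (lower.startsWith("llama") || lower.startsWith("mixtral")) return "groq";
  if (lower.startsWith("mistral") || lower.startsWith("codestral")) return "mistral";
  return null;
}
```

A `null` result is the new behavior this commit introduces for Copilot- and xAI-style model names: callers must treat it as "no default route" rather than guessing a provider.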

@@ -27,13 +27,6 @@ export const PROVIDER_REGISTRY = [
prefixes: ["sk-"],
dashboardUrl: "platform.openai.com/api-keys",
},
{
id: "github-copilot",
label: "GitHub Copilot",
category: "llm",
envVar: "GITHUB_TOKEN",
hasOAuth: true,
},
{
id: "openai-codex",
label: "ChatGPT Plus/Pro (Codex)",
@@ -42,18 +35,10 @@
},
{
id: "google-gemini-cli",
label: "Google Gemini CLI",
label: "Google Code Assist",
category: "llm",
hasOAuth: true,
},
{
id: "google",
label: "Google (Gemini)",
category: "llm",
envVar: "GEMINI_API_KEY",
envAliases: ["GOOGLE_GENERATIVE_AI_API_KEY"],
dashboardUrl: "aistudio.google.com/apikey",
},
{
id: "groq",
label: "Groq",
@@ -61,13 +46,6 @@
envVar: "GROQ_API_KEY",
dashboardUrl: "console.groq.com",
},
{
id: "xai",
label: "xAI (Grok)",
category: "llm",
envVar: "XAI_API_KEY",
dashboardUrl: "console.x.ai",
},
{
id: "openrouter",
label: "OpenRouter",
@@ -573,10 +551,6 @@ const TEST_ENDPOINTS = {
url: "", // Constructed dynamically with token in URL
headers: () => ({}),
},
xai: {
url: "https://api.x.ai/v1/models",
headers: (key) => ({ Authorization: `Bearer ${key}` }),
},
mistral: {
url: "https://api.mistral.ai/v1/models",
headers: (key) => ({ Authorization: `Bearer ${key}` }),


@@ -35,6 +35,10 @@ const OPENCODE_FREE_MODEL_IDS = new Set([
]);
const HIDDEN_MODEL_PROVIDERS = new Set([
"claude-code",
"google",
"google-vertex",
"github-copilot",
"xai",
"xiaomi-token-plan-ams",
"xiaomi-token-plan-cn",
"xiaomi-token-plan-sgp",


@@ -18,9 +18,9 @@ test("docker/Dockerfile.sandbox exists and uses Node 24 base", () => {
assert.match(content, /FROM node:24/);
});
test("docker/Dockerfile.sandbox installs sf-run globally", () => {
test("docker/Dockerfile.sandbox installs singularity-forge globally", () => {
const content = readFile("docker/Dockerfile.sandbox");
assert.match(content, /npm install -g sf-run/);
assert.match(content, /npm install -g singularity-forge/);
});
test("docker/Dockerfile.sandbox creates a non-root user", () => {


@@ -25,10 +25,6 @@ function createMockPI() {
};
}
/**
* Build a mock modelRegistry whose getApiKeyForProvider returns the given
* JSON string (matching what the real OAuth provider's getApiKey produces).
*/
function mockModelRegistry(oauthJson?: string) {
return {
authStorage: {
@ -43,82 +39,26 @@ function restoreEnv(name: string, value: string | undefined) {
else process.env[name] = value;
}
test("fix: google-search uses OAuth if GEMINI_API_KEY is missing", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
const originalAlias = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
delete process.env.GEMINI_API_KEY;
delete process.env.GOOGLE_GENERATIVE_AI_API_KEY;
const originalFetch = global.fetch;
(global as any).fetch = async (url: string, options: any) => {
assert.ok(
url.includes("cloudcode-pa.googleapis.com"),
"Should use Cloud Code Assist endpoint",
);
assert.equal(
options.headers.Authorization,
"Bearer mock-token",
"Should use correct bearer token",
);
return {
ok: true,
json: async () => ({
response: {
candidates: [{ content: { parts: [{ text: "Mocked AI Answer" }] } }],
},
}),
text: async () =>
JSON.stringify({
response: {
candidates: [
{ content: { parts: [{ text: "Mocked AI Answer" }] } },
],
},
}),
};
};
afterEach(() => {
global.fetch = originalFetch;
restoreEnv("GEMINI_API_KEY", originalKey);
restoreEnv("GOOGLE_GENERATIVE_AI_API_KEY", originalAlias);
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const oauthJson = JSON.stringify({
token: "mock-token",
projectId: "mock-project",
});
const mockCtx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(oauthJson),
};
await pi.fire("session_start", {}, mockCtx);
const registeredTool = (pi as any).registeredTool;
const result = await registeredTool.execute(
"call-1",
{ query: "test" },
new AbortController().signal,
() => {},
mockCtx,
);
assert.equal(result.isError, undefined);
assert.ok(result.content[0].text.includes("Mocked AI Answer"));
});
test("google-search warns if NO authentication is present", async (_t) => {
test("google_search_when_no_code_assist_oauth_warns_and_falls_back_or_errors", async () => {
const originalKey = process.env.GEMINI_API_KEY;
const originalAlias = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
const originalTavily = process.env.TAVILY_API_KEY;
const originalBrave = process.env.BRAVE_API_KEY;
const originalMinimax = process.env.MINIMAX_API_KEY;
delete process.env.GEMINI_API_KEY;
delete process.env.GOOGLE_GENERATIVE_AI_API_KEY;
process.env.TAVILY_API_KEY = "";
delete process.env.BRAVE_API_KEY;
delete process.env.MINIMAX_API_KEY;
afterEach(() => {
restoreEnv("GEMINI_API_KEY", originalKey);
restoreEnv("GOOGLE_GENERATIVE_AI_API_KEY", originalAlias);
restoreEnv("TAVILY_API_KEY", originalTavily);
restoreEnv("BRAVE_API_KEY", originalBrave);
restoreEnv("MINIMAX_API_KEY", originalMinimax);
});
const pi = createMockPI();
googleSearchExtension(pi as any);
@@ -134,21 +74,21 @@ test("google-search warns if NO authentication is present", async (_t) => {
await pi.fire("session_start", {}, mockCtx);
assert.equal(notifications.length, 1);
assert.ok(notifications[0].msg.includes("No authentication set"));
assert.ok(notifications[0].msg.includes("No google-gemini-cli OAuth set"));
const registeredTool = (pi as any).registeredTool;
const result = await registeredTool.execute(
"call-2",
"call-1",
{ query: "test" },
new AbortController().signal,
() => {},
mockCtx,
);
assert.equal(result.isError, true);
assert.ok(result.content[0].text.includes("No authentication found"));
assert.ok(result.content[0].text.includes("No Google Code Assist OAuth"));
});
test("google-search uses GEMINI_API_KEY if present (precedence)", async (_t) => {
test("google_search_ignores_genai_api_key_env_and_requires_code_assist_oauth", async () => {
const originalKey = process.env.GEMINI_API_KEY;
const originalAlias = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
process.env.GEMINI_API_KEY = "mock-api-key";
@@ -158,42 +98,7 @@ test("google-search uses GEMINI_API_KEY if present (precedence)", async (_t) =>
restoreEnv("GEMINI_API_KEY", originalKey);
restoreEnv("GOOGLE_GENERATIVE_AI_API_KEY", originalAlias);
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const notifications: any[] = [];
const mockCtx = {
ui: {
notify(msg: string, level: string) {
notifications.push({ msg, level });
},
},
modelRegistry: mockModelRegistry(
JSON.stringify({
token: "should-not-be-used",
projectId: "mock-project",
}),
),
};
await pi.fire("session_start", {}, mockCtx);
assert.equal(
notifications.length,
0,
"Should NOT notify if API Key is present",
);
});
test("google-search accepts GOOGLE_GENERATIVE_AI_API_KEY", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
const originalAlias = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
delete process.env.GEMINI_API_KEY;
process.env.GOOGLE_GENERATIVE_AI_API_KEY = "mock-alias-key";
afterEach(() => {
restoreEnv("GEMINI_API_KEY", originalKey);
restoreEnv("GOOGLE_GENERATIVE_AI_API_KEY", originalAlias);
});
const pi = createMockPI();
googleSearchExtension(pi as any);
@@ -208,5 +113,6 @@ test("google-search accepts GOOGLE_GENERATIVE_AI_API_KEY", async (_t) => {
};
await pi.fire("session_start", {}, mockCtx);
assert.equal(notifications.length, 0);
assert.equal(notifications.length, 1);
assert.ok(notifications[0].msg.includes("google-gemini-cli OAuth"));
});


@@ -1,20 +1,61 @@
/**
* google-search-oauth-shape.test.ts: Regression test for #2963.
* google-search-oauth-shape.test.ts: OAuth fallback must use cli-core.
*
* The OAuth fallback in google_search manually POSTs to the Cloud Code Assist
* endpoint. The original implementation sent a request body that did not match
* the endpoint's expected contract, causing a 400 INVALID_ARGUMENT response.
*
* This test captures the fetch call and asserts that the URL and body conform
* to the Cloud Code Assist wire format used by the working provider in
* packages/pi-ai/src/providers/google-gemini-cli.ts.
* The OAuth fallback used to carry its own Cloud Code Assist HTTP client. SF now
* delegates that transport to @google/gemini-cli-core, matching the main
* google-gemini-cli provider boundary.
*/
import assert from "node:assert/strict";
import { afterEach, test } from "vitest";
import { afterEach, test, vi } from "vitest";
import googleSearchExtension from "../resources/extensions/google-search/index.js";
// ── Helpers ─────────────────────────────────────────────────────────────────
const cliCoreMock = vi.hoisted(() => ({
lastRequest: undefined as any,
lastPromptId: undefined as string | undefined,
lastUserTier: undefined as string | undefined,
}));
vi.mock("@google/gemini-cli-core", () => ({
CodeAssistServer: class {
async *generateContentStream(
request: any,
promptId: string,
userTier: string,
) {
cliCoreMock.lastRequest = request;
cliCoreMock.lastPromptId = promptId;
cliCoreMock.lastUserTier = userTier;
yield {
candidates: [
{
content: {
parts: [{ text: "Sunny, 85 F in Austin today." }],
},
groundingMetadata: {
groundingChunks: [
{
web: {
title: "weather.com",
uri: "https://weather.com/austin",
domain: "weather.com",
},
},
],
webSearchQueries: ["weather today in Austin Texas"],
},
},
],
};
}
},
}));
vi.mock("google-auth-library", () => ({
OAuth2Client: class {
setCredentials(_credentials: any) {}
},
}));
function createMockPI() {
const handlers: Array<{ event: string; handler: any }> = [];
@@ -50,231 +91,67 @@ function mockModelRegistry(oauthJson?: string) {
};
}
/** A valid SSE response body matching the Cloud Code Assist wire format. */
function makeOkSSEBody() {
const payload = {
response: {
candidates: [
{
content: {
parts: [{ text: "Sunny, 85 °F in Austin today." }],
},
groundingMetadata: {
groundingChunks: [
{
web: {
title: "weather.com",
uri: "https://weather.com/austin",
domain: "weather.com",
},
},
],
webSearchQueries: ["weather today in Austin Texas"],
},
},
],
},
async function runOAuthSearch(query: string) {
const originalKey = process.env.GEMINI_API_KEY;
const originalAlias = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
delete process.env.GEMINI_API_KEY;
delete process.env.GOOGLE_GENERATIVE_AI_API_KEY;
cliCoreMock.lastRequest = undefined;
cliCoreMock.lastPromptId = undefined;
cliCoreMock.lastUserTier = undefined;
afterEach(() => {
if (originalKey !== undefined) process.env.GEMINI_API_KEY = originalKey;
else delete process.env.GEMINI_API_KEY;
if (originalAlias !== undefined) {
process.env.GOOGLE_GENERATIVE_AI_API_KEY = originalAlias;
} else {
delete process.env.GOOGLE_GENERATIVE_AI_API_KEY;
}
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const ctx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(
JSON.stringify({ token: "tok", projectId: "proj" }),
),
};
await pi.fire("session_start", {}, ctx);
return pi.registeredTool.execute(
"c1",
{ query },
new AbortController().signal,
() => {},
ctx,
);
}
// ── Tests ────────────────────────────────────────────────────────────────────
test("#2963: OAuth fallback URL must include ?alt=sse query parameter", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
delete process.env.GEMINI_API_KEY;
const originalFetch = global.fetch;
let capturedUrl = "";
(global as any).fetch = async (url: string, _options: any) => {
capturedUrl = url;
return { ok: true, text: async () => makeOkSSEBody() };
};
afterEach(() => {
global.fetch = originalFetch;
if (originalKey !== undefined) process.env.GEMINI_API_KEY = originalKey;
else delete process.env.GEMINI_API_KEY;
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const oauthJson = JSON.stringify({ token: "tok", projectId: "proj" });
const ctx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(oauthJson),
};
await pi.fire("session_start", {}, ctx);
await pi.registeredTool.execute(
"c1",
{ query: "weather" },
new AbortController().signal,
() => {},
ctx,
);
assert.ok(
capturedUrl.includes("?alt=sse"),
`URL must contain ?alt=sse for SSE parsing to work. Got: ${capturedUrl}`,
);
});
test("google_search_oauth_when_api_key_missing_uses_cli_core_request_shape", async () => {
const result = await runOAuthSearch("weather tools test");
assert.equal(result.isError, undefined);
assert.ok(cliCoreMock.lastPromptId?.startsWith("sf-google-search-"));
assert.equal(cliCoreMock.lastUserTier, "USER");
assert.equal(cliCoreMock.lastRequest.model, "gemini-2.5-flash");
assert.deepEqual(cliCoreMock.lastRequest.contents, [
{ role: "user", parts: [{ text: "weather tools test" }] },
]);
assert.ok(
cliCoreMock.lastRequest.config.tools.some(
(tool: any) => tool.googleSearch !== undefined,
),
);
});
test("#2963: OAuth fallback body must include userAgent field", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
delete process.env.GEMINI_API_KEY;
const originalFetch = global.fetch;
let capturedBody: any = null;
(global as any).fetch = async (_url: string, options: any) => {
capturedBody = JSON.parse(options.body);
return { ok: true, text: async () => makeOkSSEBody() };
};
afterEach(() => {
global.fetch = originalFetch;
if (originalKey !== undefined) process.env.GEMINI_API_KEY = originalKey;
else delete process.env.GEMINI_API_KEY;
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const oauthJson = JSON.stringify({ token: "tok", projectId: "proj" });
const ctx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(oauthJson),
};
await pi.fire("session_start", {}, ctx);
await pi.registeredTool.execute(
"c2",
{ query: "weather userAgent test" },
new AbortController().signal,
() => {},
ctx,
);
assert.ok(capturedBody, "fetch must have been called");
assert.equal(
typeof capturedBody.userAgent,
"string",
"Body must include a userAgent field (Cloud Code Assist contract)",
);
});
test("#2963: OAuth fallback body must contain google_search tool in correct format", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
delete process.env.GEMINI_API_KEY;
const originalFetch = global.fetch;
let capturedBody: any = null;
(global as any).fetch = async (_url: string, options: any) => {
capturedBody = JSON.parse(options.body);
return { ok: true, text: async () => makeOkSSEBody() };
};
afterEach(() => {
global.fetch = originalFetch;
if (originalKey !== undefined) process.env.GEMINI_API_KEY = originalKey;
else delete process.env.GEMINI_API_KEY;
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const oauthJson = JSON.stringify({ token: "tok", projectId: "proj" });
const ctx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(oauthJson),
};
await pi.fire("session_start", {}, ctx);
await pi.registeredTool.execute(
"c3",
{ query: "weather tools test" },
new AbortController().signal,
() => {},
ctx,
);
assert.ok(capturedBody, "fetch must have been called");
const tools = capturedBody.request?.tools;
assert.ok(Array.isArray(tools), "request.tools must be an array");
assert.ok(
tools.some((t: any) => t.googleSearch !== undefined),
`tools must contain a googleSearch entry. Got: ${JSON.stringify(tools)}`,
);
});
test("#2963: OAuth fallback body has correct top-level structure", async (_t) => {
const originalKey = process.env.GEMINI_API_KEY;
delete process.env.GEMINI_API_KEY;
const originalFetch = global.fetch;
let capturedBody: any = null;
(global as any).fetch = async (_url: string, options: any) => {
capturedBody = JSON.parse(options.body);
return { ok: true, text: async () => makeOkSSEBody() };
};
afterEach(() => {
global.fetch = originalFetch;
if (originalKey !== undefined) process.env.GEMINI_API_KEY = originalKey;
else delete process.env.GEMINI_API_KEY;
});
const pi = createMockPI();
googleSearchExtension(pi as any);
const oauthJson = JSON.stringify({ token: "tok", projectId: "proj" });
const ctx = {
ui: { notify() {} },
modelRegistry: mockModelRegistry(oauthJson),
};
await pi.fire("session_start", {}, ctx);
await pi.registeredTool.execute(
"c4",
{ query: "weather structure test" },
new AbortController().signal,
() => {},
ctx,
);
assert.ok(capturedBody, "fetch must have been called");
// Top-level fields required by CloudCodeAssistRequest
assert.equal(
capturedBody.project,
"proj",
"project must match the OAuth projectId",
);
assert.ok(
typeof capturedBody.model === "string" && capturedBody.model.length > 0,
"model must be a non-empty string",
);
assert.ok(
capturedBody.request && typeof capturedBody.request === "object",
"request must be an object",
);
assert.ok(
typeof capturedBody.userAgent === "string",
"userAgent must be present",
);
// Nested request fields
assert.ok(
Array.isArray(capturedBody.request.contents),
"request.contents must be an array",
);
assert.ok(
Array.isArray(capturedBody.request.tools),
"request.tools must be an array",
);
});
test("google_search_oauth_when_cli_core_streams_grounding_returns_sources", async () => {
const result = await runOAuthSearch("weather");
assert.equal(result.isError, undefined);
assert.ok(result.content[0].text.includes("Sunny, 85 F"));
assert.equal(result.details.sourceCount, 1);
assert.equal(result.details.cached, false);
});

View file

@ -386,7 +386,7 @@ test("sf exits early with a clear message when synced resources are newer than t
);
assert.match(
result.stderr,
/npm install -g sf-run@latest|sf update/,
/npm install -g singularity-forge@latest|sf update/,
"prints upgrade guidance",
);
assert.doesNotMatch(

View file

@ -34,7 +34,6 @@ const ONBOARDING_ENV_KEYS = [
"GOOGLE_CLOUD_LOCATION",
"GROQ_API_KEY",
"CEREBRAS_API_KEY",
"XAI_API_KEY",
"OPENROUTER_API_KEY",
"AI_GATEWAY_API_KEY",
"ZAI_API_KEY",
@ -392,11 +391,9 @@ test("boot and onboarding routes expose locked required state plus explicitly sk
assert.deepEqual(providerIds, [
"anthropic",
"openai",
"github-copilot",
"openai-codex",
"google-gemini-cli",
"groq",
"xai",
"openrouter",
"mistral",
]);
@ -419,20 +416,20 @@ test("runtime env-backed auth unlocks boot onboarding state and reports the envi
const fixture = makeWorkspaceFixture();
clearOnboardingEnv();
const authStorage = AuthStorage.inMemory({});
const previousGithubToken = process.env.GITHUB_TOKEN;
process.env.GITHUB_TOKEN = "ghu_runtime_env_token";
const previousOpenAiKey = process.env.OPENAI_API_KEY;
process.env.OPENAI_API_KEY = "sk-runtime-env-token";
configureBridgeFixture(fixture, "sess-env-auth");
onboarding.configureOnboardingServiceForTests({
authStorage,
getEnvApiKey: (provider: string) =>
provider === "github-copilot" ? process.env.GITHUB_TOKEN : undefined,
provider === "openai" ? process.env.OPENAI_API_KEY : undefined,
});
afterEach(async () => {
if (previousGithubToken === undefined) {
delete process.env.GITHUB_TOKEN;
if (previousOpenAiKey === undefined) {
delete process.env.OPENAI_API_KEY;
} else {
process.env.GITHUB_TOKEN = previousGithubToken;
process.env.OPENAI_API_KEY = previousOpenAiKey;
}
onboarding.resetOnboardingServiceForTests();
await bridge.resetBridgeServiceForTests();
@ -451,14 +448,14 @@ test("runtime env-backed auth unlocks boot onboarding state and reports the envi
assert.equal(bootPayload.onboarding.lockReason, null);
assert.equal(bootPayload.onboarding.bridgeAuthRefresh.phase, "idle");
assert.deepEqual(bootPayload.onboarding.required.satisfiedBy, {
providerId: "github-copilot",
providerId: "openai",
source: "environment",
});
const copilotProvider = bootPayload.onboarding.required.providers.find(
(provider: any) => provider.id === "github-copilot",
const openAiProvider = bootPayload.onboarding.required.providers.find(
(provider: any) => provider.id === "openai",
);
assert.equal(copilotProvider.configured, true);
assert.equal(copilotProvider.configuredVia, "environment");
assert.equal(openAiProvider.configured, true);
assert.equal(openAiProvider.configuredVia, "environment");
});
test("failed API-key validation stays locked, redacts the error, and is reflected in boot state without persisting auth", async (_t) => {
@ -767,20 +764,20 @@ test("logout_provider fails clearly for environment-backed auth that the browser
const fixture = makeWorkspaceFixture();
clearOnboardingEnv();
const authStorage = AuthStorage.inMemory({});
const previousGithubToken = process.env.GITHUB_TOKEN;
process.env.GITHUB_TOKEN = "ghu_env_only_token";
const previousOpenAiKey = process.env.OPENAI_API_KEY;
process.env.OPENAI_API_KEY = "sk-env-only-token";
configureBridgeFixture(fixture, "sess-logout-env");
onboarding.configureOnboardingServiceForTests({
authStorage,
getEnvApiKey: (provider: string) =>
provider === "github-copilot" ? process.env.GITHUB_TOKEN : undefined,
provider === "openai" ? process.env.OPENAI_API_KEY : undefined,
});
afterEach(async () => {
if (previousGithubToken === undefined) {
delete process.env.GITHUB_TOKEN;
if (previousOpenAiKey === undefined) {
delete process.env.OPENAI_API_KEY;
} else {
process.env.GITHUB_TOKEN = previousGithubToken;
process.env.OPENAI_API_KEY = previousOpenAiKey;
}
onboarding.resetOnboardingServiceForTests();
await bridge.resetBridgeServiceForTests();
@ -795,7 +792,7 @@ test("logout_provider fails clearly for environment-backed auth that the browser
assert.equal(bootBeforePayload.onboarding.locked, false);
assert.equal(
bootBeforePayload.onboarding.required.satisfiedBy.providerId,
"github-copilot",
"openai",
);
assert.equal(
bootBeforePayload.onboarding.required.satisfiedBy.source,
@ -807,7 +804,7 @@ test("logout_provider fails clearly for environment-backed auth that the browser
method: "POST",
body: JSON.stringify({
action: "logout_provider",
providerId: "github-copilot",
providerId: "openai",
}),
}),
);
@ -821,7 +818,7 @@ test("logout_provider fails clearly for environment-backed auth that the browser
assert.equal(logoutPayload.onboarding.locked, false);
assert.equal(
logoutPayload.onboarding.required.satisfiedBy.providerId,
"github-copilot",
"openai",
);
assert.equal(
logoutPayload.onboarding.required.satisfiedBy.source,

View file

@ -341,7 +341,7 @@ test("checkForUpdates handles network timeout gracefully", async (_t) => {
test("checkForUpdates handles missing version field in response", async (_t) => {
const tmp = mkdtempSync(join(tmpdir(), "sf-update-"));
const registry = await startMockRegistry({ name: "sf-run" }); // no version field
const registry = await startMockRegistry({ name: "singularity-forge" }); // no version field
afterEach(async () => {
await registry.close();
rmSync(tmp, { recursive: true, force: true });

View file

@ -69,8 +69,8 @@ test("resolveInstallCommand returns bun command when running under Bun (#4145)",
try {
(process.versions as Record<string, string | undefined>).bun = "1.0.0";
assert.equal(
resolveInstallCommand("sf-run@latest"),
"bun add -g sf-run@latest",
resolveInstallCommand("singularity-forge@latest"),
"bun add -g singularity-forge@latest",
);
} finally {
if (orig === undefined) {
@ -87,8 +87,8 @@ test("resolveInstallCommand returns npm command when not running under Bun (#414
try {
delete (process.versions as Record<string, string | undefined>).bun;
assert.equal(
resolveInstallCommand("sf-run@latest"),
"npm install -g sf-run@latest",
resolveInstallCommand("singularity-forge@latest"),
"npm install -g singularity-forge@latest",
);
} finally {
if (orig !== undefined) {

View file

@ -5,7 +5,7 @@ import chalk from "chalk";
import { appRoot } from "./app-paths.js";
const CACHE_FILE = join(appRoot, ".update-check");
const NPM_PACKAGE_NAME = "sf-run";
export const NPM_PACKAGE_NAME = "singularity-forge";
const CHECK_INTERVAL_MS = 24 * 60 * 60 * 1000; // 24 hours
const FETCH_TIMEOUT_MS = 5000;
const DEFAULT_REGISTRY_URL = `https://registry.npmjs.org/${NPM_PACKAGE_NAME}/latest`;
@ -87,7 +87,7 @@ export function resolveInstallCommand(pkg: string): string {
}
function printUpdateBanner(current: string, latest: string): void {
const installCmd = resolveInstallCommand("sf-run");
const installCmd = resolveInstallCommand(`${NPM_PACKAGE_NAME}@latest`);
process.stderr.write(
` ${chalk.yellow("Update available:")} ${chalk.dim(`v${current}`)} → ${chalk.bold(`v${latest}`)}\n` +
` ${chalk.dim("Run")} ${installCmd} ${chalk.dim("or")} /sf update ${chalk.dim("to upgrade")}\n\n`,

View file

@ -5,7 +5,7 @@ import {
resolveInstallCommand,
} from "./update-check.js";
const NPM_PACKAGE = "sf-run";
const NPM_PACKAGE = "singularity-forge";
export async function runUpdate(): Promise<void> {
const current = process.env.SF_VERSION || "0.0.0";

View file

@ -11,7 +11,7 @@ import {
nativeGetCurrentBranch,
nativeHasChanges,
nativeHasMergeConflicts,
} from "../resources/extensions/sf/native-git-bridge.ts";
} from "../resources/extensions/sf/native-git-bridge.js";
import { resolveBridgeRuntimeConfig } from "./bridge-service.ts";
const MAX_CHANGED_FILES = 25;

View file

@ -174,12 +174,6 @@ const REQUIRED_PROVIDER_CATALOG: RequiredProviderCatalogEntry[] = [
recommended: true,
},
{ id: "openai", label: "OpenAI", supportsApiKey: true, supportsOAuth: false },
{
id: "github-copilot",
label: "GitHub Copilot",
supportsApiKey: false,
supportsOAuth: true,
},
{
id: "openai-codex",
label: "ChatGPT Plus/Pro (Codex Subscription)",
@ -188,17 +182,11 @@ const REQUIRED_PROVIDER_CATALOG: RequiredProviderCatalogEntry[] = [
},
{
id: "google-gemini-cli",
label: "Google Cloud Code Assist (Gemini CLI)",
label: "Google Cloud Code Assist",
supportsApiKey: false,
supportsOAuth: true,
},
{ id: "groq", label: "Groq", supportsApiKey: true, supportsOAuth: false },
{
id: "xai",
label: "xAI (Grok)",
supportsApiKey: true,
supportsOAuth: false,
},
{
id: "openrouter",
label: "OpenRouter",
@ -454,13 +442,6 @@ async function defaultValidateApiKey(
"https://api.groq.com/openai/v1/models",
apiKey,
);
case "xai":
return await validateBearerRequest(
fetchImpl,
providerId,
"https://api.x.ai/v1/models",
apiKey,
);
case "openrouter":
return await validateBearerRequest(
fetchImpl,

View file

@ -1,7 +1,7 @@
import { spawn } from "node:child_process";
import { compareSemver } from "../update-check.ts";
const NPM_PACKAGE_NAME = "sf-run";
const NPM_PACKAGE_NAME = "singularity-forge";
const REGISTRY_URL = `https://registry.npmjs.org/${NPM_PACKAGE_NAME}/latest`;
const FETCH_TIMEOUT_MS = 5000;
const NPM_COMMAND = process.platform === "win32" ? "npm.cmd" : "npm";
@ -67,7 +67,7 @@ export function getUpdateStatus(): UpdateState {
}
/**
* Triggers an async global npm install of sf-run@latest.
* Triggers an async global npm install of singularity-forge@latest.
* Returns `true` if the update was started, `false` if one is already running.
* The child process runs in the background; poll `getUpdateStatus()` for progress.
*/
@ -78,14 +78,18 @@ export function triggerUpdate(targetVersion?: string): boolean {
updateState = { status: "running", targetVersion };
const child = spawn(NPM_COMMAND, ["install", "-g", "sf-run@latest"], {
stdio: ["ignore", "ignore", "pipe"],
// Detach so the child process is not killed if the parent exits
detached: false,
windowsHide: true,
// Avoid shell: true — npm.cmd is directly executable on Windows via spawn.
// Using shell expands the command injection surface unnecessarily.
});
const child = spawn(
NPM_COMMAND,
["install", "-g", `${NPM_PACKAGE_NAME}@latest`],
{
stdio: ["ignore", "ignore", "pipe"],
// Detach so the child process is not killed if the parent exits
detached: false,
windowsHide: true,
// Avoid shell: true — npm.cmd is directly executable on Windows via spawn.
// Using shell expands the command injection surface unnecessarily.
},
);
let stderr = "";

View file

@ -1,7 +1,14 @@
import { execFileSync } from "node:child_process";
import { join } from "node:path";
const binary = process.env.SF_SMOKE_BINARY || "npx";
const args = process.env.SF_SMOKE_BINARY ? ["--help"] : ["sf-run", "--help"];
const defaultBinary = process.execPath;
const defaultArgs = [
join(import.meta.dirname, "..", "..", "dist", "loader.js"),
];
const binary = process.env.SF_SMOKE_BINARY || defaultBinary;
const args = process.env.SF_SMOKE_BINARY
? ["--help"]
: [...defaultArgs, "--help"];
const output = execFileSync(binary, args, {
encoding: "utf8",

View file

@ -3,17 +3,23 @@ import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
// Skip in non-TTY environments (CI containers) — init requires interactive mode
if (!process.stdin.isTTY && process.env.CI) {
console.log(" SKIP test-init (no TTY in CI)");
// Skip in non-TTY environments — init enters the interactive setup flow.
if (!process.stdin.isTTY || !process.stdout.isTTY) {
console.log(" SKIP test-init (no TTY)");
process.exit(0);
}
const tmpDir = mkdtempSync(join(tmpdir(), "sf-smoke-init-"));
try {
const binary = process.env.SF_SMOKE_BINARY || "npx";
const args = process.env.SF_SMOKE_BINARY ? ["init"] : ["sf-run", "init"];
const defaultBinary = process.execPath;
const defaultArgs = [
join(import.meta.dirname, "..", "..", "dist", "loader.js"),
];
const binary = process.env.SF_SMOKE_BINARY || defaultBinary;
const args = process.env.SF_SMOKE_BINARY
? ["init"]
: [...defaultArgs, "init"];
execFileSync(binary, args, {
encoding: "utf8",

View file

@ -1,9 +1,14 @@
import { execFileSync } from "node:child_process";
import { join } from "node:path";
const binary = process.env.SF_SMOKE_BINARY || "npx";
const defaultBinary = process.execPath;
const defaultArgs = [
join(import.meta.dirname, "..", "..", "dist", "loader.js"),
];
const binary = process.env.SF_SMOKE_BINARY || defaultBinary;
const args = process.env.SF_SMOKE_BINARY
? ["--version"]
: ["sf-run", "--version"];
: [...defaultArgs, "--version"];
const output = execFileSync(binary, args, {
encoding: "utf8",

View file

@ -8,6 +8,7 @@
"strict": true,
"declaration": true,
"incremental": true,
"tsBuildInfoFile": "dist/.tsbuildinfo/tsconfig.tsbuildinfo",
"esModuleInterop": true,
"skipLibCheck": true
},