Codebuff is an open-source AI coding assistant.
- The CLI runs in your terminal.
- A local web server handles authentication, billing, and agent publishing.
This repository is a Bun monorepo. The instructions below are for running the full stack locally.
- Bun (see the `engines` field in `package.json`)
- Docker (for the local Postgres database)
- A GitHub account (for local login)

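A quick sanity check before continuing (the required Bun version is whatever `engines` in `package.json` specifies; this sketch only confirms the tools are installed and the Docker daemon is reachable):

```bash
# Confirm Bun is on your PATH and Docker is running
bun --version
docker info --format '{{.ServerVersion}}'
```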
Copy the example environment file:

```bash
cp .env.example .env.local
```

Pick a port and keep these values consistent:

```
PORT=3101
NEXT_PUBLIC_WEB_PORT=3101
NEXT_PUBLIC_CODEBUFF_APP_URL=http://localhost:3101
NEXTAUTH_URL=http://localhost:3101
```

Go to GitHub: Settings → Developer settings → OAuth Apps → New OAuth App.
Use these values (replace the port if you chose a different one):

- Application name: `codebuff-local`
- Homepage URL: `http://localhost:3101`
- Application description: `codebuff-local`
- Authorization callback URL: `http://localhost:3101/api/auth/callback/github`
- Enable Device Flow: enabled
Copy the Client ID and generate a Client secret, then set:

```
CODEBUFF_GITHUB_ID=<your-client-id>
CODEBUFF_GITHUB_SECRET=<your-client-secret>
```

Install dependencies:

```bash
bun install
```

Option A (recommended): start services and the CLI together.
```bash
bun run dev
```

Option B: background services + separate CLI.
```bash
bun run up
bun run start-cli
```

Stop background services:

```bash
bun run down
```

Open http://localhost:<port>/login and sign in with GitHub.
The CLI will also prompt you with a login URL when it needs authentication.
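
If the login page does not load, it is worth confirming the web server is actually listening on the port you chose (3101 in the examples above; this is just a quick check, not part of the official flow):

```bash
# Should print an HTTP status line such as "HTTP/1.1 200 OK" or a redirect
curl -sI http://localhost:3101/login | head -n 1
```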
Local runs still use billing, so your user needs credits.
Start Drizzle Studio:

```bash
bun run start-studio
```

Then open https://local.drizzle.studio/ and edit the `credit_ledger` table for your user. Set a large principal and balance on an active (non-expired) row.

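If you prefer to skip the UI, the same edit can be made with `psql`. This is only a sketch: the column names (`principal`, `balance`, `user_id` are assumed here) should be checked against the `credit_ledger` schema in Drizzle Studio first, and `$DATABASE_URL` is a stand-in for whatever Postgres connection string your local setup uses (check `.env.local`).

```bash
# Hypothetical direct top-up; verify column names and your user id before running
psql "$DATABASE_URL" -c \
  "UPDATE credit_ledger SET principal = 1000000, balance = 1000000 WHERE user_id = '<your-user-id>';"
```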
Run an OpenAI-compatible server locally.
Codebuff expects an OpenAI-compatible API with a /v1 base URL and a chat completions endpoint:

```
POST /v1/chat/completions
```

Example base URL:

```
http://localhost:8317/v1
```
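Any server that implements this protocol should work. As one example (not the only option), llama.cpp's `llama-server` exposes an OpenAI-compatible API; the model path below is a placeholder for a model you have downloaded:

```bash
# Sketch: serve a local GGUF model on the port used in the examples above
llama-server -m ./models/your-model.gguf --port 8317
```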
Verify your server works (replace the model id with one your server supports):

```bash
curl -s http://localhost:8317/v1/chat/completions \
  -H 'Authorization: Bearer factory-api-key' \
  -H 'Content-Type: application/json' \
  -d '{"model":"gpt-5.1-codex-max","messages":[{"role":"user","content":"hi"}]}'
```

To force Codebuff to use your local model for all agent runs, add this to `.env.local`:
```
OPENAI_BASE_URL=http://localhost:8317/v1
OPENAI_API_KEY=factory-api-key
CODEBUFF_MODEL_OVERRIDE=gpt-5.1-codex-max
CODEBUFF_PROVIDER_OVERRIDE=openai
```

There are two common ways to add or switch models.
A) Global override (recommended for local model servers)
- Set `CODEBUFF_MODEL_OVERRIDE` to the model id your server supports.
- Restart the stack after changing env.
Examples:

```
# If your server expects a bare model id
CODEBUFF_MODEL_OVERRIDE=gpt-5.1-codex-max
CODEBUFF_PROVIDER_OVERRIDE=openai

# Or specify the full provider/model string
CODEBUFF_MODEL_OVERRIDE=openai/gpt-5.1-codex-max
```

Notes:
- You do not need to “register” the model anywhere in Codebuff as long as the upstream provider accepts the `model` string.
- If you set the override to an `anthropic/*`, `google/*`, etc. model, the request will go to OpenRouter and you must set a valid `OPEN_ROUTER_API_KEY`.
B) Per-agent defaults (edit agent templates in this repo)
Each agent definition has a `model` field. To change what an agent uses by default:

- Base agents: `.agents/base/*` (example: `.agents/base/base.ts`)
- Other agents: `.agents/<agent-name>/*`

If you are using a local OpenAI-compatible server, set the model to `openai/<model-id>`.

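To see what each agent currently defaults to, a quick search over the templates is enough (assuming the field is written as a literal `model:` key):

```bash
# List the default model declared in each agent template
grep -rn "model:" .agents
```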
Restart after changing env:

```bash
bun run down
bun run up
```

## Troubleshooting
- If you see `No cookie auth credentials found`, that is an OpenRouter 401. It means the request is still being routed to OpenRouter (for example, the model is `anthropic/*`). Fix it by either setting a real `OPEN_ROUTER_API_KEY` or forcing the local model override above.
When running the CLI outside this repo, agent templates must be present in the database.
- Create a publisher profile at http://localhost:<port>/publishers. Use publisher id `codebuff`.
- Publish agents (repeat and add more agent ids if you get an "Invalid agent ID" error):

  ```bash
  bun run start-cli -- publish base context-pruner file-explorer file-picker researcher thinker reviewer
  ```

- Run the CLI against any directory:

  ```bash
  bun run start-cli -- --cwd /path/to/other/repo
  ```