The coding agent for your coding agent.
Save your good tokens for the fancy parts; use Dash for the implementation.
npx @jkershaw/dash@latest help
You give Dash a task, a repo, and a test command. It does the rest.
Dash orchestrates your inference. Choose the approach that fits your workflow:
Connect your own OpenRouter API key. Pay model costs directly — typically fractions of a dollar per task. No markup, no subscription, no rate limits.
Point Dash at LM Studio running on your localhost. No API key, no account required.
Dash provides the inference with optimized model defaults. No key management, no model selection — just run tasks.
Run your first task from inside any git repository:
npx @jkershaw/dash@latest "Add input validation to the signup form" -y
That's it. Dash auto-detects your repo and test command, connects to the server, and authenticates automatically. No API key setup required to get started.
By default, tasks use a shared anonymous session. To link your own OpenRouter API key — removing rate limits and giving you full model choice:
npx @jkershaw/dash@latest login
This opens your browser to authenticate with OpenRouter via OAuth. Your API key is exchanged directly with OpenRouter and stored in your server session — Dash never sees it in plaintext. Credentials are saved locally at ~/.config/dash-build/credentials.json and reused automatically on future runs.
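Since saved credentials are reused automatically, a quick way to tell how your next run will authenticate is to check for that file. The path is the one documented above; the surrounding check is just an illustrative sketch, not part of Dash:

```shell
# Check for saved Dash credentials from a previous `dash login`.
# The path comes from the docs; the branching here is illustrative.
CRED_FILE="$HOME/.config/dash-build/credentials.json"
if [ -f "$CRED_FILE" ]; then
  echo "credentials found; runs will reuse your linked OpenRouter key"
else
  echo "no credentials; runs will use the shared anonymous session"
fi
```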
For fully local inference, start LM Studio and run the Dash Build server yourself. No API key, no account, no network required. See the GitHub repo for setup instructions.
Dash is designed to be invoked by AI orchestrators — hand off implementation tasks and stay focused on the bigger picture. Run the built-in guide to get integration instructions:
npx @jkershaw/dash@latest guide
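For orchestrators that shell out, a hand-off can be as small as a wrapper around the CLI. Only the `-y` and `-v` flags are documented here; the wrapper function itself is a hypothetical sketch, not something Dash ships:

```shell
# Hypothetical orchestrator wrapper: hand one implementation task to
# Dash and report the outcome. -y auto-approves, -v logs verbosely.
run_dash_task() {
  task="$1"
  if npx @jkershaw/dash@latest "$task" -y -v; then
    echo "dash completed: $task"
  else
    echo "dash failed: $task" >&2
    return 1
  fi
}
```

An agent can then call `run_dash_task "Add input validation to the signup form"` and branch on the exit status.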
Use @latest to ensure you have the current version. Without it, npx may serve a cached older copy.
Pass the -y (auto-approve) and -v (verbose) flags when calling from an agent.