Get Goose
Overview
Goose is an open-source, locally-running AI coding agent developed by Block (formerly Square), the financial technology company founded by Jack Dorsey. Unlike cloud-based AI coding tools, Goose runs on your local machine and can interact directly with your file system, terminal, browser, and developer tools — making it suitable for developers who need an AI agent with broad system access without routing their workflow through a hosted agent platform. (Code is still sent to whichever LLM API you configure, unless you use a local model.)
Goose is designed as a powerful developer terminal companion: it can execute shell commands, read and modify files, browse documentation, run tests, interact with APIs, and chain these capabilities to complete complex multi-step tasks. Its plugin system ("extensions" in Goose terminology) allows it to integrate with additional tools and services.
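In practice, these multi-step tasks are driven from an interactive terminal session. A minimal sketch of the workflow — the exact subcommands and behavior may differ between Goose releases, so check `goose --help` for your version:

```sh
# Start an interactive Goose session in the current project directory
goose session

# Then describe a multi-step task in plain language, e.g.:
#   "Run the test suite, find the failing test, and fix the bug it exposes."
# Goose plans the steps, edits files, and runs shell commands to carry them out.
```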
As of 2026, Goose is one of the most popular open-source AI agent projects, with a growing developer community. Block's backing provides stability and ongoing development, and its local execution model makes it appealing to privacy-conscious developers and organizations that can't send code to external AI services.
Key Features
Local Execution
Runs entirely on your machine. No code or file contents are sent to third-party services other than the LLM API you configure — or to no external service at all, if you run a local model. Ideal for sensitive codebases.
File System Access
Can read, write, and organize files across your project. Understands directory structures and can make coordinated changes across multiple files.
Terminal/Shell Integration
Executes shell commands, runs test suites, manages dependencies, and interacts with command-line tools as part of task completion.
Extensible Plugin System
Extensions add capabilities: browser control, database access, API integrations, Jira/GitHub integration. Community-contributed extensions available.
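Goose extensions are built on the Model Context Protocol (MCP), so MCP servers can be attached as extensions. As an illustrative sketch only — the flag name and the example server (`uvx mcp-server-fetch`, a community MCP server) are assumptions to verify against the Goose docs for your version:

```sh
# Start a session with an extra extension (an MCP server) attached
goose session --with-extension "uvx mcp-server-fetch"
```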
LLM Agnostic
Works with multiple LLM providers (Anthropic Claude, OpenAI GPT, local models via Ollama). You choose the model that fits your needs and budget.
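Provider and model selection typically lives in a small configuration file, which can also be managed interactively via `goose configure`. The keys and values below are illustrative, not authoritative — consult the Goose documentation for the exact schema in your version:

```yaml
# ~/.config/goose/config.yaml (illustrative; exact keys and model names may differ)
GOOSE_PROVIDER: anthropic     # or openai, ollama, ...
GOOSE_MODEL: claude-sonnet    # any model your chosen provider offers
```

Switching providers is a configuration change rather than a tool change, which is what keeps you from being locked into a single vendor.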
Open Source (Apache 2.0)
Fully open source, auditable, and customizable. Self-host, modify, and extend without proprietary restrictions.
Pros & Cons
Advantages
- Open source and fully transparent
- Local execution for privacy
- Works with multiple LLM providers (not locked in)
- Extensible plugin architecture
- Block/Square backing for stability
- Active community development
Disadvantages
- CLI-first interface has steeper learning curve
- Less polished UX than commercial tools
- Requires more configuration than plug-and-play alternatives
- Performance depends on the LLM you connect it to
Pricing
| Plan | Price | Notes |
|---|---|---|
| Open Source | Free | Fully free and open source (Apache 2.0). You pay only for LLM API costs separately. |
Best Use Cases
Goose Excels At:
- Privacy-conscious developers and organizations
- Developers who want full control over their AI tools
- Complex multi-step automation (file changes + terminal + API calls)
- Self-hosted or air-gapped development environments
- Power users comfortable with CLI
May Not Be Ideal For:
- Users wanting a polished out-of-the-box experience
- Non-technical users
- Simple code completion (use Copilot for that)
- Teams needing enterprise support SLAs
How It Compares
Goose vs Devin
Devin is a cloud-based autonomous agent with a browser-based interface. Goose runs locally with more privacy but requires more technical setup. Devin is better for non-technical team members who need a polished interface; Goose is better for engineers who want full control and transparency.
Goose vs Aider
Aider is also an open-source CLI coding agent. Goose has broader system access — it can control your browser, execute arbitrary shell commands, and use extensions beyond just code editing. Aider is more focused on pure code editing workflows and is simpler to configure for that use case.
Final Verdict
Our Recommendation
Goose is the best open-source AI coding agent for developers who want full control, privacy, and extensibility without vendor lock-in. Block's development investment and the active community make it a reliable foundation for building custom AI coding workflows. The local execution model and LLM-agnostic design are genuine differentiators for privacy-conscious teams. While the CLI-first interface requires technical comfort, developers who invest the setup time get an extremely powerful, customizable AI agent that can handle complex development tasks end-to-end.