
goose: Local AI agent for engineering task automation

LLM-powered agent automating local software engineering workflows.

LIVE RANKINGS (06:52 AM, steady)

OVERALL RANK: #9
AI & ML RANK: #3
STARS: 25.7K
FORKS: 2.3K
DOWNLOADS: 62
7D STARS: +123
7D FORKS: +16

Learn more about goose

Goose is an LLM-powered automation agent designed to execute multi-step software engineering workflows on local development environments. The system operates through a session-based architecture where users configure provider and model preferences via profile objects, then submit natural language prompts that the agent interprets and executes as concrete development tasks. It supports sequential workflow orchestration, allowing complex operations like code analysis, refactoring, and test generation to be chained together while maintaining context across steps. The agent can interact with the local filesystem, read and modify source code, generate artifacts, and produce structured reports based on completed work. This approach enables automated execution of routine engineering tasks while keeping all operations within the developer's local environment rather than requiring cloud-based execution.
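The session pattern described above can be sketched in plain Python. The `Session` class below is a hypothetical stand-in, not goose's actual implementation; it only illustrates how a session carries context across chained steps of a workflow.

```python
# Hypothetical sketch of the session pattern: NOT goose's real internals.
# A session accumulates history so each step can build on earlier ones.

class Session:
    def __init__(self, provider, model):
        self.provider = provider
        self.model = model
        self.history = []  # context carried across workflow steps

    def run(self, prompt):
        # A real agent would call the configured LLM and act on the local
        # filesystem here; this stub just records the step in the context.
        self.history.append(prompt)
        return f"[{self.model}] completed step {len(self.history)}: {prompt}"

session = Session(provider="openai", model="gpt-4o")
steps = [
    "Analyze the error handling in src/utils.py",
    "Refactor the weakest handler found in the previous step",
    "Generate unit tests for the refactored code",
]
results = [session.run(step) for step in steps]
```

Because the session object persists between calls, step two can refer to "the previous step" and still be interpreted against the accumulated context.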

Key features of goose:

1. Multi-LLM support: Accepts any LLM as a backend and supports multi-model configuration, so users can route different tasks to different models based on requirements or cost.
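goose's actual multi-model configuration format is not shown on this page; the sketch below illustrates the general idea of routing tasks to different models by kind or cost. The routing table, provider names, and model names are all assumptions for illustration.

```python
# Illustrative task-to-model router. The routing table is an assumption,
# not goose's actual configuration format.
ROUTES = {
    "chat":     {"provider": "openai",    "model": "gpt-4o-mini"},   # cheap
    "code":     {"provider": "anthropic", "model": "claude-sonnet"},
    "planning": {"provider": "openai",    "model": "gpt-4o"},        # capable
}

def route(task_kind):
    """Pick a provider/model pair for a task, falling back to cheap chat."""
    return ROUTES.get(task_kind, ROUTES["chat"])

print(route("code"))     # -> {'provider': 'anthropic', 'model': 'claude-sonnet'}
print(route("unknown"))  # unrecognized tasks fall back to the cheap model
```

Routing by task kind is what makes cost-based tiering practical: expensive models handle planning, while cheap models absorb high-volume chat traffic.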

2. MCP server integration: Integrates with Model Context Protocol (MCP) servers, enabling the agent to connect to external tools and services beyond its core capabilities.
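MCP is built on JSON-RPC 2.0; the sketch below constructs a `tools/call` request by hand to show the shape of the messages an MCP client sends to a server. The tool name and arguments are made up for illustration and do not refer to any specific server.

```python
# Build a JSON-RPC 2.0 request for MCP's tools/call method.
# The tool ("read_file") and its arguments are hypothetical.
import json
from itertools import count

_ids = count(1)  # JSON-RPC requests need unique ids

def mcp_tool_call(tool_name, arguments):
    """Return a JSON-RPC 2.0 request dict for MCP's tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

req = mcp_tool_call("read_file", {"path": "src/utils.py"})
print(json.dumps(req, indent=2))
```

Any tool an MCP server advertises via `tools/list` can be invoked with the same envelope, which is what lets an agent like goose pick up new capabilities without code changes.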

3. Dual interface options: Available as both a desktop application and a command-line tool, giving developers flexibility in how they interact with the agent.


# Configure a session with a provider and model, then submit a task.
from goose import Session

session = Session(
    provider="openai",
    model="gpt-4o",
)

# The agent interprets the prompt and executes it against the local codebase.
result = session.run("Analyze the error handling in src/utils.py")
print(result.output)

v1.14.0

Adds Mistral AI and newer GitHub Copilot models; fixes runaway subagent recursion and AWS Bedrock credential refresh.

  • Configure Mistral AI provider or upgrade to newer Copilot model versions for expanded LLM options.
  • Update auth configs to use environment variable substitution; verify AWS Bedrock credentials rotate properly.
v1.13.2

Release notes do not specify changes, breaking updates, or fixes for this version.

  • Review commit history or changelog separately to identify actual changes before upgrading.
  • Test thoroughly in non-production environments due to lack of documented release information.
v1.13.1

Release notes do not specify changes, breaking updates, or fixes for this version.

  • Review the commit history or changelog directly to identify actual changes before upgrading.
  • No migration steps, requirements, or security fixes are documented for this release.
