Continue: Open-source AI coding agent CLI
Multi-LLM coding agent with interactive and automated modes.
Continue is a CLI-based coding agent that runs in several modes: a TUI (terminal user interface) for interactive use, a headless mode for background automation, and IDE extensions for VS Code and JetBrains. The tool connects to LLM providers including Claude, GPT, Gemini, and Qwen to execute coding tasks and workflows. It supports event-driven automation through PR triggers, scheduled execution, and custom event handlers. The architecture lets agents execute workflows step by step, with optional human approval at decision points.
Multi-Mode Execution
Run the same agent logic in the TUI for interactive workflows, headless for CI/CD automation, or as IDE plugins for VS Code and JetBrains. No code changes are required when switching modes; agents adapt to the execution context automatically.
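The mode-switching idea can be sketched as one agent routine whose approval behavior depends on the context it runs in. This is an illustrative sketch, not Continue's real API: the `Mode` type, `runAgent` function, and `approve` callback are all assumptions made up for the example.

```typescript
// Hypothetical sketch: the same step list runs in either mode;
// only the approval source changes with the execution context.
type Mode = "tui" | "headless";

interface StepResult {
  step: string;
  approved: boolean;
}

// In TUI mode a human approves each step via the callback;
// headless mode auto-approves so CI/CD runs unattended.
function runAgent(
  steps: string[],
  mode: Mode,
  approve: (step: string) => boolean
): StepResult[] {
  return steps.map((step) => ({
    step,
    approved: mode === "headless" ? true : approve(step),
  }));
}

// Same agent logic, two contexts: only the mode argument differs.
const headlessRun = runAgent(["edit src/utils.ts", "run tests"], "headless", () => false);
const tuiRun = runAgent(["edit src/utils.ts"], "tui", (s) => s.startsWith("edit"));
```

The point of the design is that the step logic never branches on mode internally; the context is injected once at the call site.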
Event-Driven Automation
Workflows trigger on PR events, scheduled intervals, or custom event sources with configurable approval gates. Agents execute autonomously for trusted operations or require step-by-step human approval for sensitive changes.
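A configurable approval gate can be sketched as a whitelist of trusted event kinds: trusted events run autonomously, everything else waits for a human. The event names, `GateConfig` shape, and `shouldAutoRun` helper below are illustrative assumptions, not Continue's actual configuration format.

```typescript
// Hypothetical approval-gate sketch: trusted event kinds bypass
// human review; all others require step-by-step approval.
type EventKind = "pr.opened" | "schedule.tick" | "deploy.requested";

interface GateConfig {
  trusted: Set<EventKind>;
}

// Returns true when the event may execute without a human in the loop.
function shouldAutoRun(kind: EventKind, config: GateConfig): boolean {
  return config.trusted.has(kind);
}

// Scheduled runs are trusted; deploys are gated behind approval.
const gate: GateConfig = { trusted: new Set<EventKind>(["schedule.tick"]) };
const scheduled = shouldAutoRun("schedule.tick", gate);
const deploy = shouldAutoRun("deploy.requested", gate);
```

Keeping the gate as data rather than code means the trust boundary can be changed per repository without touching agent logic.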
Multi-Provider LLM Support
Connects to multiple LLM providers including OpenAI, Anthropic, local models, and custom endpoints through a unified interface. Switch between providers without changing code, optimizing for cost, latency, or capability requirements.
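A unified provider interface means call sites depend on one abstraction and providers are swapped by substituting one object. The `LLMProvider` interface and the stand-in providers below are hypothetical, written only to show the shape of the pattern; they do not reflect Continue's internal abstraction or call any real provider API.

```typescript
// Hypothetical sketch of a unified LLM provider interface.
interface LLMProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Stand-in providers: both satisfy the same interface, so switching
// between them (e.g. for cost or latency) changes no call-site code.
const remoteProvider: LLMProvider = {
  name: "remote",
  complete: async (prompt) => `[remote] ${prompt}`,
};

const localProvider: LLMProvider = {
  name: "local",
  complete: async (prompt) => `[local] ${prompt}`,
};

// Agent code depends only on the interface, never a concrete provider.
async function runTask(provider: LLMProvider, prompt: string): Promise<string> {
  return provider.complete(prompt);
}
```

The same `runTask` works against either provider; choosing one becomes a configuration decision rather than a code change.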
import { ContinueClient } from '@continuedev/cli';

// Create a client and define a task with a prompt, guardrail rules,
// and the tools the agent is allowed to use.
const client = new ContinueClient();
const task = await client.createTask({
  prompt: 'Fix all TypeScript errors in src/utils',
  rules: ['Follow existing code style', 'Add unit tests'],
  tools: ['read_file', 'write_file', 'run_terminal']
});

// Run the task to completion.
await task.execute();

Related Repositories
Discover similar tools and frameworks used by developers
Awesome Nano Banana
Curated collection of images and prompts from Google's Gemini-2.5-Flash-Image model with model comparisons.
LeRobot
PyTorch library for robot imitation learning and sim-to-real transfer.
OpenVINO
Convert and deploy deep learning models across Intel hardware.
Paperless-ngx
Self-hosted OCR document archive with ML classification.
KoboldCpp
Self-contained llama.cpp distribution with KoboldAI API for running LLMs on consumer hardware.