ADK: Python framework for building AI agents
Modular Python framework for building production AI agents.
ADK is a Python framework designed for building AI agents using code-first principles. It provides modular components for defining agent logic, integrating tools, and orchestrating multi-agent workflows, with support for both Gemini and other language models. The framework includes pre-built tools, custom function integration, OpenAPI specification support, and tool confirmation flows for human-in-the-loop execution. Agents can be deployed to Cloud Run, Vertex AI Agent Engine, or containerized environments, and the framework supports agent-to-agent communication through the A2A protocol.

Code-first architecture
Agent logic, tools, and orchestration are defined directly in Python, enabling version control, testing, and direct code manipulation rather than configuration-based approaches.
Tool ecosystem integration
Supports multiple tool sources including pre-built tools, custom Python functions, OpenAPI specifications, and MCP tools, with tight integration for Google services (a custom-function sketch follows this list).
Multi-agent composition
Enables building scalable systems by composing multiple specialized agents into hierarchies, with support for agent-to-agent communication via the A2A protocol (see the composition sketch after this list).
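As a sketch of the custom-function path: a plain Python function with type hints and a docstring can be passed straight into an agent's tools list, and ADK wraps it as a function-calling tool. The function name, agent name, and instruction below are illustrative, not part of the library.

from google.adk.agents import LlmAgent

def get_package_version(package: str) -> dict:
    """Returns the installed version of a Python package."""
    # Illustrative tool body; any Python code can run here.
    from importlib.metadata import PackageNotFoundError, version
    try:
        return {"status": "ok", "version": version(package)}
    except PackageNotFoundError:
        return {"status": "error", "message": f"{package} is not installed"}

# The docstring and type hints become the tool's description and schema.
env_agent = LlmAgent(
    name="env_inspector",
    model="gemini-2.0-flash-exp",
    instruction="Answer questions about the local Python environment.",
    tools=[get_package_version],
)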
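And a minimal composition sketch within a single process, using the SequentialAgent workflow agent that ADK ships; the agent names and instructions here are illustrative.

from google.adk.agents import LlmAgent, SequentialAgent

researcher = LlmAgent(
    name="researcher",
    model="gemini-2.0-flash-exp",
    instruction="Collect the key facts needed to answer the user's question.",
)
writer = LlmAgent(
    name="writer",
    model="gemini-2.0-flash-exp",
    instruction="Turn the collected facts into a concise answer.",
)

# SequentialAgent runs its sub-agents in order; LlmAgent also accepts a
# sub_agents list directly for LLM-driven delegation within a hierarchy.
pipeline = SequentialAgent(name="research_pipeline", sub_agents=[researcher, writer])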
A basic agent definition takes only a few lines:

from google.adk.agents import LlmAgent

# LlmAgent requires a name; the system prompt is passed as `instruction`.
agent = LlmAgent(
    name="coding_assistant",
    model="gemini-2.0-flash-exp",
    instruction="You are a helpful coding assistant.",
)

# Agents are executed through a Runner (google.adk.runners) or the
# `adk run` / `adk web` CLI, which manage sessions and stream the model's
# responses back as events.

Recent releases

Adds Visual Agent Builder UI for drag-and-drop workflow design, plus MCP prompt support and BigQuery anomaly detection tools.
- Use the new Visual Agent Builder to design agents with a drag-and-drop interface and natural-language assistant.
- Integrate MCP prompts via McpInstructionProvider and leverage ApigeeLlm for Apigee proxy connections.

Adds service registry for custom FastAPI implementations, session rewind capability, and fixes LangChain 1.0.0 import breakage.
- Register custom service implementations via new service registry for FastAPI server integration.
- Fix broken LangChain imports caused by their 1.0.0 release; update dependencies if using LangChain.

Adds invocation pause/resume, LLM context compaction, and citation metadata; no breaking changes noted but adapts to genai SDK 1.41.0 tool naming.
- Enable pause/resume for long-running invocations and configure LlmEventSummarizer to compact context when token limits approach.
- Access citation_metadata in LlmResponse and use ReflectRetryToolPlugin to auto-retry tool errors with corrected arguments.
Related Repositories
Discover similar tools and frameworks used by developers
open_clip
PyTorch library for contrastive language-image pretraining.
gym
Standard API for reinforcement learning environment interfaces.
context7
MCP server delivering version-specific library documentation to LLMs.
crawl4ai
Async browser automation extracting web content for LLMs.
LightRAG
Graph-based retrieval framework for structured RAG reasoning.