Define your entire agent team in a single YAML file. The Composer parses the spec and builds a live agent tree — no Python wiring needed.
pip install langchain-adk[composer]

Quick Start

compose.yaml
defaults:
  model:
    provider: anthropic
    name: claude-opus-4-6

agents:
  assistant:
    type: llm
    instructions: You are a helpful assistant.

main_agent: assistant

Then load and run the agent in Python:
from langchain_adk.composer import Composer

agent = Composer.from_yaml("compose.yaml")

async for event in agent.astream("Hello!"):
    if event.is_final_response():
        print(event.text)

Architecture

The Composer is built around three modular registries under builders/:
composer/
├── __init__.py              # Public API: Composer, register_*
├── composer.py              # Orchestrates YAML → agent tree
├── errors.py                # ComposerError, CircularReferenceError
├── schema.py                # Pydantic models for YAML validation
└── builders/
    ├── agents/              # Agent builder registry
    │   ├── __init__.py      # register(), get(), Helpers
    │   ├── _common.py       # Shared LLM-agent builder logic
    │   ├── llm.py           # LlmAgent builder
    │   ├── react.py         # ReActAgent builder
    │   ├── sequential.py    # SequentialAgent builder
    │   ├── parallel.py      # ParallelAgent builder
    │   ├── loop.py          # LoopAgent builder
    │   └── a2a.py           # A2AAgent builder
    ├── models/              # Model provider registry
    │   └── __init__.py      # register(), create()
    └── tools/               # Tool builtin registry + resolvers
        └── __init__.py      # register_builtin(), resolve_*()
Each registry is independently extensible — register a custom agent type, model provider, or builtin tool without touching any other code.
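The registry pattern behind builders/ can be sketched with a plain dict. The register()/get() names come from the tree above; the body below is an illustrative guess at the mechanism, not the library's actual code.

```python
# Minimal sketch of a builder registry, assuming the register()/get()
# names shown in the directory tree; the real implementation may differ.
from typing import Callable, Dict

_BUILDERS: Dict[str, Callable] = {}

def register(agent_type: str, builder: Callable) -> None:
    """Map an agent `type` string (as used in YAML) to a build function."""
    _BUILDERS[agent_type] = builder

def get(agent_type: str) -> Callable:
    """Look up the builder for a YAML `type`, failing loudly if unknown."""
    try:
        return _BUILDERS[agent_type]
    except KeyError:
        raise ValueError(f"Unknown agent type: {agent_type!r}") from None

# Registering a builder is all it takes to make a new `type` usable from YAML.
register("llm", lambda name, agent_def: f"<LlmAgent {name}>")
```

Because each registry is just an isolated mapping, adding a custom agent type never touches the model or tool registries, which is what makes them independently extensible.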

YAML Schema

defaults

Global settings inherited by all agents.
defaults:
  model:
    provider: anthropic    # openai | anthropic | google | dotted.import.path
    name: claude-opus-4-6
    temperature: 0.7       # optional
  max_iterations: 10

models

Named model configurations. Agents reference them by name instead of repeating inline config.
models:
  fast:
    provider: openai
    name: gpt-4o-mini
    temperature: 0.0
  smart:
    provider: anthropic
    name: claude-opus-4-6
    max_tokens: 8192
    api_key: "sk-ant-..."
  local:
    provider: "myapp.models.OllamaChat"  # dotted import path
    name: llama3
    base_url: "http://localhost:11434"
Any key beyond provider, name, and temperature is forwarded directly to the LangChain model constructor (e.g. max_tokens, api_key, base_url, timeout). Agents reference models by name:
agents:
  researcher:
    type: llm
    model: smart             # reference to models section
  summarizer:
    type: llm
    model: fast              # different model for different agent
Or use inline model: for one-off overrides:
agents:
  bot:
    type: llm
    model:
      provider: openai
      name: gpt-4o
      max_tokens: 4096
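The forwarding rule above (everything beyond the reserved keys goes straight to the model constructor) reduces to a few lines. FakeChatModel and create_model below are stand-ins for illustration, not part of langchain-adk.

```python
# Sketch of extra-kwarg forwarding for model configs. FakeChatModel is a
# stand-in; the real builder instantiates the provider's chat model class.
class FakeChatModel:
    def __init__(self, model: str, **kwargs):
        self.model = model
        self.kwargs = kwargs  # max_tokens, api_key, base_url, timeout, ...

def create_model(config: dict) -> FakeChatModel:
    cfg = dict(config)
    cfg.pop("provider")        # selects the class; not forwarded
    name = cfg.pop("name")     # mapped to the constructor's model argument
    return FakeChatModel(model=name, **cfg)  # everything else passes through

model = create_model({"provider": "openai", "name": "gpt-4o-mini",
                      "temperature": 0.0, "max_tokens": 2048})
```

This is why the models section can carry provider-specific keys without the schema having to enumerate them.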

tools

Named tool definitions referenced by agents.
tools:
  # Python function by import path
  search:
    function: "myapp.tools.search_web"

  # MCP server
  weather:
    mcp:
      url: "http://localhost:8001/mcp"

  # Built-in tool
  exit:
    builtin: exit_loop

  # Agent as tool (AgentTool)
  researcher:
    agent: ResearchAgent
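The four tool shapes above are distinguished by which key the definition carries. A hedged sketch of that dispatch (the actual resolve_*() functions live in builders/tools/, and their names here are assumptions):

```python
def classify_tool(tool_def: dict) -> str:
    """Sketch: dispatch a YAML tool definition on its discriminating key."""
    if "function" in tool_def:
        return "python-function"   # dotted import path to a callable
    if "mcp" in tool_def:
        return "mcp-server"        # tools discovered from an MCP endpoint
    if "builtin" in tool_def:
        return "builtin"           # looked up in the builtin tool registry
    if "agent" in tool_def:
        return "agent-tool"        # wraps another agent as a callable tool
    raise ValueError(f"Unrecognized tool definition: {tool_def!r}")
```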

skills

Named skill definitions, referenced by agents. Each agent that lists skills gets its own in-memory skill store with list_skills and load_skill tools. Skills can be defined inline or loaded from a FastMCP server.
skills:
  # Inline skill (content defined directly)
  summarize:
    name: summarize
    description: "Summarize text into bullet points."
    content: "Extract 3-5 key points. Be concise."

  # Remote skill (loaded from FastMCP server at build time)
  pdf_processing:
    name: pdf-processing
    description: "Process and extract data from PDFs."
    mcp:
      url: "http://localhost:8001/mcp"

  # Remote skill (in-memory FastMCP server)
  coding_standards:
    name: coding-standards
    description: "Team coding standards."
    mcp:
      server: "myapp.skills.server"  # dotted import path
Agents reference skills by name:
agents:
  triage:
    type: llm
    instructions: "Route to the right specialist."
    skills:
      - summarize
      - pdf_processing
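The per-agent skill store described above, with its list_skills and load_skill tools, might look roughly like this. The class and method bodies are illustrative; only the two tool names come from the text.

```python
class SkillStore:
    """Illustrative in-memory skill store; the real one also loads
    skills from FastMCP servers at build time."""
    def __init__(self, skills: dict):
        self._skills = skills  # name -> {"description": ..., "content": ...}

    def list_skills(self) -> list:
        """Tool surface: names and descriptions, so the agent can browse."""
        return [{"name": n, "description": s["description"]}
                for n, s in self._skills.items()]

    def load_skill(self, name: str) -> str:
        """Tool surface: pull the full skill content on demand."""
        return self._skills[name]["content"]

store = SkillStore({"summarize": {
    "description": "Summarize text into bullet points.",
    "content": "Extract 3-5 key points. Be concise."}})
```

The two-step browse-then-load shape keeps large skill bodies out of the prompt until the agent actually asks for them.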

agents

Flat dict of agent definitions. Agents reference each other by name.
agents:
  MyAgent:
    type: llm              # llm | react | sequential | parallel | loop | a2a
    description: "..."
    instructions: |
      System prompt goes here.
    model: smart            # reference to models section, or inline:
    # model:
    #   provider: anthropic
    #   name: claude-sonnet-4-6
    #   max_tokens: 4096
    tools:
      - search             # named reference
      - builtin: exit_loop # inline definition
      - agent: OtherAgent  # inline AgentTool
      - transfer:          # transfer routing
          targets: [A, B]
    skills:
      - summarize          # named reference to skills section
    planner:
      type: task            # plan_react | task
      tasks:
        - title: "Research"
    max_iterations: 10

Agent Types

| Type | Description | Required fields |
|------|-------------|-----------------|
| llm | LLM-powered agent with tool loop | instructions |
| react | Structured Reason+Act agent | instructions (optional) |
| sequential | Run sub-agents in order | agents |
| parallel | Run sub-agents concurrently | agents |
| loop | Repeat sub-agents until done | agents, max_iterations |
| a2a | Remote agent via A2A protocol | url |
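The required-fields column reduces to a small lookup table. The schema module presumably enforces this via the Pydantic models in schema.py; the sketch below shows only the check itself.

```python
# Sketch of per-type required-field validation, mirroring the table above.
REQUIRED = {
    "llm": ["instructions"],
    "react": [],                      # instructions is optional here
    "sequential": ["agents"],
    "parallel": ["agents"],
    "loop": ["agents", "max_iterations"],
    "a2a": ["url"],
}

def validate_agent(name: str, agent_def: dict) -> None:
    missing = [f for f in REQUIRED[agent_def["type"]] if f not in agent_def]
    if missing:
        raise ValueError(f"Agent {name!r} is missing: {', '.join(missing)}")
```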

main_agent

The entry-point agent name.
main_agent: MyAgent

runner

Optional. Enables Composer.runner_from_yaml().
runner:
  app_name: my-app
  session_service: memory  # or dotted.import.path

server

Optional. Enables Composer.server_from_yaml() for A2A.
server:
  app_name: my-app
  version: "1.0.0"
  url: "http://localhost:8000"
  skills:
    - id: general
      name: General
      description: "General assistant."

Examples

Transfer Routing

agents:
  sales:
    type: llm
    description: "Order inquiries."
    tools: [lookup_order]
  support:
    type: llm
    description: "Technical help."
    tools: [search_docs]
  triage:
    type: llm
    instructions: "Route to the right specialist."
    tools:
      - transfer:
          targets: [sales, support]

main_agent: triage

Sequential Pipeline

agents:
  researcher:
    type: llm
    instructions: "Research the topic."
    tools: [search]
  writer:
    type: llm
    instructions: "Write an article from research."
  pipeline:
    type: sequential
    agents: [researcher, writer]

main_agent: pipeline

Loop with Exit

agents:
  writer:
    type: llm
    instructions: "Write a draft."
  reviewer:
    type: llm
    instructions: "Review. Call exit_loop if approved."
    tools:
      - builtin: exit_loop
  loop:
    type: loop
    agents: [writer, reviewer]
    max_iterations: 5

main_agent: loop
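The loop semantics in this example (run writer then reviewer each round, stop early when exit_loop is called, otherwise stop after max_iterations rounds) can be sketched as plain control flow. Each sub-agent is modeled here as a callable returning an output and an exit flag; the real agents stream events and invoke the exit_loop tool instead.

```python
def run_loop(sub_agents, max_iterations: int):
    """Sketch of LoopAgent control flow over callable stand-ins."""
    history = []
    for _ in range(max_iterations):
        for agent in sub_agents:
            output, exit_requested = agent(history)
            history.append(output)
            if exit_requested:       # exit_loop was called: stop immediately
                return history
    return history                   # hit the iteration cap

# Toy stand-ins: the reviewer "approves" on the second round.
writer = lambda h: (f"draft v{len(h) // 2 + 1}", False)
reviewer = lambda h: ("approved", len(h) >= 3)
```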

Extending with Registries

Custom Agent Types

from langchain_adk.composer import register_builder


async def build_custom(name, agent_def, spec, *, helpers):
    model_cfg = helpers.resolve_model(agent_def)
    tools = await helpers.resolve_tools(agent_def)
    return MyCustomAgent(name=name, tools=tools)

register_builder("custom", build_custom)
Then in YAML:
agents:
  my_agent:
    type: custom
The build function receives (name, agent_def, spec, *, helpers) where helpers provides:
  • helpers.resolve_model(agent_def) — merge agent/default model config
  • helpers.resolve_tools(agent_def) — resolve all tool references
  • helpers.build_agent(name) — recursively build a sub-agent by name
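Because agents reference each other by name, helpers.build_agent(name) has to guard against reference cycles; errors.py above exposes CircularReferenceError for exactly this. A sketch of that recursion, with the exception class redefined locally as a stand-in:

```python
class CircularReferenceError(Exception):
    """Stand-in for langchain_adk.composer.errors.CircularReferenceError."""

def build_agent(name, agents, _building=None):
    """Sketch: recursively build sub-agents, rejecting reference cycles."""
    building = _building or set()
    if name in building:
        raise CircularReferenceError(f"Cycle through agent {name!r}")
    building = building | {name}     # track the path, not all visited nodes
    agent_def = agents[name]
    children = [build_agent(child, agents, building)
                for child in agent_def.get("agents", [])]
    return (name, children)          # stand-in for a real agent instance

agents = {"pipeline": {"type": "sequential", "agents": ["a", "b"]},
          "a": {"type": "llm"}, "b": {"type": "llm"}}
```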

Custom Model Providers

from langchain_adk.composer import register_provider

register_provider("my_llm", MyCustomChatModel)
Built-in providers: openai, anthropic, google. Any unrecognized provider string is treated as a dotted import path to a custom BaseChatModel class.
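The fallback for unrecognized provider strings is the standard importlib dotted-path pattern. A sketch, with the provider registry as a plain dict and a stdlib class resolved purely for demonstration:

```python
import importlib

def resolve_provider(provider: str, registry: dict):
    """Sketch: named providers win; anything else is treated as a dotted
    import path like "myapp.models.OllamaChat"."""
    if provider in registry:
        return registry[provider]
    module_path, _, class_name = provider.rpartition(".")
    if not module_path:
        raise ValueError(f"Unknown provider: {provider!r}")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# Resolving a stdlib class by dotted path, just to show the mechanism:
cls = resolve_provider("collections.OrderedDict", {})
```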

Custom Builtin Tools

from langchain_adk.composer import register_builtin_tool

register_builtin_tool("my_tool", lambda: my_tool_instance)
Then reference it in YAML:
tools:
  my_tool:
    builtin: my_tool

Python API

| Method | Returns | Description |
|--------|---------|-------------|
| Composer.from_yaml(path) | BaseAgent | Build root agent (sync) |
| Composer.from_yaml_async(path) | BaseAgent | Build root agent (async) |
| Composer.runner_from_yaml(path) | Runner | Build Runner with sessions |
| Composer.runner_from_yaml_async(path) | Runner | Build Runner with sessions (async) |
| Composer.server_from_yaml(path) | FastAPI | Build A2A server app |
| Composer.server_from_yaml_async(path) | FastAPI | Build A2A server app (async) |
| Registry function | Description |
|-------------------|-------------|
| register_builder(type, fn) | Add a custom agent type builder |
| register_provider(name, cls) | Add a custom model provider |
| register_builtin_tool(name, factory) | Add a custom builtin tool |