Overview
Agent frameworks enable autonomous AI systems that can use tools, make decisions, collaborate with other agents, and complete multi-step tasks. ModelRiver provides the LLM backbone for all of them, with automatic failover ensuring your agents never stall during a provider outage.
Why use ModelRiver with agent frameworks?
- Agent resilience: Provider failover keeps agents running even during outages
- Per-agent cost tracking: Use different workflows per agent to track costs separately
- Tool-call reliability: Function calling works across all providers
- Full observability: See every agent step in your Request Logs
Supported frameworks
| Framework | Language | Use Case | Difficulty | Guide |
|---|---|---|---|---|
| AutoGen | Python | Multi-agent conversations, group chat | ⭐⭐ Medium | View guide → |
| CrewAI | Python | Role-based agent crews, task delegation | ⭐⭐ Medium | View guide → |
| OpenAI Agents SDK | Python | Tools, handoffs, guardrails | ⭐⭐ Medium | View guide → |
| LangGraph | Python | Graph-based agents, state machines | ⭐⭐ Medium | View guide → |
AutoGen
Microsoft's framework for multi-agent conversations. Agents can chat with each other, use tools, and collaborate on complex tasks.
```python
from autogen import OpenAIWrapper

config_list = [{
    "base_url": "https://api.modelriver.com/v1",
    "api_key": "mr_live_YOUR_API_KEY",
    "model": "my-workflow",
}]
```
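With that `config_list` in place, the rest is standard AutoGen. The sketch below assumes the classic two-agent quickstart pattern (`AssistantAgent` plus `UserProxyAgent` from the v0.2-style API); the task message is illustrative.

```python
from autogen import AssistantAgent, UserProxyAgent

# Reuse the config_list from above so every completion routes through ModelRiver.
assistant = AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)

# A driver agent that relays the task; code execution disabled for a pure chat demo,
# and auto-replies capped so the example stays bounded.
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
    code_execution_config=False,
)

# Kick off a two-agent conversation; if the primary provider is down,
# ModelRiver's failover keeps the chat going.
user_proxy.initiate_chat(
    assistant,
    message="Summarize the three biggest risks in our Q3 launch plan.",
)
```

Every turn the agents exchange goes through your ModelRiver workflow, so each step shows up in your Request Logs.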
CrewAI
Build role-based agent crews where each agent has a specific role, goal, and backstory. Assign different ModelRiver workflows per agent for cost separation.
```python
from crewai import Agent, LLM

llm = LLM(
    model="openai/my-workflow",
    base_url="https://api.modelriver.com/v1",
    api_key="mr_live_YOUR_API_KEY",
)
```
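To get the per-agent cost separation mentioned above, give each agent its own `LLM` pointing at its own workflow. A minimal sketch, assuming hypothetical workflow names `research-workflow` and `writing-workflow` and CrewAI's standard `Agent`/`Task`/`Crew` objects:

```python
from crewai import Agent, Task, Crew, LLM

# One ModelRiver workflow per agent, so each agent's spend is tracked separately.
# The workflow names below are placeholders -- use your own.
researcher_llm = LLM(
    model="openai/research-workflow",
    base_url="https://api.modelriver.com/v1",
    api_key="mr_live_YOUR_API_KEY",
)
writer_llm = LLM(
    model="openai/writing-workflow",
    base_url="https://api.modelriver.com/v1",
    api_key="mr_live_YOUR_API_KEY",
)

researcher = Agent(
    role="Researcher",
    goal="Collect facts about the topic",
    backstory="A meticulous analyst.",
    llm=researcher_llm,
)
writer = Agent(
    role="Writer",
    goal="Turn research notes into a short post",
    backstory="A concise technical writer.",
    llm=writer_llm,
)

research = Task(
    description="Research on-call best practices.",
    expected_output="A bullet list of findings.",
    agent=researcher,
)
draft = Task(
    description="Write a 200-word post from the research notes.",
    expected_output="A short post.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research, draft])
print(crew.kickoff())
```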
OpenAI Agents SDK
OpenAI's official SDK for building agents with tools, handoffs, and guardrails. Point it at ModelRiver to add failover and multi-provider support.
```python
from agents import Agent, Runner, set_default_openai_client
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key="mr_live_YOUR_API_KEY",
)
set_default_openai_client(client)
```
Full OpenAI Agents SDK guide →
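Once the default client points at ModelRiver, the usual `Agent`/`Runner` flow applies. A rough sketch with a toy tool is below; the `get_weather` tool and prompt are made up, and depending on which OpenAI-compatible API your workflow speaks, you may also need the SDK's `set_default_openai_api("chat_completions")` switch.

```python
from agents import Agent, Runner, function_tool

# A made-up tool to show function calling routed through ModelRiver.
@function_tool
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"It is sunny in {city}."

agent = Agent(
    name="Assistant",
    instructions="Answer briefly. Use tools when they help.",
    tools=[get_weather],
    model="my-workflow",  # ModelRiver workflow name, as in the config above
)

result = Runner.run_sync(agent, "What's the weather in Tokyo?")
print(result.final_output)
```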
LangGraph
Build graph-based agent workflows with branching, cycles, and persistent state. Combine with ModelRiver for resilient agent graphs.
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key="mr_live_YOUR_API_KEY",
    model="my-workflow",
)
```
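A minimal sketch of wiring that `llm` into a one-node graph, assuming a recent LangGraph with the prebuilt `MessagesState`; the node name and prompt are illustrative.

```python
from langchain_core.messages import HumanMessage
from langgraph.graph import StateGraph, START, END, MessagesState

# A single node that calls the ModelRiver-backed llm defined above.
def call_model(state: MessagesState):
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

builder = StateGraph(MessagesState)
builder.add_node("model", call_model)
builder.add_edge(START, "model")
builder.add_edge("model", END)
graph = builder.compile()

result = graph.invoke(
    {"messages": [HumanMessage("Outline a retry strategy for flaky APIs.")]}
)
print(result["messages"][-1].content)
```

Because the node uses the ChatOpenAI client above, every step of the graph inherits ModelRiver's failover and shows up in your Request Logs.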
Next steps
- LLM Frameworks: LangChain, LlamaIndex, Haystack
- Backend Frameworks: Build AI-powered web applications
- API reference: Endpoint documentation