Overview
Your backend framework is where AI features meet the rest of your application. Whether you're building a chatbot API, a document processing pipeline, or a real-time streaming interface, ModelRiver integrates with your existing stack in minutes.
Why use ModelRiver with your backend framework?
- One endpoint for all providers: No per-provider SDK configuration (see the sketch after this list)
- Streaming support: SSE and WebSocket out of the box
- Background processing: Async tasks and webhook-driven workflows
- Full observability: Track every request across your application
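
To give a sense of how the first two points fit together, here is a minimal sketch using the standard OpenAI Python SDK pointed at ModelRiver's base URL, the same pattern shown in the framework sections below. The model name is a placeholder assumption; use whichever model identifiers are enabled for your ModelRiver project.

```python
import os

from openai import OpenAI

# One client, one base URL; provider routing happens on the ModelRiver side.
client = OpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key=os.environ["MODELRIVER_API_KEY"],
)

# Streaming works the same way regardless of which provider serves the model.
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; substitute a model enabled for your project
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
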
Supported frameworks
| Framework | Language | Highlights | Difficulty | Guide |
|---|---|---|---|---|
| Phoenix | Elixir | LiveView streaming, GenServer integration | ⭐⭐ Medium | View guide → |
| Next.js | TypeScript | Vercel AI SDK, server actions, edge functions | ⭐⭐ Medium | View guide → |
| FastAPI | Python | Async REST, WebSocket, SSE, background tasks | ⭐⭐ Medium | View guide → |
| Django | Python | DRF, Celery, Django Channels | ⭐⭐ Medium | View guide → |
Phoenix (Elixir)
Build real-time AI features with Phoenix LiveView. Stream responses directly to the browser with built-in WebSocket support.
```elixir
config :my_app, :openai,
  base_url: "https://api.modelriver.com/v1",
  api_key: System.get_env("MODELRIVER_API_KEY")
```
Next.js
Full-stack AI applications with the Vercel AI SDK. Server actions, edge functions, and React Server Components for the fastest possible UX.
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.modelriver.com/v1",
  apiKey: process.env.MODELRIVER_API_KEY,
});
```
FastAPI
High-performance async Python APIs. REST endpoints, WebSocket streaming, SSE, and background task processing.
```python
import os

from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key=os.environ["MODELRIVER_API_KEY"],
)
```
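
Building on that client, here is a minimal sketch of an SSE streaming endpoint. The route path, query parameter, and model name are illustrative assumptions, not part of ModelRiver's documented API; adapt them to your application.

```python
import os

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import AsyncOpenAI

app = FastAPI()
client = AsyncOpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key=os.environ["MODELRIVER_API_KEY"],
)

@app.get("/chat/stream")  # assumed route for illustration
async def chat_stream(prompt: str):
    async def event_source():
        # Forward the completion to the browser as Server-Sent Events.
        stream = await client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model identifier
            messages=[{"role": "user", "content": prompt}],
            stream=True,
        )
        async for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                yield f"data: {chunk.choices[0].delta.content}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_source(), media_type="text/event-stream")
```
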
Django
Python's batteries-included framework. DRF views, Celery background tasks, and Django Channels for real-time streaming.
```python
from django.conf import settings
from openai import OpenAI

client = OpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key=settings.MODELRIVER_API_KEY,
)
```
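
As one way to pair that client with Celery, here is a hedged sketch of a background task that runs a completion outside the request/response cycle. The task name, prompt, and model identifier are assumptions for illustration only.

```python
from celery import shared_task
from django.conf import settings
from openai import OpenAI

client = OpenAI(
    base_url="https://api.modelriver.com/v1",
    api_key=settings.MODELRIVER_API_KEY,
)

@shared_task
def summarize_document(text: str) -> str:
    # Long-running completions stay off the web worker; Celery retries apply as usual.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use a model enabled for your project
        messages=[{"role": "user", "content": f"Summarize:\n\n{text}"}],
    )
    return response.choices[0].message.content
```
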
Next steps
- Knowledge & Memory: Store and query embeddings
- LLM Frameworks: LangChain, LlamaIndex, Haystack
- API reference: Endpoint documentation