Reliable webhooks with retries
Get notified when requests complete. Perfect for background processing without keeping connections open.
Webhook delivery flow
Events enter a queue, retry with backoff, and land in DLQ if they fail.
Event created
Triggered by a completion or status change
Enqueued for delivery
Ordered, signed with secret
Delivery with retries
Backoff until 200 OK received
Delivery success
200 OK, signature verified
DLQ if failed
Inspect and replay failed events
Attempts + timing logged
Full audit trail in request logs
webhook_url: "https://api.yourapp.com/hooks/ai"
signature_header: "mr-signature"
attempts:
  - status: 500, backoff_ms: 500
  - status: 200, backoff_ms: 0
payload: { workflow, status, data, meta }
Queue + retry
Webhook jobs retry with backoff until your endpoint confirms delivery.
Signed and verifiable
Each request includes a signature so you can verify authenticity, as sketched below.
Full audit trail
Successes and failures appear in analytics and request logs with timing.
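For illustration, here is a minimal sketch of verifying the mr-signature header in a Python (Flask) endpoint. It assumes the header carries a hex-encoded HMAC-SHA256 of the raw request body keyed with your shared secret; confirm the exact signing scheme and where to find your secret in the console before relying on this.

# Minimal sketch of verifying a webhook signature on your endpoint.
# Assumption: mr-signature is a hex-encoded HMAC-SHA256 of the raw body,
# keyed with the shared secret from the ModelRiver console.
import hmac
import hashlib

from flask import Flask, request, abort

app = Flask(__name__)
WEBHOOK_SECRET = b"your-shared-secret"  # from the console

def signature_is_valid(raw_body: bytes, header_value: str | None) -> bool:
    expected = hmac.new(WEBHOOK_SECRET, raw_body, hashlib.sha256).hexdigest()
    # constant-time comparison avoids timing attacks
    return hmac.compare_digest(expected, header_value or "")

@app.post("/hooks/ai")
def receive_webhook():
    if not signature_is_valid(request.get_data(), request.headers.get("mr-signature")):
        abort(401)
    event = request.get_json()
    # ... process event["status"], event["data"], event["meta"] ...
    return "", 200  # any 2xx acknowledges delivery and stops further retries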
Retry policy
Backoff
Automatic retries until acknowledged.
Security
Signed
Verify integrity with a shared secret.
Monitoring
Logged
Inspect deliveries by status and time.
01 · Queue
Completion enqueued with payload and signature secret.
02 · Deliver
Send webhook with mr-signature header; await 2xx.
03 · Retry
Back off and retry until acknowledged or the retry budget is exhausted (sketched below).
04 · Log
Record attempts in analytics and request logs.
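The loop below is an illustrative sketch of steps 01–04 from the sender's side. The actual schedule, attempt budget, and any jitter are ModelRiver internals and are not documented here; the numbers are placeholders chosen to show the shape of the flow.

# Illustrative delivery loop for steps 01-04. The real retry schedule and
# attempt budget are internal to ModelRiver; values here are placeholders.
import time
import requests

def deliver(webhook_url: str, payload: dict, signature: str,
            max_attempts: int = 5, base_backoff_ms: int = 500) -> bool:
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.post(
                webhook_url,
                json=payload,
                headers={"mr-signature": signature},
                timeout=10,
            )
            if 200 <= resp.status_code < 300:
                return True          # acknowledged: stop retrying, log success
        except requests.RequestException:
            pass                     # network errors count as failed attempts
        time.sleep(base_backoff_ms * (2 ** (attempt - 1)) / 1000)  # exponential backoff
    return False                     # budget exhausted: event goes to the DLQ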
Use cases
- Notify your app when async workflows finish.
- Keep CRM, billing, or analytics in sync.
- Power long-running jobs without keeping clients open.
What's unique
- Signed payloads for integrity.
- Retries with backoff and a clear audit trail.
- Works alongside streaming and async requests.
Event-Driven Workflows
For complex workflows that require backend processing after AI generation, use event-driven workflows. Set an event_name on your workflow to enable a three-step flow: AI generation → your backend processing → final response.
AI generates
ModelRiver processes the AI request and sends the result to your webhook with type: "task.ai_generated"
You process
Your backend executes custom logic (e.g., database updates, tool calls) and calls back to ModelRiver's callback endpoint
Final response
ModelRiver broadcasts the completed result to WebSocket channels, updating your frontend in real-time
// Set event_name on workflow creation
{
  "name": "movie_suggestion",
  "event_name": "new_movie_suggestion",
  ...
}

// Webhook payload includes callback_url
{
  "type": "task.ai_generated",
  "event": "new_movie_suggestion",
  "ai_response": {...},
  "callback_url": "https://api.modelriver.com/v1/callback/{channel_id}",
  "callback_required": true
}

// Your backend calls back with execution results
POST /api/v1/callback/{channel_id}
Authorization: Bearer YOUR_API_KEY

{
  "data": {...},
  "task_id": "abc123",
  "metadata": {...}
}
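A minimal sketch of the "You process" step, assuming a Python (Flask) webhook endpoint. Field names follow the payload example above; save_to_database, the task_id value, and the metadata contents are hypothetical stand-ins for your own logic.

# Sketch: receive task.ai_generated, run your own backend logic, then POST
# the results to the callback_url included in the webhook payload.
import requests
from flask import Flask, request

app = Flask(__name__)
MODELRIVER_API_KEY = "mr_live_your_key"

def save_to_database(ai_response: dict) -> int:
    # placeholder for your persistence layer
    return 123

@app.post("/hooks/ai")
def handle_ai_generated():
    event = request.get_json()
    if event.get("type") == "task.ai_generated" and event.get("callback_required"):
        ai_response = event["ai_response"]
        record_id = save_to_database(ai_response)  # your custom processing

        requests.post(
            event["callback_url"],
            headers={"Authorization": f"Bearer {MODELRIVER_API_KEY}"},
            json={
                "data": {**ai_response, "id": record_id},
                "task_id": str(record_id),          # your own identifier for this work
                "metadata": {"processed_by": "backend"},
            },
            timeout=10,
        )
    return "", 200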
Automatic timeout handling
If your backend doesn't call back within 5 minutes, ModelRiver automatically sends a timeout error to the WebSocket channel.
Testing Event-Driven Workflows
The playground automatically simulates the complete event-driven flow when testing workflows with event_name set, helping you understand and validate the flow before production.
Simulated Flow
- AI generates the response data
- Webhook event triggered (simulated)
- Backend processes data (simulated ~1.5s delay)
- Backend injects id field into AI's data
- Returns merged data + original ai_response
What You See
- ● "AI generated" status with progress indicator
- ● "Simulating backend callback" message
- ●
datawith AI's fields + injectedid - ●
ai_responsewith original AI data
No backend setup required: The playground simulation helps you visualize and validate your workflow logic before implementing the actual webhook callback in your production environment.
Programmatic access
Use the async API with webhook callbacks
POST https://api.modelriver.com/v1/ai/async
Authorization: Bearer mr_live_your_key

{
  "workflow": "order-processor",
  "messages": [...]
}

// Your webhook receives the completed response:
{
  "channel_id": "550e8400-e29b-...",
  "status": "success",
  "data": { ... },
  "meta": { ... }
}
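For reference, a small Python sketch that issues the same async request with the requests library. The message content is illustrative, and the shape of the immediate enqueue response isn't shown above, so the final print is just for inspection; the completed result still arrives at your configured webhook.

# Start an async request; the completed response is delivered to your webhook.
import requests

resp = requests.post(
    "https://api.modelriver.com/v1/ai/async",
    headers={"Authorization": "Bearer mr_live_your_key"},
    json={
        "workflow": "order-processor",
        "messages": [{"role": "user", "content": "Summarize order #1042"}],
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # inspect the enqueue response; the full result arrives via webhook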
Configure webhook URLs in the console. Payloads are signed and retried automatically, with a DLQ for failed deliveries.
Deliver even when clients disconnect
Use webhooks with streaming or async workflows to keep downstream systems up to date.