Integrate ModelRiver with Next.js

Build streaming AI chat interfaces, server-side AI actions, and edge functions in Next.js, with every request routed through ModelRiver for automatic failover, unified billing, and structured outputs.

Overview

Next.js is the leading React framework for full-stack applications. Paired with the Vercel AI SDK and ModelRiver, you can build production-ready chat interfaces (chat interfaces), streaming UIs, and AI-powered server actions in minutes.

What you get:

  • Streaming chat UIs powered by the Vercel AI SDK
  • Server-side AI calls that never expose your API key
  • Edge function deployment for the lowest response latency
  • Automatic failover when a provider goes down

Quick start

Install dependencies

Bash
npx create-next-app@latest my-ai-app
cd my-ai-app
npm install ai @ai-sdk/openai openai

Set your API key as an environment variable

Bash
# .env.local
MODELRIVER_API_KEY=mr_live_YOUR_API_KEY

Streaming chat with the Vercel AI SDK

Create the API route

TYPESCRIPT
// app/api/chat/route.ts
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const modelriver = createOpenAI({
  baseURL: "https://api.modelriver.com/v1",
  apiKey: process.env.MODELRIVER_API_KEY!,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: modelriver("my-chat-workflow"), // use your own workflow ID here
    messages,
  });

  return result.toDataStreamResponse();
}

Create the chat component

TSX
// app/page.tsx
"use client";

import { useChat } from "ai/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat();

  return (
    <div className="max-w-2xl mx-auto p-4">
      <div className="space-y-4 mb-4">
        {messages.map((m) => (
          <div
            key={m.id}
            className={`p-4 rounded-lg ${
              m.role === "user" ? "bg-blue-100 ml-12" : "bg-zinc-100 mr-12"
            }`}
          >
            <p className="text-sm font-medium mb-1">
              {m.role === "user" ? "You" : "AI"}
            </p>
            <p>{m.content}</p>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          className="flex-1 p-2 border rounded-lg"
          disabled={isLoading}
        />
        <button
          type="submit"
          disabled={isLoading}
          className="px-4 py-2 bg-blue-600 text-white rounded-lg disabled:opacity-50"
        >
          Send
        </button>
      </form>
    </div>
  );
}

Server actions

TYPESCRIPT
// app/actions.ts
"use server";

import { createOpenAI } from "@ai-sdk/openai";
import { generateText, generateObject } from "ai";
import { z } from "zod";

const modelriver = createOpenAI({
  baseURL: "https://api.modelriver.com/v1",
  apiKey: process.env.MODELRIVER_API_KEY!,
});

export async function summarise(text: string) {
  const { text: summary } = await generateText({
    model: modelriver("my-summary-workflow"),
    prompt: `Summarise the following text in one short paragraph:\n\n${text}`,
  });
  return summary;
}

export async function extractEntities(text: string) {
  const { object } = await generateObject({
    model: modelriver("my-extraction-workflow"),
    schema: z.object({
      people: z.array(z.string()),
      places: z.array(z.string()),
      dates: z.array(z.string()),
    }),
    prompt: `Extract the people, places, and dates mentioned in the following text:\n\n${text}`,
  });
  return object;
}
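`summarise` sends the entire input in a single prompt, which can exceed the model's context window on long documents. One option is to split the input into pieces and summarise each piece separately. A minimal sketch of such a chunking helper (hypothetical, not part of the Vercel AI SDK or ModelRiver) that breaks on sentence boundaries:

```typescript
// Split text into chunks of at most maxLen characters, breaking on
// sentence boundaries so each chunk stays coherent. Chunks are joined
// with single spaces, so intra-sentence formatting is preserved but
// runs of whitespace between sentences are normalised.
function splitIntoChunks(text: string, maxLen = 4000): string[] {
  const sentences = text.split(/(?<=[.!?])\s+/);
  const chunks: string[] = [];
  let current = "";
  for (const sentence of sentences) {
    if (current && current.length + sentence.length + 1 > maxLen) {
      chunks.push(current);
      current = sentence;
    } else {
      current = current ? `${current} ${sentence}` : sentence;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk can then be passed to the summarise action in turn (for example with `Promise.all`), and the partial summaries summarised once more for a final result.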

API route (non-streaming)

TYPESCRIPT
// app/api/generate/route.ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.modelriver.com/v1",
  apiKey: process.env.MODELRIVER_API_KEY!,
});

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const completion = await client.chat.completions.create({
    model: "my-chat-workflow",
    messages: [{ role: "user", content: prompt }],
  });

  return Response.json({
    content: completion.choices[0].message.content,
  });
}
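On the client, the non-streaming route can be called with a plain `fetch`. A minimal sketch of a typed wrapper, assuming the route returns `{ content: string }` as in the handler above (the `generate` helper name is illustrative):

```typescript
// Client-side helper for the non-streaming /api/generate route.
// Sends the prompt as JSON and returns the generated text.
async function generate(prompt: string): Promise<string> {
  const res = await fetch("/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  const data = (await res.json()) as { content: string };
  return data.content;
}
```

Because the route runs server-side, the browser only ever sees your own endpoint, never the ModelRiver key.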

Edge function

TYPESCRIPT
// app/api/edge-chat/route.ts
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

export const runtime = "edge";

const modelriver = createOpenAI({
  baseURL: "https://api.modelriver.com/v1",
  apiKey: process.env.MODELRIVER_API_KEY!,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: modelriver("my-chat-workflow"),
    messages,
  });

  return result.toDataStreamResponse();
}

Best practices

  1. Never expose API keys: always call ModelRiver from server-side code (API routes or server actions), never from the browser.
  2. Use streaming for chat: the Vercel AI SDK handles SSE parsing and incremental rendering for you.
  3. Use generateObject for structured data: pair the zod schema with ModelRiver's structured output support so responses are validated on both ends.
  4. Deploy to edge for the lowest response latency: ModelRiver's endpoints are globally distributed.
  5. Monitor in Request Logs: use the Observability dashboard to track token usage and cost for every request.
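In the spirit of practice 1, it also helps to fail fast when the key is missing or malformed rather than surfacing a confusing 401 on the first request. A minimal sketch of a startup check (the `requireApiKey` helper is illustrative, and the `mr_` prefix check is an assumption based on the `mr_live_` example key shown above):

```typescript
// Validate the ModelRiver API key before constructing a client.
// Throws immediately at startup if the key is missing or does not
// match the expected format, instead of failing on the first request.
function requireApiKey(key: string | undefined): string {
  if (!key) {
    throw new Error("MODELRIVER_API_KEY is not set");
  }
  if (!key.startsWith("mr_")) {
    // Assumed prefix, based on the mr_live_... example key format.
    throw new Error("MODELRIVER_API_KEY does not look like a ModelRiver key");
  }
  return key;
}

// Usage sketch:
//   const modelriver = createOpenAI({
//     baseURL: "https://api.modelriver.com/v1",
//     apiKey: requireApiKey(process.env.MODELRIVER_API_KEY),
//   });
```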

See the sections below for further reading.