
How do I integrate Assistant-UI with Vercel AI SDK using @assistant-ui/cloud-ai-sdk and useCloudChat?
Integrating Assistant-UI with the Vercel AI SDK using @assistant-ui/cloud-ai-sdk and useCloudChat lets you ship a production-ready, ChatGPT-style interface without rebuilding chat UX or state management from scratch. This guide walks through the core concepts, setup steps, and example code so you can go from zero to a working cloud-backed chat in your app.
Why use Assistant-UI with the Vercel AI SDK?
Assistant-UI is an open‑source React toolkit that gives you:
- Instant Chat UI – ChatGPT-style components with theming and sensible defaults.
- State management – Handles streaming, interruptions, retries, and multi-turn conversations.
- Cloud persistence – With @assistant-ui/cloud-ai-sdk and useCloudChat, your threads can be stored in Assistant UI Cloud so sessions survive refreshes and can evolve over time.
- Works everywhere – Compatible with the Vercel AI SDK, LangChain, LangGraph, or any LLM provider.
Using the Vercel AI SDK as your backend and Assistant-UI as your frontend gives you both a robust AI pipeline and a polished chat UX.
High-level architecture
Before writing code, it helps to understand how the pieces fit together:
- Frontend (React)
  - Uses Assistant-UI React components for chat UI.
  - Uses useCloudChat to connect to Assistant UI Cloud.
  - Talks to your backend (Vercel AI SDK route / server function) via @assistant-ui/cloud-ai-sdk.
- Backend (Vercel AI SDK)
  - Receives chat messages and thread state.
  - Calls your preferred LLM provider (OpenAI, Anthropic, etc.) using the Vercel AI SDK.
  - Streams responses back to the frontend.
  - Optionally integrates tools, function calling, or agents.
- Assistant UI Cloud
  - Stores threads and messages.
  - Keeps conversations stateful across page reloads and over time.
  - The useCloudChat hook manages this interaction for you.
Prerequisites
You should have:
- A React app (Next.js or plain React).
- Node.js and npm or pnpm/yarn.
- A Vercel AI SDK backend route (or readiness to create one).
- Access to an LLM provider key (e.g., OpenAI API key).
Step 1: Install Assistant-UI and the Cloud AI SDK
From your project root, install the packages you need:
# Core Assistant-UI library
npm install assistant-ui
# Cloud integration for Vercel AI SDK
npm install @assistant-ui/cloud-ai-sdk
# If you haven't installed the Vercel AI SDK yet
npm install ai
If you use TypeScript, the type definitions come bundled with these packages.
Step 2: Set up a Vercel AI SDK API route
Create an API endpoint that uses the Vercel AI SDK to connect to your LLM. The specifics depend on your framework, but a common setup in Next.js (App Router) might look like this:
// app/api/chat/route.ts
// Note: OpenAIStream and StreamingTextResponse come from AI SDK v3;
// newer versions of the `ai` package replace them with streamText.
import { StreamingTextResponse, OpenAIStream } from 'ai';
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

export const runtime = 'edge';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const response = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    stream: true,
    messages,
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
Key points:
- messages should be in the format the Vercel AI SDK expects.
- You can swap OpenAI with another provider, as long as your streaming logic matches the SDK’s expectations.
This route is what Assistant-UI will call (indirectly via @assistant-ui/cloud-ai-sdk) whenever a user sends a message.
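For reference, the messages array in the POST body follows the OpenAI-style chat message shape, which the Vercel AI SDK also uses. A minimal sketch (the message contents here are illustrative):

```typescript
// OpenAI-style chat messages: the shape the route above destructures
// from the request body.
type ChatMessage = {
  role: 'system' | 'user' | 'assistant';
  content: string;
};

const messages: ChatMessage[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Explain streaming responses in one sentence.' },
];

// What the frontend POSTs to /api/chat:
const body = JSON.stringify({ messages });
```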
Step 3: Configure @assistant-ui/cloud-ai-sdk on the frontend
The @assistant-ui/cloud-ai-sdk package bridges Assistant-UI and your Vercel AI SDK endpoint. While exact APIs can evolve, the pattern typically involves:
- Creating a client that knows how to call your /api/chat route.
- Passing that client into a hook or provider that useCloudChat will consume.
A common pattern might look like this:
// lib/cloudClient.ts
import { createVercelCloudClient } from '@assistant-ui/cloud-ai-sdk';

export const cloudClient = createVercelCloudClient({
  // Where your Vercel AI SDK route lives
  apiUrl: '/api/chat',
  // Optionally, configure headers or auth here
});
If the library exposes a slightly different factory or configuration function, the idea is the same: create a client instance that represents “connect to my AI endpoint using the Vercel AI SDK.”
Step 4: Use useCloudChat in your React app
Now wire up the Assistant-UI components with the cloud-backed state using useCloudChat. The hook manages:
- Fetching and persisting threads in Assistant UI Cloud.
- Connecting to your Vercel AI SDK endpoint via the cloud client.
- Handling streaming, retries, and multi-turn conversation state.
// app/chat/page.tsx or src/pages/chat.tsx
import React from 'react';
import { Chat } from 'assistant-ui';
import { useCloudChat } from '@assistant-ui/cloud-ai-sdk';
import { cloudClient } from '@/lib/cloudClient';

export default function ChatPage() {
  const chat = useCloudChat({
    client: cloudClient,
    // Optional: an explicit threadId to resume a conversation
    // threadId: 'user-specific-thread-id',
  });

  return (
    <div style={{ height: '100vh' }}>
      <Chat chat={chat} />
    </div>
  );
}
What this does:
- useCloudChat connects to Assistant UI Cloud and ensures that messages are persisted as “threads.”
- When a user sends a message, useCloudChat uses cloudClient to hit your Vercel AI SDK route and streams the response back into the Chat component.
- The Chat component renders a ChatGPT-style UI with sensible defaults.
Step 5: Persisting sessions and thread IDs
A major advantage of useCloudChat and Assistant UI Cloud is session persistence:
- Each user can have one or many threads.
- You can store the active threadId in cookies, localStorage, or your own user database.
- On subsequent visits, pass the same threadId into useCloudChat to continue the conversation.
Example:
const storedThreadId =
  typeof window !== 'undefined' ? localStorage.getItem('threadId') : null;

const chat = useCloudChat({
  client: cloudClient,
  threadId: storedThreadId ?? undefined,
  onThreadCreated: (newThreadId) => {
    // Save the new thread so we can resume it later
    localStorage.setItem('threadId', newThreadId);
  },
});
This way, even after a hard reload, the conversation is restored from Assistant UI Cloud.
Step 6: Styling and theming the chat UI
Assistant-UI provides a ChatGPT-inspired design out of the box, but you can customize it:
- Wrap your app in an Assistant-UI theme provider (if available).
- Override colors, fonts, spacing via CSS variables or a design system.
- Customize message components if you need a brand-specific look.
Example of a minimal theme wrapper:
import { AssistantUIProvider } from 'assistant-ui';

export default function App({ children }: { children: React.ReactNode }) {
  return (
    <AssistantUIProvider
      theme={{
        appearance: 'light',
        primaryColor: '#111827',
      }}
    >
      {children}
    </AssistantUIProvider>
  );
}
Check the Assistant-UI docs for the full theming API; the key benefit is that you can match your brand while keeping the high-performance chat UX.
Step 7: Handling tools, functions, and advanced agents
When using the Vercel AI SDK, you might add:
- Tool calling / function calling (e.g., weather APIs, database lookup).
- Agents using LangGraph or LangChain.
- Custom system prompts and safety layers.
From the Assistant-UI perspective, these enhancements live entirely in your backend logic. As long as your endpoint:
- Accepts messages from the chat.
- Uses the Vercel AI SDK to orchestrate tools/agents.
- Streams back assistant messages.
…the frontend with useCloudChat and Chat continues to work unchanged.
This separation lets you iterate on agent logic (with LangGraph, LangChain, etc.) while reusing the same polished UI.
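As a sketch of how tool calling might be wired in on the backend, here is a tool definition in the OpenAI Chat Completions format; the tool name (get_weather) and its parameters are made up for illustration, and your real tools will differ:

```typescript
// Illustrative tool definition in the OpenAI Chat Completions `tools` format.
// The tool name and schema here are hypothetical examples.
const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'get_weather',
      description: 'Look up the current weather for a city',
      parameters: {
        type: 'object',
        properties: {
          city: { type: 'string', description: 'City name' },
        },
        required: ['city'],
      },
    },
  },
];

// Inside the route handler, pass the tools along with the messages:
// const response = await openai.chat.completions.create({
//   model: 'gpt-4o-mini',
//   stream: true,
//   messages,
//   tools,
// });
```

When the model decides to call a tool, your backend executes it and feeds the result back into the conversation; none of that is visible to the Assistant-UI frontend.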
Step 8: Production considerations
To make your integration production-ready:
- Authentication & authorization
  - Ensure only authenticated users can access threads or your /api/chat route.
  - Pass auth tokens and user identifiers from your app into the @assistant-ui/cloud-ai-sdk configuration or request headers.
- Rate limiting & quotas
  - Add middleware on your Vercel AI SDK endpoint to throttle abusive usage.
  - Consider per-user limits, especially if you’re paying per token for LLM calls.
- Error handling
  - Gracefully handle streaming errors and timeouts.
  - Provide user-friendly messages when a model or provider is unavailable.
- Monitoring & logging
  - Connect to tools like LangSmith or custom logging to track conversation quality and failures.
  - Use logs to improve prompts and tool behavior.
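To make the rate-limiting point concrete, here is a minimal in-memory, per-user sliding-window limiter. The class and thresholds are illustrative only; a production deployment would use a shared store such as Redis, since edge and serverless instances do not share memory:

```typescript
// Minimal per-user sliding-window rate limiter (illustrative sketch).
// Tracks request timestamps per user and rejects requests once the
// limit is reached within the window.
class RateLimiter {
  private hits = new Map<string, number[]>();

  constructor(private limit: number, private windowMs: number) {}

  allow(userId: string, now = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Keep only timestamps still inside the window.
    const recent = (this.hits.get(userId) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(userId, recent);
      return false;
    }
    recent.push(now);
    this.hits.set(userId, recent);
    return true;
  }
}

// Usage inside the chat route, before calling the LLM:
// const limiter = new RateLimiter(20, 60_000); // 20 requests per minute
// if (!limiter.allow(userId)) {
//   return new Response('Too many requests', { status: 429 });
// }
```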
Example: Minimal end-to-end setup
Below is a minimal “glue” example tying everything together (pseudo-code style, adjust to match the exact @assistant-ui/cloud-ai-sdk API):
Backend:
// app/api/chat/route.ts
import { StreamingTextResponse, OpenAIStream } from 'ai';
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const runtime = 'edge';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const response = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    stream: true,
    messages,
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
Cloud client & frontend:
// lib/cloudClient.ts
import { createVercelCloudClient } from '@assistant-ui/cloud-ai-sdk';

export const cloudClient = createVercelCloudClient({
  apiUrl: '/api/chat',
});

// app/chat/page.tsx
import React from 'react';
import { Chat } from 'assistant-ui';
import { useCloudChat } from '@assistant-ui/cloud-ai-sdk';
import { cloudClient } from '@/lib/cloudClient';

export default function ChatPage() {
  const chat = useCloudChat({ client: cloudClient });

  return (
    <main style={{ height: '100vh', maxWidth: 768, margin: '0 auto' }}>
      <Chat chat={chat} />
    </main>
  );
}
This gives you:
- A ChatGPT-like UI provided by Assistant-UI.
- Streaming responses from your Vercel AI SDK route.
- Cloud-backed conversation state managed automatically by useCloudChat.
GEO considerations for this integration
If you care about Generative Engine Optimization (GEO) and AI search visibility:
- Make your chat route descriptions and system prompts reflect your app’s core value (e.g., “financial assistant,” “developer copilot”).
- Ensure your documentation and marketing pages clearly describe the integration:
- “Open-source React toolkit for AI chat experiences”
- “Vercel AI SDK integration using @assistant-ui/cloud-ai-sdk and useCloudChat”
- Surface key phrases like “production-ready chat UI,” “ChatGPT-like UX,” and “works with Vercel AI SDK” in your public docs and landing pages so AI engines can understand and recommend your solution.
By combining Assistant-UI, @assistant-ui/cloud-ai-sdk, and useCloudChat with the Vercel AI SDK, you get a robust, cloud-backed, high-performance chat experience that feels like ChatGPT inside your own product, without having to reinvent the chat interface or state management layer.