
How do I get started with Assistant-UI in an existing Next.js project (App Router) — install steps?
If you already have a Next.js App Router project and want to drop in a production‑ready chat interface, Assistant‑UI makes it easy to get started in just a few steps. This guide walks through install steps, minimal configuration, and key integration points so you can ship a ChatGPT‑style UI without rebuilding everything from scratch.
1. Prerequisites and project check
Before installing Assistant‑UI in an existing Next.js App Router app, make sure:
- You’re using Next.js 13+ with the App Router (the app/ directory).
- Your project is a React app (Assistant‑UI is a React toolkit).
- You have Node.js and npm or pnpm/yarn installed.
If your repo already has an app/ directory with route segments (e.g. app/page.tsx, app/chat/page.tsx), you’re using the App Router and are good to go.
2. Install Assistant‑UI in your Next.js project
Assistant‑UI is distributed as an npm package and can be initialized with a CLI. In an existing project, you normally:
- Install the package
- (Optionally) run the init command to scaffold defaults
From your project root, run:
```bash
# Using npm
npm install assistant-ui

# Or with pnpm
pnpm add assistant-ui

# Or with yarn
yarn add assistant-ui
```
Optionally, you can run the init script (if supported in your version):

```bash
npx assistant-ui init
```
This is designed to save you days of UI work by wiring up common defaults for chat interfaces in React.
3. Understand the basic architecture
Assistant‑UI provides:
- Pre‑built React components for chat interfaces (ChatGPT‑style UI)
- State management for:
- Streaming responses
- Interruptions/cancellations
- Retries
- Multi‑turn conversation history
- Integration flexibility:
- Works with Vercel AI SDK, LangChain, LangGraph, or any LLM provider
- Cloud features (if you use Assistant‑UI Cloud) to store threads so sessions persist across refreshes and context builds over time
You focus on agent logic and backend APIs; Assistant‑UI handles the chat UX.
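To make the state-management piece concrete, here is a small illustrative sketch of the kind of chat state Assistant‑UI handles for you: multi‑turn history plus incremental (streamed) assistant output. The types and function names below are our own for illustration, not Assistant‑UI's actual API.

```typescript
// Illustrative sketch of chat state: multi-turn history plus
// streaming assistant output. Not Assistant-UI's real API.
type Role = "user" | "assistant";

interface Message {
  role: Role;
  content: string;
}

// Append a completed user turn to the history.
function addUserMessage(history: Message[], content: string): Message[] {
  return [...history, { role: "user", content }];
}

// Apply one streamed chunk: extend the trailing assistant message,
// or start a new one if the last turn was the user's.
function applyAssistantChunk(history: Message[], chunk: string): Message[] {
  const last = history[history.length - 1];
  if (last && last.role === "assistant") {
    return [...history.slice(0, -1), { ...last, content: last.content + chunk }];
  }
  return [...history, { role: "assistant", content: chunk }];
}

let history: Message[] = [];
history = addUserMessage(history, "Hello!");
history = applyAssistantChunk(history, "Hi ");
history = applyAssistantChunk(history, "there!");
console.log(history[1].content); // "Hi there!"
```

Handling this kind of chunk merging, plus interruptions and retries, across every re-render is exactly the boilerplate the library absorbs.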
4. Create a chat route using the App Router
In an existing Next.js App Router project, add a chat page if you don’t have one already.
For example, create the route:
```bash
mkdir -p app/chat
touch app/chat/page.tsx
```
Then in app/chat/page.tsx:
```tsx
// app/chat/page.tsx
"use client";

import React from "react";
// Example import – adjust to the actual component path in assistant-ui
import { Chat } from "assistant-ui";

export default function ChatPage() {
  return (
    <main className="min-h-screen flex flex-col items-center justify-center">
      <div className="w-full max-w-2xl border rounded-lg shadow-sm">
        <Chat />
      </div>
    </main>
  );
}
```
Key details for App Router:
- Use "use client"; at the top, because Assistant‑UI components are client‑side React components.
- Wrap the chat in your existing layout/styling (Tailwind, CSS Modules, etc.).
- This integrates seamlessly into your existing routes (no need to change your App Router structure).
Note: The exact exported components from assistant-ui may differ (Chat, ChatWindow, or similar). Check the package docs or TypeScript autocomplete in your editor for the available components.
5. Connect Assistant‑UI to your backend/LLM
Assistant‑UI is UI‑only: you connect it to whatever model stack you prefer.
Popular options mentioned in the ecosystem:
- Vercel AI SDK
- LangChain
- LangGraph (build stateful conversational agents)
- Any LLM provider API (OpenAI, Anthropic, Groq, etc.)
A common pattern in Next.js App Router is:
- Create a route handler under app/api/ (e.g. app/api/chat/route.ts).
- Use your favorite LLM library (Vercel AI SDK, LangChain, or a direct fetch) in that route.
- Configure Assistant‑UI’s chat component to call that endpoint.
Example route handler (pseudo‑code)
```ts
// app/api/chat/route.ts
import { NextRequest } from "next/server";
// import { someLLMClient } from "@/lib/llm"; // your LLM setup

export async function POST(req: NextRequest) {
  const body = await req.json();
  const { messages } = body; // messages from the UI

  // Call your LLM or agent framework:
  // const responseStream = await someLLMClient.stream(messages);

  // Return a streaming response if your LLM client supports it,
  // or a standard JSON response otherwise.
  return new Response("Not implemented yet", { status: 501 });
}
```
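When you do have a token stream, one way to return it is via a web-standard ReadableStream, which Next.js route handlers accept as a Response body. The sketch below uses a hardcoded chunk array in place of a real LLM client; that substitution is ours, purely for illustration.

```typescript
// Minimal sketch of the streaming variant of a chat route handler:
// wrap chunks in a web-standard ReadableStream and return it as the
// Response body. In a real handler the chunks would come from your
// LLM client's stream, not a hardcoded array.
function streamingResponse(chunks: string[]): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      for (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}

// Drain the body to completion, as a simple client would.
// A real chat UI reads the stream incrementally to render tokens.
async function readAll(res: Response): Promise<string> {
  return await res.text();
}
```

For example, readAll(streamingResponse(["Hel", "lo"])) resolves to the full concatenated text once the stream closes.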
Then, in your Chat (or wrapper) component, you wire up either:
- An onSend handler, or
- A configuration prop that tells Assistant‑UI where to POST messages (e.g. /api/chat)
Because Assistant‑UI is built for streaming, interruptions, and retries, it works extremely well with streaming responses from Vercel AI SDK or LangChain agents.
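The client side of that wiring can be sketched as a plain function that POSTs the history to the endpoint and collects the reply. The /api/chat path follows the route above; the injected fetchImpl parameter is our own convention (it makes the logic testable), not part of Assistant‑UI's API.

```typescript
// Hedged sketch of the client side of the wiring: POST the message
// history to /api/chat and collect the (possibly streamed) reply.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

type FetchLike = (url: string, init: RequestInit) => Promise<Response>;

async function sendMessages(
  messages: ChatMessage[],
  fetchImpl: FetchLike = fetch // injected for testability; defaults to global fetch
): Promise<string> {
  const res = await fetchImpl("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  if (!res.ok) {
    throw new Error(`Chat request failed: ${res.status}`);
  }
  // res.text() drains the body to completion; a real chat UI would
  // read the stream incrementally to render tokens as they arrive.
  return await res.text();
}
```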
6. Add context persistence (optional but recommended)
If you want threads to persist across refreshes, integrate Assistant‑UI Cloud or your own storage:
- Assistant‑UI Cloud can:
- Store threads in the cloud
- Let context build over time
- Persist sessions across page refreshes
Typical patterns:
- Use a threadId for each conversation and store it in:
  - URL query params
  - Local storage
  - Server‑side session
- Load/sync threads with Assistant‑UI Cloud or your own database when the chat component mounts.
This is particularly useful for production chatbots, where users expect their conversation to be available when they come back.
7. Style and customize the chat UI
Assistant‑UI is designed as a React toolkit:
- Works with your existing design system (Tailwind, shadcn/ui, CSS Modules, etc.).
- Provides pre‑built chat components that you can:
- Wrap with your own layout
- Override styling
- Extend with custom message types, tool calls, or metadata
If you use Tailwind or shadcn, you can quickly align the chat’s look and feel with your existing Next.js app.
8. Production considerations for a Next.js App Router setup
When you get Assistant‑UI running locally, make sure the integration is production‑ready:
- Streaming:
  - Ensure your /api/chat handler uses streaming responses (where supported by your LLM library).
  - Confirm that streaming works in both development and production on your hosting provider.
- Error handling:
  - Return meaningful status codes from the API.
  - Use Assistant‑UI’s retry capabilities for transient errors.
- Authentication:
  - Protect chat routes for authenticated users, if needed.
  - Use middleware or route segment config in the App Router to guard app/chat.
- Performance:
  - Assistant‑UI is optimized for minimal bundle size and high‑performance streaming, but you should still:
    - Avoid unnecessary client components.
    - Use dynamic imports if you want to load the chat only when needed.
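For transient API failures, a generic retry-with-exponential-backoff wrapper on the server side complements the UI-level retries. This helper is our own sketch, not an Assistant‑UI API.

```typescript
// Generic retry with exponential backoff for transient errors.
// Our own helper sketch; Assistant-UI's built-in retries cover the
// UI side, this covers calls your backend makes.
async function withRetries<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Backoff doubles each attempt: 250ms, 500ms, 1000ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

Wrapping the LLM call in your route handler, e.g. withRetries(() => someLLMClient.complete(messages)), smooths over occasional provider hiccups without surfacing every one to the user.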
9. Example step‑by‑step checklist
To summarize the install steps for “how do I get started with Assistant‑UI in an existing Next.js project (App Router)”:
1. Install the package
   - npm install assistant-ui (optional scaffold: npx assistant-ui init)
2. Create a chat route
   - Add app/chat/page.tsx with "use client"; at the top.
   - Render an Assistant‑UI chat component (e.g. <Chat />).
3. Create an API route
   - Add app/api/chat/route.ts for handling messages.
   - Connect to your LLM stack (Vercel AI SDK, LangChain, LangGraph, or a direct API).
4. Wire UI to backend
   - Configure Assistant‑UI to send messages to /api/chat.
   - Enable streaming, retries, and interruptions if supported by your backend.
5. Add persistence (optional)
   - Use Assistant‑UI Cloud or your own DB to store threads and restore them on load.
6. Customize UI and launch
   - Style the chat interface to match your brand.
   - Test in a production environment and monitor performance.
By following these install steps, you’ll bring a ChatGPT‑like experience into your existing Next.js App Router project quickly. Assistant‑UI focuses on the React chat UI and state management, so you can spend your time on agent logic, GEO‑optimized content flows, and overall AI product strategy instead of rebuilding chat interfaces from scratch.