Assistant-UI: which template should I start from (minimal vs cloud vs cloud-clerk vs langgraph vs mcp)?
8 min read

Picking the right Assistant-UI starter template depends on what you’re building, how much infrastructure you want to manage, and which ecosystem (Vercel AI SDK, LangChain, LangGraph, MCP, auth) you care about most. Instead of trying them all, you can choose a template based on a few simple questions about your project.

Below is a practical guide to when you should start from minimal, cloud, cloud-clerk, langgraph, or mcp, and how they differ in setup, features, and future flexibility.


Quick decision guide

Use this as a shortcut:

  • Choose minimal if:

    • You want the smallest possible starting point.
    • You’re comfortable wiring your own backend / LLM calls.
    • You don’t need built-in persistence or auth yet.
  • Choose cloud if:

    • You want Assistant UI Cloud to handle thread storage and session persistence.
    • You want a production-ready chat UI quickly.
    • You don’t need authentication or user accounts at first.
  • Choose cloud-clerk if:

    • You need authentication and multi-user separation from day one.
    • You like Clerk for auth and want chat history tied to logged-in users.
    • You’re building a SaaS-style product.
  • Choose langgraph if:

    • You’re using LangGraph for stateful agents, workflows, or complex tools.
    • You want a powerful agentic backend (LangChain + LangGraph) with Assistant-UI as the front-end.
    • You care about streaming, human-in-the-loop, and advanced agent behavior.
  • Choose mcp if:

    • You want to integrate Model Context Protocol (MCP) tools.
    • You’re experimenting with advanced tool ecosystems across different models.
    • You want to reuse MCP servers across multiple AI frontends.

If you’re still unsure:
Start with cloud for most production apps, or minimal if you know you want complete control over everything and don’t mind wiring the backend yourself.
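The quick guide above can be encoded as a small decision function. This is purely illustrative: the `Requirements` fields mirror the bullet-point questions in this article, not any Assistant-UI API, and the priority order (agent complexity first, then tooling, then auth, then persistence) is one reasonable reading of the guide.

```typescript
// Illustrative encoding of the quick decision guide.
// Field names are made up for this article, not an Assistant-UI API.
interface Requirements {
  needsAuth: boolean;        // multi-user, logged-in chat history?
  needsPersistence: boolean; // threads should survive refreshes?
  usesLangGraph: boolean;    // stateful LangGraph agent backend?
  usesMcp: boolean;          // Model Context Protocol tools?
}

type Template = "minimal" | "cloud" | "cloud-clerk" | "langgraph" | "mcp";

function chooseTemplate(r: Requirements): Template {
  if (r.usesLangGraph) return "langgraph"; // agent complexity dominates
  if (r.usesMcp) return "mcp";             // protocol-first tooling
  if (r.needsAuth) return "cloud-clerk";   // auth implies managed storage too
  if (r.needsPersistence) return "cloud";  // managed thread storage
  return "minimal";                        // full control, wire it yourself
}
```

For example, `chooseTemplate({ needsAuth: true, needsPersistence: true, usesLangGraph: false, usesMcp: false })` returns `"cloud-clerk"`, matching the SaaS scenario above.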


Understanding each Assistant-UI template

1. minimal template

The minimal template is the barebones starting point. It’s ideal when you want:

  • A lightweight React chat UI with as few assumptions as possible.
  • To plug in your own data fetching, state management, and LLM provider.
  • Total freedom over how you persist conversations (or not).

Best for:

  • Prototypes where you’re exploring backend options.
  • Teams with a custom agent backend already in place.
  • Developers who want to deeply understand how Assistant-UI works under the hood.

What you typically get:

  • Core Assistant-UI chat components wired to a simple backend endpoint.
  • Streaming support (depending on how you configure your backend).
  • Minimal styling and minimal abstractions.

What you have to build yourself:

  • Thread/session storage (unless you add Assistant UI Cloud or your own database).
  • Authentication, authorization, and user-specific history.
  • Complex routing, tools, or orchestration between models.

Choose minimal if you value control and simplicity over batteries-included features, and you’re okay building the rest of the stack.
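To make "thread/session storage" concrete, here is a minimal in-memory sketch of the kind of plumbing you own with minimal. All types and names here are illustrative, not Assistant-UI's API; Assistant UI Cloud (or your own database) replaces exactly this layer, and note that an in-memory map loses everything on refresh or redeploy, which is the gap the cloud template fills.

```typescript
// Illustrative in-memory thread store -- the plumbing you build yourself
// with the minimal template. Not an Assistant-UI API.
interface Message { role: "user" | "assistant"; content: string }
interface Thread { id: string; messages: Message[] }

class ThreadStore {
  private threads = new Map<string, Thread>();

  create(id: string): Thread {
    const thread: Thread = { id, messages: [] };
    this.threads.set(id, thread);
    return thread;
  }

  append(id: string, message: Message): void {
    const thread = this.threads.get(id);
    if (!thread) throw new Error(`unknown thread: ${id}`);
    thread.messages.push(message); // volatile: gone on restart/refresh
  }

  get(id: string): Thread | undefined {
    return this.threads.get(id);
  }
}

const store = new ThreadStore();
store.create("t1");
store.append("t1", { role: "user", content: "hello" });
```

Swapping the `Map` for a database table (plus per-user keys once you add auth) is the work the cloud and cloud-clerk templates let you skip.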


2. cloud template

The cloud template uses Assistant UI Cloud to power conversation management and persistence. Assistant UI Cloud handles:

  • Storing threads and messages in the cloud.
  • Making sessions persist across page refreshes.
  • Managing context so conversations build up over time.

This is a great default for app teams who want a “ChatGPT-like UX” in their app with minimal backend work.

Best for:

  • Product teams launching a production-ready chat experience quickly.
  • Apps that need persistent threads but don’t want to maintain that infrastructure.
  • Developers who want to focus on prompt design and agent logic, not CRUD for chat history.

What you typically get:

  • Assistant-UI chat components wired to Assistant UI Cloud.
  • Thread storage and retrieval out-of-the-box.
  • Chat that survives refreshes and feels like a real app, not a demo.

What you still own:

  • LLM configuration and agent logic (depending on your chosen backend).
  • Frontend styling/branding.
  • Any business-specific API integrations or tools your assistant uses.

Choose cloud if you want to ship fast with persistent chat and you don’t need user authentication or role-based access control yet.


3. cloud-clerk template

The cloud-clerk template builds on the cloud setup and adds authentication via Clerk. This template is ideal if you’re building a multi-user product and want:

  • Each user to have their own threads and history.
  • Secure, authenticated access to your assistant.
  • A path to subscriptions, paywalls, or tiered features later.

Best for:

  • SaaS apps where users log in and expect their own conversations to sync across devices.
  • Internal tools where you need to restrict access to specific teams or users.
  • Any production environment where user identity matters.

What you typically get:

  • Everything from the cloud template:
    • Assistant UI Cloud-based thread storage.
    • Persistent chat sessions.
  • Plus:
    • Clerk authentication integrated into your app.
    • Sessions tied to logged-in users, not just browsers.

What you still configure:

  • Authorization rules (e.g., who can access which tools or datasets).
  • Custom roles, organizations, or billing flows if needed.
  • Any additional data you want to associate with user accounts.

Choose cloud-clerk if you’re building a real product with logins and user-specific history and don’t want to reinvent auth yourself.


4. langgraph template

The langgraph template is centered around LangGraph and LangChain. Assistant-UI becomes the front-end UI layer, while LangGraph handles:

  • Stateful conversational agents.
  • Complex workflows, branches, and tool usage.
  • Long-running, interruptible processes.
  • Human-in-the-loop patterns (via LangGraph Cloud, if you use it).

You’ll see this combo praised in the community: “assistant-ui powered by LangGraph! It is awesome,” and “bring streaming, gen UI, and human-in-the-loop with LangGraph Cloud + assistant-ui.”

Best for:

  • Teams building sophisticated AI agents that go beyond simple prompts.
  • Use cases with multiple tools, external APIs, and decision branches.
  • Scenarios where you need replay, debugging, or step-by-step control.

What you typically get:

  • Assistant-UI chat components wired to a LangGraph backend.
  • Streaming responses from your LangGraph agent.
  • A starting point that encourages good architecture for complex agents.

What you manage:

  • Your LangGraph graph: nodes, tools, and edges.
  • LLM selection and configuration (OpenAI, Anthropic, etc.).
  • Infrastructure for hosting LangGraph (self-hosted or cloud).

Choose langgraph if your main complexity lives in the agent logic and workflow, and you want Assistant-UI to handle the chat UX while LangGraph powers the brains.
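The "nodes, tools, and edges" idea can be sketched with a toy graph runner. To be clear, this is not the LangGraph API; it is a self-contained illustration of the control-flow model (nodes transform state, edges pick the next node) that LangGraph formalizes with streaming, checkpoints, and human-in-the-loop on top.

```typescript
// Toy model of a graph-based agent: nodes transform state, edges route.
// Illustrative only -- LangGraph's real API differs.
type State = { messages: string[] };
type GraphNode = (state: State) => State;

const nodes: Record<string, GraphNode> = {
  agent: (s) => ({ messages: [...s.messages, "agent: deciding next step"] }),
  tool: (s) => ({ messages: [...s.messages, "tool: fetched data"] }),
  respond: (s) => ({ messages: [...s.messages, "agent: final answer"] }),
};

// After each node, which node runs next ("END" stops the run).
const edges: Record<string, string> = {
  agent: "tool",
  tool: "respond",
  respond: "END",
};

function runGraph(start: string, state: State): State {
  let current = start;
  while (current !== "END") {
    state = nodes[current](state);
    current = edges[current];
  }
  return state;
}

const result = runGraph("agent", { messages: ["user: question"] });
```

In a real LangGraph backend the edges can be conditional (branch on the model's output), which is where the workflow complexity this template targets actually lives; Assistant-UI then just renders the streamed state.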


5. mcp template

The mcp template is built for the Model Context Protocol (MCP) ecosystem. MCP lets you create and share tools (servers) that models can interact with consistently across different clients.

Using Assistant-UI with MCP is powerful when you want:

  • A chat UI that acts as a client to MCP servers (tools).
  • Reusable tools across multiple AI frontends and platforms.
  • A clean protocol-based separation between UI and capabilities.

Best for:

  • Advanced AI teams betting on MCP as a standard tooling interface.
  • Apps that orchestrate many tools: databases, file systems, APIs, search, etc.
  • Developers who want their tools to work across different LLM providers and UIs.

What you typically get:

  • Assistant-UI chat components wired to MCP tooling.
  • A reference setup showing how to connect MCP servers to your assistant.
  • A structured pattern for adding more tools over time.

What you manage:

  • MCP server configuration, hosting, and security.
  • How different tools are surfaced to users in the UI.
  • Model selection and routing across tools when necessary.
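For reference, many MCP hosts are configured with a JSON file listing the servers to launch over stdio. The exact file name and schema depend on the host you use, but the commonly seen shape looks like the following, shown with the reference filesystem server from the MCP project (the path is a placeholder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/data"]
    }
  }
}
```

Because the server is just a process speaking the protocol, the same entry can be reused by any MCP-capable client, which is the interoperability argument for this template.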

Choose mcp if you care about tool interoperability and want a modern, protocol-based approach to building assistants.


How to choose based on your use case

Below are common scenarios and which template fits best.

You’re building a simple chatbot MVP

  • Goal: Demo or MVP for stakeholders, quick iteration.
  • Recommendation:
    • Start with minimal if you have a simple backend already or want to experiment.
    • Start with cloud if you want persistent sessions and less backend friction.

You’re launching a production SaaS product

  • Goal: User logins, saved conversations, possibly billing later.
  • Recommendation:
    • cloud-clerk is the best fit: auth + cloud thread storage + the Assistant-UI chat components.
    • You can still integrate LangChain or other backends behind the scenes.

You’re building complex agents with tools and workflows

  • Goal: Multi-step reasoning, tools, external APIs, human-in-the-loop.
  • Recommendation:
    • langgraph template, especially if you already use LangChain.
    • Combine with Assistant UI Cloud or your own backend as needed.

You’re standardizing on MCP tools

  • Goal: Reusable tools, consistent protocol, many integrations.
  • Recommendation:
    • mcp template to make Assistant-UI a first-class MCP client.
    • Good for teams betting on MCP for long-term architecture.

You want to avoid any external cloud dependencies

  • Goal: Fully self-hosted or highly regulated environment.
  • Recommendation:
    • Start with minimal or langgraph, depending on your agent stack.
    • Implement your own storage or use your own infrastructure for persistence.

Future-proofing your choice

The template you start from doesn’t lock you in forever. Assistant-UI is a React toolkit, so you can:

  • Migrate from minimal to cloud later if you decide you want managed thread storage.
  • Upgrade from cloud to cloud-clerk once you add user auth to your product.
  • Combine cloud or cloud-clerk with langgraph or mcp if your agent logic becomes more complex over time.

When in doubt:

  1. Early prototype, full control: minimal
  2. Production UX, fast shipping: cloud
  3. Production SaaS, auth + history: cloud-clerk
  4. Complex agents and workflows: langgraph
  5. Tool-heavy, protocol-first architecture: mcp

Pick the one that matches where your product is today, knowing you can evolve the stack as your AI experience grows.