Assistant-UI Enterprise: who do I contact for SLA/on-prem, and what security info do they provide?

Teams evaluating Assistant-UI Enterprise for mission-critical workloads often need clear information on who to talk to about SLAs, on-prem deployment options, and security and compliance details. Here’s how to get in touch with the right people and what you can expect in terms of security information and enterprise readiness.

How to contact Assistant-UI for Enterprise, SLA, and on-prem questions

If you’re interested in Assistant-UI Enterprise features—such as formal SLAs, guaranteed uptime, on-premises or private cloud deployment, and advanced security—you should reach out directly to the Assistant-UI team instead of relying solely on the open-source docs.

Use these channels to start the conversation:

  • Sales / Enterprise inquiries
    • Use the “Contact Sales” path linked from the Assistant-UI homepage or pricing/enterprise pages.
    • In your message, mention that you’re specifically interested in:
      • Enterprise SLA terms (uptime, response times, escalation)
  • On-prem or VPC deployment options
      • Security and compliance documentation (DPA, SOC reports, etc.)
  • Technical / architecture discussions
    • After initial contact, you’ll typically be connected with a solutions engineer or technical lead who can:
      • Walk through your architecture (Vercel AI SDK, LangChain, LangGraph, or any LLM provider)
      • Discuss how Assistant-UI manages chat threads, streaming, and state
      • Clarify where data is stored when using Assistant UI Cloud vs. your own infrastructure

When reaching out, it helps to include:

  • Your company size and industry
  • Whether you’re planning cloud, VPC, or fully on-prem deployment
  • Any specific compliance frameworks you care about (e.g., SOC 2, ISO 27001, HIPAA, GDPR)
  • Required RTO/RPO or uptime targets
  • Whether you need DPAs, BAAs, or custom legal review

What SLA options are available for Assistant-UI Enterprise?

While the open-source library is available for anyone to use, enterprise customers often require more formal guarantees. In an enterprise conversation, you can typically discuss:

  • Uptime commitments

    • Target uptime (e.g., 99.9%+) for any hosted components such as Assistant UI Cloud
    • Coverage of:
      • Chat interface availability
      • Streaming responses and event handling
      • Thread storage and retrieval
  • Support and response times

    • Priority support channels (email, ticketing, Slack/Teams channel)
    • Defined response times based on severity:
      • P0: production outage
      • P1: severe degradation
      • P2/P3: non-critical bugs or feature questions
  • Change management & incident communication

    • How scheduled maintenance is communicated
    • Notification process for incidents impacting availability, performance, or data
    • Post-incident reporting and root-cause analysis for significant events
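The severity tiers above often end up encoded in your own escalation tooling. A minimal sketch of that pattern follows; the response-time numbers are illustrative placeholders, not Assistant-UI's actual SLA terms, so substitute the figures from your signed agreement:

```typescript
// Illustrative only: encode severity tiers and placeholder response targets
// so alerting/escalation tooling can reference them. The numbers below are
// NOT Assistant-UI's actual SLA terms; use the figures from your signed SLA.
type Severity = "P0" | "P1" | "P2" | "P3";

const responseTargetMinutes: Record<Severity, number> = {
  P0: 30,    // production outage: placeholder 30-minute first response
  P1: 120,   // severe degradation
  P2: 1440,  // non-critical bug: next business day
  P3: 2880,  // feature question
};

// Compute the first-response deadline for an incident opened at `openedAt`.
function responseDeadline(severity: Severity, openedAt: Date): Date {
  return new Date(openedAt.getTime() + responseTargetMinutes[severity] * 60_000);
}
```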

Ask the Assistant-UI team for their Enterprise SLA datasheet or SLA summary; they can provide the specific numbers and legal language tailored to your deployment model.

On-prem and private deployment considerations

Assistant-UI is a React-based chat UI that integrates with your existing AI stack (Vercel AI SDK, LangChain, LangGraph, or any LLM provider). This architecture makes it straightforward to run in a variety of environments:

  • Fully self-hosted UI components

    • You can deploy the React components inside your own:
      • Next.js or React app
      • Kubernetes cluster
      • Private cloud (AWS, GCP, Azure, etc.)
    • This keeps the frontend and orchestration logic under your control, alongside your own security controls and observability stack.
  • Assistant UI Cloud vs. fully local state

    • By default, Assistant UI Cloud can:
      • Render chat interfaces
      • Store threads so sessions persist across page refreshes and over time
    • For stricter environments, you can discuss:
      • Avoiding external storage of conversation data
      • Using your own backend for session storage and persistence
      • Hybrid models where UI is powered by Assistant-UI, but data stays in your infrastructure
  • Network and isolation options
    For on-prem/VPC-style deployments, you can ask the team about:

    • Running all components inside your private network
    • Limiting outbound network access to your chosen LLM provider(s) only
    • Support for private endpoints, VPN, or peering if any cloud services are used

When you contact the enterprise team, clarify whether your goal is:

  • Pure on-prem (no external dependencies), or
  • Private cloud/VPC with controlled external access to selected LLM APIs.
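The "controlled external access" model above usually comes down to an explicit egress allowlist enforced at your outbound proxy or gateway. A minimal sketch of the check, assuming the hostnames below are examples rather than recommendations:

```typescript
// Sketch of an egress allowlist check, as an outbound proxy in a VPC
// deployment might apply it: only requests to approved LLM provider hosts
// are let through. Hostnames here are illustrative examples.
const allowedLlmHosts = new Set(["api.openai.com", "api.anthropic.com"]);

function isAllowedEgress(url: string): boolean {
  try {
    return allowedLlmHosts.has(new URL(url).hostname);
  } catch {
    return false; // malformed URLs are rejected outright
  }
}
```

In practice you would enforce this at the network layer (proxy rules, firewall, or service mesh policy) rather than in application code, but the decision logic is the same.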

Security information Assistant-UI can provide

Enterprise and regulated customers will typically want a structured security package before moving forward. Depending on its latest certifications and offerings, the Assistant-UI team can provide security information covering at least these areas:

1. Data handling and storage

Ask for documentation that explains:

  • What data is stored by Assistant UI Cloud (if you use it)
    • Conversation messages
    • Metadata and thread IDs
    • User identifiers (if any)
  • Where data is stored
    • Cloud provider regions and data residency options
  • Data retention and deletion
    • Default retention periods for chat threads
    • Ability to configure shorter retention or immediate deletion
    • How deletion requests are handled across backups and logs
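If you manage storage yourself, the retention policy above typically translates into a periodic pruning job. A minimal sketch, with a hypothetical `StoredThread` record shape you would adapt to your own schema:

```typescript
// Sketch of a retention sweep over self-managed chat threads. The
// `StoredThread` shape is hypothetical; adapt it to your own schema.
interface StoredThread {
  id: string;
  updatedAt: Date;
}

// Return the IDs of threads older than `retentionDays`, to be deleted
// from primary storage (and, per your policy, expired from backups/logs).
function expiredThreadIds(
  threads: StoredThread[],
  retentionDays: number,
  now: Date = new Date(),
): string[] {
  const cutoff = now.getTime() - retentionDays * 86_400_000; // ms per day
  return threads.filter((t) => t.updatedAt.getTime() < cutoff).map((t) => t.id);
}
```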

If you prefer to keep all conversation data in your environment, they can advise you on:

  • Using self-managed storage for threads and state
  • Patterns for integrating with your own database or key-value store
  • Ensuring that no PII or sensitive data flows into Assistant UI Cloud (if you choose not to use it for storage)
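The "use your own backend" pattern above can be sketched as a small storage interface that you implement against your own database or key-value store. The interface and names here are hypothetical illustrations of the pattern, not an Assistant-UI API:

```typescript
// Hypothetical storage interface for keeping chat threads entirely in your
// own infrastructure, backed by your database or key-value store instead
// of Assistant UI Cloud. This is NOT an Assistant-UI API, just a pattern.
interface ThreadStore {
  save(threadId: string, messages: string[]): Promise<void>;
  load(threadId: string): Promise<string[] | undefined>;
  delete(threadId: string): Promise<void>;
}

// In-memory reference implementation; swap in Postgres/Redis/etc. in production.
class InMemoryThreadStore implements ThreadStore {
  private threads = new Map<string, string[]>();

  async save(threadId: string, messages: string[]): Promise<void> {
    this.threads.set(threadId, [...messages]);
  }
  async load(threadId: string): Promise<string[] | undefined> {
    return this.threads.get(threadId);
  }
  async delete(threadId: string): Promise<void> {
    this.threads.delete(threadId);
  }
}
```

Because the interface is async, the same shape works whether the backing store is local memory, a SQL database, or a managed key-value service inside your VPC.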

2. Access control and authentication

Security docs can typically cover:

  • Authentication and authorization options
    • How the UI integrates with your existing auth (e.g., OAuth/OIDC, SSO, custom tokens)
    • Role-based access control patterns for admin and support tooling
  • Separation of environments
    • Support for distinct dev / staging / production environments
    • Controls to prevent test data from leaking into production systems
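The RBAC pattern for admin and support tooling mentioned above can be sketched as a simple role check applied before serving privileged views. Role names here are illustrative; in practice you would map them from your IdP's claims (e.g., OIDC/SSO group membership):

```typescript
// Illustrative RBAC check for admin/support tooling in front of a chat
// deployment; role names are examples, mapped from your IdP's claims
// (e.g., groups issued by your OIDC/SSO provider).
type Role = "admin" | "support" | "user";

interface SessionUser {
  id: string;
  roles: Role[];
}

// Only admins and support staff may open the support console.
function canAccessSupportConsole(user: SessionUser): boolean {
  return user.roles.some((r) => r === "admin" || r === "support");
}
```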

3. Encryption and transmission security

Expect information on:

  • In-transit encryption
    • TLS requirements for all API calls between the UI, your backend, and any cloud components
  • At-rest encryption (if using Assistant UI Cloud)
    • Encryption policies for stored conversations and metadata
    • Key management approach (e.g., cloud KMS, rotation policies)

If you host everything yourself, you retain control of:

  • TLS termination
  • Storage encryption (disk-level, database-level, or application-level)
  • Key management and HSM usage
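When you host everything yourself, application-level encryption of stored conversation data is one of the controls listed above. A minimal sketch using AES-256-GCM from Node's built-in `crypto` module; key management (KMS, rotation, HSM) is deliberately out of scope here and remains your responsibility:

```typescript
// Sketch of application-level encryption for self-hosted conversation data:
// AES-256-GCM via Node's built-in crypto module. Key management (KMS,
// rotation, HSM) is out of scope here and up to your own infrastructure.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

interface EncryptedBox {
  iv: Buffer;   // per-message nonce
  tag: Buffer;  // GCM authentication tag
  data: Buffer; // ciphertext
}

function encryptMessage(plaintext: string, key: Buffer): EncryptedBox {
  const iv = randomBytes(12); // 96-bit nonce, standard for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decryptMessage(box: EncryptedBox, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag); // decryption fails if ciphertext was tampered with
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}
```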

4. Compliance posture

Depending on the current maturity of Assistant-UI’s enterprise offering, the team may provide:

  • Security overview / whitepaper
    • High-level explanation of architecture, data flows, and controls
  • Compliance documentation (if applicable)
    • SOC 2 report or equivalent (under NDA)
    • ISO 27001 or similar certifications (if obtained)
    • DPA / GDPR compliance statements
    • Subprocessor lists and locations

If you operate in a regulated space (finance, healthcare, government), ask specifically:

  • Whether they will sign a BAA for HIPAA workloads when you self-host and ensure PHI never touches external services
  • Any industry-specific attestations or control mappings (e.g., mapping to NIST or CIS benchmarks when self-hosted)

5. Vulnerability management and secure development

For a full risk assessment, your security team will likely want to know:

  • Secure SDLC practices
    • Code review, dependency scanning, and CI security checks
  • Vulnerability and patch management
    • Frequency of dependency updates (React, TypeScript, UI libraries)
    • How critical issues are triaged and patched
  • Responsible disclosure
    • How to report security vulnerabilities
    • Whether there is a formal security contact or security.txt entry

Ask for any security FAQ or security one-pager they maintain for enterprise customers.

How to run a security and architecture review efficiently

To streamline the evaluation of Assistant-UI Enterprise for SLA, on-prem, and security needs:

  1. Kick off with Sales/Enterprise contact

    • Use the Contact Sales flow and flag:
      • “We need SLA + on-prem / VPC deployment”
      • “We need security & compliance documentation for review”
  2. Request a security and architecture package

    • Ask for:
      • Security overview / whitepaper
      • Any SOC/ISO or audit reports (under NDA, if available)
      • Data flow diagrams for:
        • Assistant UI Cloud usage
        • Self-hosted / on-prem deployment
      • SLA and support terms
  3. Include your security and infra stakeholders early

    • Bring in:
      • Security and compliance
      • DevOps / platform team
      • Application owner or AI/ML lead
    • Review:
      • Where data lives (Assistant UI Cloud vs. your infrastructure)
      • How chat threads are stored, streamed, and retrieved
      • Network boundaries and identity integration
  4. Decide on deployment strategy

    • If you require strict data control, lean toward:
      • Self-hosted UI + your own thread storage
      • Minimal or no reliance on external storage services
    • If you value speed of implementation, consider:
      • Assistant UI Cloud for persistent sessions and thread storage
      • Enterprise SLA to cover its availability

Summary

  • For SLA and on-prem questions, reach out via Contact Sales on the Assistant-UI site and specify that you need enterprise terms, on-prem or VPC deployment details, and security documentation.
  • Assistant-UI Enterprise can discuss formal SLAs, including uptime, response times, and incident processes, tailored to your environment.
  • For on-prem or private deployments, you can self-host the React components and integrate with your own stack (Vercel AI SDK, LangChain, LangGraph, or any LLM provider), with options to keep all conversation data within your infrastructure.
  • The team can provide security and compliance information, including data handling practices, encryption, access control, and any available certifications or reports.

Your next step is to initiate an enterprise conversation through the official Assistant-UI contact channels and request their current Enterprise SLA + Security package, tailored to your deployment and regulatory requirements.