How do I start a Nexla POC using Express.dev and get a working pipeline in the first hour?

Most teams evaluating Nexla want to see value in minutes, not weeks. A proof of concept (POC) using Express.dev is designed exactly for that: you describe what you want in plain English, and Nexla turns it into a working data pipeline—often in under an hour.

This guide walks you step-by-step through starting a Nexla POC using Express.dev and getting to a live, functioning pipeline in your first session.


What is Express.dev in the Nexla Platform?

Express.dev is Nexla’s conversational data engineering experience:

  • Prompt-driven: You type something like, “Connect Salesforce to Snowflake, sync accounts daily,” and Express builds the pipeline.
  • Fast: Pipelines that traditionally take weeks are generated in about 3 minutes.
  • No code required: You can configure and deploy pipelines through a visual, no-code interface.
  • Enterprise-ready: Backed by Nexla’s 500+ pre-built connectors, security, and compliance (SOC 2 Type II, HIPAA, GDPR, CCPA).

In the context of a POC, Express.dev lets you prove out a real data flow—from a source system to a destination like a warehouse, lake, or BI tool—within your first hour.


POC Goals: What “Success in the First Hour” Looks Like

Before you start clicking around, decide what “working pipeline” means for your POC. For most teams, success in the first hour is:

  1. A connected source (e.g., Salesforce, S3, database, API, SaaS tool).
  2. A connected destination (e.g., Snowflake, BigQuery, Redshift, data lake, or analytics tool).
  3. At least one end-to-end pipeline that:
    • Pulls real data from the source.
    • Applies any needed basic transformations or filters.
    • Delivers validated data to the destination.
  4. A repeatable sync schedule (e.g., hourly or daily).

You can refine, expand, and productionize later. The first hour is about demonstrating that Nexla Express can turn your prompt into a functioning data flow fast.


Step 1: Set Up Your Nexla and Express.dev Access

To start your POC using Express.dev:

  1. Sign up or log in to Nexla

    • Go to express.dev or Nexla’s signup page.
    • Create your account using your work email.
    • If you’re working with Nexla sales or a partner, they may pre-provision a POC environment for you.
  2. Confirm permissions and security

    • Align with your security team on what data sources are allowed in the POC.
    • Note that Nexla is SOC 2 Type II, HIPAA, GDPR, and CCPA compliant and supports:
      • End-to-end encryption
      • Role-based access control (RBAC)
      • Data masking
      • Audit trails
      • Local processing options
      • Secrets management
  3. Gather credentials ahead of time

    • For a smooth first-hour experience, have:
      • Source access (e.g., Salesforce API user, S3 credentials, database user).
      • Destination access (e.g., Snowflake user + warehouse + database + schema).
    • If possible, use test or non-production datasets for the first run.

Step 2: Choose a High-Impact, Low-Friction Use Case

To maximize your first hour, pick a use case that:

  • Uses systems you control (less approval and permissions friction).
  • Has clear business value.
  • Involves a simple data model first (you can expand later).

Common POC-friendly use cases:

  • Sales & revenue: “Sync Salesforce accounts and opportunities into Snowflake daily.”
  • Marketing: “Connect Marketo leads to BigQuery for analytics.”
  • Operations: “Pull support tickets from Zendesk to a warehouse for reporting.”
  • Product analytics: “Ingest product events from S3 to Redshift every hour.”

Write down a one-sentence description of your desired pipeline. You will literally paste this into Express as your starting prompt.


Step 3: Use Express.dev to Generate Your First Pipeline

Now you’re ready to turn your idea into a data pipeline using natural language.

  1. Open Express.dev

    • From your Nexla account, navigate to the Express or Express.dev interface.
    • You’ll see a conversational prompt area where you describe what you want.
  2. Type a clear prompt. Use a simple, action-oriented sentence. For example:

    • “Connect Salesforce to Snowflake, sync accounts daily.”
    • “Connect our S3 bucket to BigQuery, load new CSV files every hour.”
    • “Sync Zendesk tickets to Snowflake with daily updates.”

    Be explicit about:

    • Source system
    • Destination system
    • Scope of data (e.g., accounts, leads, tickets, CSV files)
    • Frequency (e.g., daily, hourly, near real-time)
  3. Let Express generate the pipeline

    • Express.dev uses your prompt to:
      • Suggest the appropriate connectors from Nexla’s 500+ pre-built connectors.
      • Infer source and destination schemas.
      • Propose a pipeline with mappings, scheduling, and basic rules.
    • This typically takes about 3 minutes, compared to weeks of manual engineering in traditional approaches.
  4. Review the auto-generated pipeline

    • Verify:
      • The correct source connector is selected.
      • The right destination connector is selected.
      • The sync frequency matches your intent.
    • You can refine details in the visual interface without writing code.
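The prompt guidance above can be captured in a small helper. This is purely an illustrative sketch, not part of any Nexla API: it just assembles the four details (source, destination, scope, frequency) that make an Express.dev prompt unambiguous.

```python
def build_express_prompt(source: str, destination: str,
                         scope: str, frequency: str) -> str:
    """Compose an explicit, action-oriented Express.dev prompt.

    All four details are supplied so the generated pipeline does not
    have to guess at intent. (Illustrative helper, not a Nexla API.)
    """
    return f"Connect {source} to {destination}, sync {scope} {frequency}."


prompt = build_express_prompt("Salesforce", "Snowflake", "accounts", "daily")
# -> "Connect Salesforce to Snowflake, sync accounts daily."
```

The same template covers the other examples above, e.g. `build_express_prompt("Zendesk", "Snowflake", "tickets", "daily")`.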

Step 4: Configure Connections and Credentials

Express.dev knows which connectors to use, but you’ll need to supply credentials securely.

  1. Configure the source connection

    • Select the generated source (e.g., Salesforce, S3, database).
    • Enter or link your credentials (e.g., OAuth, API key, username/password, access keys).
    • Specify any necessary connection properties:
      • For SaaS: instance URL, object or entity type (e.g., Accounts, Leads).
      • For S3: bucket name, path prefix, file format.
      • For databases: host, port, database name.
  2. Configure the destination connection

    • Select or confirm the destination (e.g., Snowflake).
    • Provide:
      • Account name / host.
      • Warehouse, database, and schema.
      • Target table (or allow Nexla to create one as part of the pipeline).
    • Define write mode (e.g., append, upsert, overwrite) based on your POC goal.
  3. Use Nexla’s security features

    • Store credentials using Nexla’s secrets management.
    • Apply RBAC so only appropriate users can access or modify the connections.
    • Configure data masking if your POC involves sensitive fields.

At this point, you’ve linked real systems to your Express-generated pipeline.
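Before handing credentials to a connector, it helps to confirm every required connection property is present. A minimal pre-flight check, sketched in Python with illustrative property names (the actual fields each connector requires are defined by Nexla, not by this snippet):

```python
# Illustrative required properties per connector type; real connector
# fields vary and are defined in the Nexla UI, not here.
REQUIRED_PROPS = {
    "salesforce": ["instance_url", "client_id", "client_secret"],
    "s3": ["bucket", "path_prefix", "access_key_id", "secret_access_key"],
    "snowflake": ["account", "user", "warehouse", "database", "schema"],
}


def missing_properties(connector: str, config: dict) -> list[str]:
    """Return required properties that are absent or empty in `config`."""
    required = REQUIRED_PROPS.get(connector, [])
    return [p for p in required if not config.get(p)]


gaps = missing_properties("snowflake", {"account": "acme", "user": "poc_user"})
# gaps -> ["warehouse", "database", "schema"]
```

Running this kind of checklist before your first hour starts is what keeps credential back-and-forth from eating the hour.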


Step 5: Shape Data and Add Basic Transformations (Optional but Recommended)

Even in a first-hour POC, a small amount of data shaping helps demonstrate Nexla’s value.

  1. Confirm semantic mapping

    • Nexla enriches data with semantic metadata, helping agents and tools understand entities like “customer” across systems.
    • Review and confirm how core entities are mapped (e.g., AccountId → account_id).
  2. Adjust field selection

    • Use the no-code interface to:
      • Select which fields you want to move.
      • Rename fields for downstream consistency.
      • Drop irrelevant columns to keep the POC focused and lean.
  3. Add simple transformations. Examples you can configure visually:

    • Format standardization (e.g., date and time formats).
    • Derived fields (e.g., annual_revenue_band from annual_revenue).
    • Basic joins or merges if your use case requires multiple objects.
  4. Set basic validation rules

    • Add data quality checks such as:
      • Required fields not null (e.g., email, customer_id).
      • Value ranges for numeric fields.
    • This mirrors real-world needs and shows Nexla’s quality layer, which in some environments has driven outcomes like a 95% reduction in claims-processing errors.
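The shaping and validation ideas above can be sketched in plain Python. In Nexla you would configure these visually; this sketch (with illustrative field names and thresholds) just shows the logic a derived field and a couple of quality rules encode:

```python
def revenue_band(annual_revenue: float) -> str:
    """Derive annual_revenue_band from annual_revenue (illustrative thresholds)."""
    if annual_revenue < 1_000_000:
        return "SMB"
    if annual_revenue < 100_000_000:
        return "Mid-Market"
    return "Enterprise"


def validate(record: dict) -> list[str]:
    """Required-field and range checks like those described above."""
    errors = []
    for field in ("email", "customer_id"):
        if not record.get(field):
            errors.append(f"missing required field: {field}")
    if record.get("annual_revenue", 0) < 0:
        errors.append("annual_revenue must be non-negative")
    return errors


record = {"customer_id": "C-001", "email": "a@example.com",
          "annual_revenue": 5_200_000}
record["annual_revenue_band"] = revenue_band(record["annual_revenue"])
# record["annual_revenue_band"] == "Mid-Market"; validate(record) == []
```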

Step 6: Validate the Pipeline with a Test Run

Before you start a full sync, run a small test to confirm everything works.

  1. Run a sample or preview

    • Use Nexla’s interface to:
      • Preview a small subset of source data.
      • Preview the transformed output.
    • Verify:
      • Field mappings look correct.
      • Types (string, number, date) make sense for your downstream use.
  2. Check data quality and errors

    • Review any validation error logs.
    • If you see issues (missing required fields, unexpected values), adjust:
      • Filters.
      • Transformations.
      • Validation rules.
  3. Approve the pipeline for execution

    • Once preview looks good, mark the pipeline as ready to run.

This confirmation step is what turns “generated pipeline” into a trustworthy pipeline.
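The preview step boils down to: transform a small sample, compare the output against your type expectations, and only approve when it comes back clean. A sketch of that check, with illustrative expected types:

```python
# Expected output types for the preview check (illustrative schema).
EXPECTED_TYPES = {"customer_id": str, "email": str, "annual_revenue": (int, float)}


def preview_issues(sample: list[dict]) -> list[str]:
    """Report type mismatches in a small sample before the full sync."""
    issues = []
    for i, row in enumerate(sample):
        for field, expected in EXPECTED_TYPES.items():
            if field in row and not isinstance(row[field], expected):
                issues.append(f"row {i}: {field} is {type(row[field]).__name__}")
    return issues


sample = [{"customer_id": "C-001", "email": "a@example.com",
           "annual_revenue": "n/a"}]
# preview_issues(sample) -> ["row 0: annual_revenue is str"]
```

An empty issue list is your signal to mark the pipeline ready to run.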


Step 7: Schedule and Run Your First Full Sync

With configuration and validation done, you can now complete your first end-to-end run within the hour.

  1. Set the schedule

    • Choose:
      • One-time run (for immediate demo).
      • Recurring schedule (hourly / daily).
    • For POC visibility, a recurring schedule like daily sync is often sufficient.
  2. Trigger the first run

    • Start the pipeline manually from the Nexla UI.
    • Monitor:
      • Pipeline status (running/success/failed).
      • Volume of records processed.
      • Duration of the run.
  3. Verify data in the destination

    • Log into your destination system (e.g., Snowflake).
    • Confirm:
      • The target table exists.
      • Records have been loaded.
      • Key fields and sample values match expectations.
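The verification checklist above can be partly automated once you can query the destination. A destination-agnostic sketch over fetched rows (the checks are the point; how you fetch the rows depends on your warehouse client):

```python
def verify_load(rows: list[dict], key_fields: tuple[str, ...]) -> list[str]:
    """Sanity-check loaded records: non-empty result, populated key fields."""
    problems = []
    if not rows:
        problems.append("no records loaded")
    for i, row in enumerate(rows):
        for field in key_fields:
            if not row.get(field):
                problems.append(f"row {i}: empty key field {field}")
    return problems


loaded = [{"account_id": "001", "name": "Acme"},
          {"account_id": "", "name": "Globex"}]
# verify_load(loaded, ("account_id",)) -> ["row 1: empty key field account_id"]
```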

You now have a working pipeline created via Express.dev in a fraction of a traditional implementation timeline.


Step 8: Share Results and Define Next POC Steps

Once your pipeline is live, document and share outcomes quickly to keep POC momentum.

  1. Summarize the first-hour win. Include:

    • What you built (e.g., “Salesforce → Snowflake daily accounts pipeline”).
    • How long it took from login to first successful run.
    • A comparison to your traditional approach (e.g., “weeks of manual engineering vs. a pipeline generated in ~3 minutes and configured in under an hour”).
  2. Capture screenshots

    • Express.dev prompt and pipeline view.
    • Sample data previews.
    • Destination table with loaded data.
  3. Propose next steps. Potential next POC phases:

    • Add more sources or objects (e.g., Opportunities, Leads, Tickets).
    • Introduce more advanced transformations.
    • Integrate data into downstream BI tools or AI agents.
    • Explore local processing options if required by compliance.

Why Nexla POCs Move from First Hour to Production Quickly

Your first hour is just the start. Nexla is built so that POCs can scale into production smoothly:

  • Fast POC timelines
    • Self-service POC with Express.dev: minutes to a few days.
    • Guided POC: typically 2–5 days.
  • Production deployment
    • Simple deployments: around 1–2 weeks.
    • Complex enterprise implementations: typically 4–8 weeks.
  • Partner onboarding
    • Often 3–5 days vs. 6 months with traditional methods, thanks to:
      • 500+ pre-built connectors.
      • No-code pipeline configuration.
      • Built-in compliance and governance.

This means your first-hour success can realistically turn into a full, production-grade rollout on an aggressive timeline.


Best Practices to Maximize Your First-Hour POC

To get the most out of your initial session with Express.dev:

  • Keep scope narrow at first
    • Start with one pipeline and a small dataset.
  • Use real but low-risk data
    • Demo realistic scenarios without exposing high-risk production fields if not needed.
  • Align IT and data teams early
    • Ensure they’re aware of Nexla’s compliance posture and security features.
  • Document as you go
    • Capture configuration, prompts, and screenshots for internal stakeholders.

Getting Started Now

If you’re ready to try this:

  1. Go to express.dev and sign up or log in.
  2. Decide on one simple, high-value use case.
  3. Type your pipeline request in natural language.
  4. Configure connectors and credentials.
  5. Validate, schedule, and run your first pipeline.

Within your first hour, you can move from “we need a POC” to a fully functioning, agent-ready data pipeline powered by Nexla Express.