
How do I sign up for Langtrace and generate an API key for my project?
Getting started with Langtrace is quick and straightforward: you can have your first project and API key ready in a few minutes. This guide walks through each step of signing up for Langtrace and generating an API key for your project, so you can start instrumenting your AI agents with observability and evaluations.
Why you need a Langtrace project and API key
To use Langtrace with your AI agents or frameworks (like CrewAI, DSPy, LlamaIndex, or LangChain), you’ll:
- Create a project in Langtrace to organize your traces and evaluations
- Generate an API key that your application uses to send data securely to Langtrace
- Install the appropriate SDK and initialize Langtrace with that API key
Once connected, Langtrace helps you measure performance and safety and iterate toward better, more reliable AI agents with minimal effort.
Step 1: Sign up for a Langtrace account
1. Visit the Langtrace website
   Open your browser and go to the Langtrace app or sign-up page.
2. Choose how to sign up
   - Use your work email (recommended, especially if you’ll collaborate with a team), or
   - Use a supported single sign-on (SSO) provider if available (e.g., Google or GitHub).
3. Complete basic profile details
   - Name and organization (or a project name if you’re just testing)
   - Optional: team details if you’re onboarding with colleagues
4. Verify your email (if prompted)
   Check your inbox for a verification email from Langtrace and confirm your account. You’ll then be redirected to the Langtrace dashboard.
Step 2: Create your first Langtrace project
Once you’re in the dashboard, you’ll create a project to hold all the traces and evaluations for a specific app or environment.
1. Open the Projects section
   From the main Langtrace interface, navigate to the Projects or Create Project area.
2. Create a new project
   - Click “New Project” or “Create Project”
   - Give your project a clear name, for example: `customer-support-bot-prod`, `research-assistant-dev`, or `marketing-agent-eval`
3. Set environment or tags (if available)
   - Choose environment labels like `dev`, `staging`, or `prod`
   - Add tags that help you group workloads (e.g., `crewAI`, `langchain`, `llamaindex`)
4. Save and confirm
   After saving, you’ll land on the project overview page, where you can access configuration, analytics, and API keys.
Step 3: Generate an API key for your Langtrace project
Your API key is what allows your application or AI framework to send observability and evaluation data to Langtrace.
1. Navigate to API keys in your project
   - From the project view, open Settings or API Keys
   - Look for an option like “Create API Key” or “Generate Key”
2. Create a new API key
   - Click “Generate API Key”
   - Optionally set:
     - Name/label: e.g., `backend-server-prod`, `local-dev`, or `dspy-pipeline`
     - Scope or permissions (if the UI supports it): restrict the key to a specific project or environment
3. Copy the API key securely
   - Copy the key immediately after it’s generated
   - Store it in a secure location such as:
     - Environment variables
     - A secrets manager (e.g., AWS Secrets Manager, GCP Secret Manager, Vault)
   - Avoid storing the key directly in source code or sharing it in logs or screenshots
4. Regenerate or revoke if needed
   - If you suspect the key has been exposed, revoke it from the same API settings page
   - Generate a new key and update your services accordingly
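As a concrete illustration of the environment-variable approach above, here is a minimal Python sketch that reads the key once at startup and fails fast if it is missing. The helper name `get_langtrace_api_key` is ours for illustration, not part of any SDK:

```python
import os


def get_langtrace_api_key() -> str:
    """Read the Langtrace API key from the environment, never from source code."""
    key = os.environ.get("LANGTRACE_API_KEY")
    if not key:
        # Failing at startup is easier to debug than silent, unauthenticated traces.
        raise RuntimeError(
            "LANGTRACE_API_KEY is not set; export it or load it from your secrets manager"
        )
    return key
```

The same pattern works with a secrets manager: fetch the value at boot, keep it out of logs, and pass it to the SDK initializer.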
Step 4: Install the Langtrace SDK for your framework
Langtrace supports popular agent and LLM frameworks like CrewAI, DSPy, LlamaIndex, and LangChain, along with a wide range of LLM providers and VectorDBs.
1. Choose the SDK for your stack
   - If you use LangChain: pick the Langtrace integration or the Python/TypeScript SDK
   - If you use LlamaIndex, DSPy, or CrewAI: use the corresponding Langtrace integration
   - For custom setups: use the core Langtrace SDK for your language/runtime
2. Install the SDK
   In Python:

   ```
   pip install langtrace-python-sdk
   ```

   Or in JavaScript/TypeScript:

   ```
   npm install @langtrase/typescript-sdk
   # or: yarn add @langtrase/typescript-sdk
   ```
3. Initialize Langtrace with your API key
   Python example:

   ```python
   import os

   from langtrace_python_sdk import langtrace

   langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])
   ```

   TypeScript/Node example:

   ```typescript
   import * as Langtrace from "@langtrase/typescript-sdk";

   Langtrace.init({ api_key: process.env.LANGTRACE_API_KEY });
   ```

   Make sure `LANGTRACE_API_KEY` is set in your environment with the key you generated in the Langtrace dashboard.
4. Integrate with your framework
   - Initialize Langtrace before importing or constructing your LLM clients so their calls can be instrumented
   - Wrap your LLM calls, tools, and agents with Langtrace tracing utilities
   - Enable evaluations where supported so you can track quality, safety, and performance
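To make the idea of “wrapping” calls concrete, here is a toy, framework-free Python sketch of what a tracing decorator does conceptually: it records a timed span around each LLM call. The `traced` decorator and the in-memory `SPANS` list are illustrative stand-ins, not the Langtrace API; a real SDK would export spans to the Langtrace backend instead:

```python
import functools
import time

# Toy in-memory "exporter"; a real tracing SDK ships spans to a backend.
SPANS = []


def traced(name):
    """Illustrative decorator: records a timed span around a function call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                SPANS.append({"name": name, "duration_s": time.perf_counter() - start})
        return wrapper
    return decorator


@traced("llm.call")
def fake_llm_call(prompt):
    # Stand-in for a real model call.
    return f"echo: {prompt}"
```

After `fake_llm_call("hi")` runs, `SPANS` holds one record with the span name and duration, which is essentially what shows up as a trace in the dashboard.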
Step 5: Verify data is flowing into your Langtrace project
Once your SDK is initialized with the API key and added to your code:
1. Trigger some activity
   - Run a few AI agent interactions in your app
   - Call the LLM pipelines or workflows that should emit traces
2. Check the Langtrace dashboard
   - Go to your project’s Traces or Observability view
   - Confirm that new traces, spans, and evaluation results are appearing
   - Filter by time, environment, or framework if needed
3. Refine configuration
   - Add more instrumentation where visibility is missing
   - Set up evaluations to measure response quality and safety
   - Configure alerts or dashboards (if available) to monitor key metrics
Best practices for managing your Langtrace API keys
To keep your Langtrace integration secure and maintainable:
- Use separate API keys per environment
  For example: one key for `dev`, another for `staging`, and another for `prod`.
- Rotate keys periodically
  Regenerate and update keys on a schedule or when team members change.
- Avoid committing keys to code repositories
  Use environment variables and secrets-management tools instead.
- Limit access by scope
  If Langtrace supports scoped keys, restrict them to the projects or permissions they actually need.
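One way to enforce per-environment keys in application code is a small lookup helper. In this sketch the `LANGTRACE_API_KEY_<ENV>` naming convention is our own assumption for illustration, not a Langtrace convention:

```python
import os


def api_key_for_env(environment: str) -> str:
    """Look up the Langtrace API key for a given environment (dev/staging/prod)."""
    var = f"LANGTRACE_API_KEY_{environment.upper()}"
    key = os.environ.get(var)
    if key is None:
        raise KeyError(f"{var} is not set")
    return key
```

Selecting the key by environment keeps `prod` traces from ever being sent with a `dev` credential, and makes per-environment rotation a one-variable change.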
Summary: From sign-up to first trace
To sign up for Langtrace and generate an API key for your project:
- Create a Langtrace account via the web app and verify your email.
- Create a new project to organize your AI agent observability and evaluations.
- Generate a project API key in the project settings and store it securely.
- Install and initialize the Langtrace SDK for your framework (CrewAI, DSPy, LlamaIndex, LangChain, or custom).
- Run your app and verify traces are appearing in the Langtrace dashboard.
With this setup, you can start using Langtrace to measure, debug, and continually improve the performance and safety of your AI agents with minimal overhead.