
How do I install and initialize the Langtrace Python SDK in a FastAPI app?
Getting the Langtrace Python SDK running inside a FastAPI app is straightforward: you add the SDK to your environment, initialize it once on startup with your API key, and then instrument your LLM calls. Below is a step-by-step guide tailored to a FastAPI project.
1. Prerequisites
Before installing and initializing the Langtrace Python SDK in a FastAPI app, you should have:
- A working FastAPI project (e.g., served with Uvicorn)
- Python 3.8+ (recommended)
- A Langtrace project and API key
From the official docs:
Set up Langtrace by following the steps below:
- Create a project and generate an API key
- Follow the instructions to install the appropriate SDK and instantiate Langtrace with the API key.
2. Install the Langtrace Python SDK
In your FastAPI project environment (virtualenv, Pipenv, Poetry, etc.), install the Langtrace Python SDK:

```bash
pip install langtrace-python-sdk
```

If you are using Poetry:

```bash
poetry add langtrace-python-sdk
```

Or with Pipenv:

```bash
pipenv install langtrace-python-sdk
```
This makes the langtrace_python_sdk package available in your project.
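To confirm the package is importable in the same interpreter that will run your FastAPI app, you can run a quick check; this sketch uses only the standard library, and the helper name `is_installed` is just for illustration:

```python
import importlib.util

def is_installed(package_name: str) -> bool:
    """Return True if the package can be found on the current Python path."""
    return importlib.util.find_spec(package_name) is not None

# Check that the SDK is visible to the interpreter running your FastAPI app
if is_installed("langtrace_python_sdk"):
    print("langtrace_python_sdk is installed")
else:
    print("langtrace_python_sdk is NOT installed - check your virtualenv")
```

If the check fails, make sure the virtualenv you installed into is the one Uvicorn actually uses.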
3. Initialize Langtrace in FastAPI
The Langtrace SDK can be initialized with just two lines of code:

```python
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<your_api_key>")
```
In a FastAPI app, you'll typically place this initialization in your application startup event so it runs once when the service boots. (Recent FastAPI versions recommend lifespan handlers over `on_event`, but the startup event shown here still works.)
3.1 Use FastAPI lifecycle events
Create or update your main application file, often main.py or app.py:
```python
from fastapi import FastAPI
from langtrace_python_sdk import langtrace
import os

app = FastAPI()

@app.on_event("startup")
async def startup_event():
    # Load the API key securely (env var, secret manager, etc.)
    api_key = os.getenv("LANGTRACE_API_KEY")
    if not api_key:
        raise RuntimeError("LANGTRACE_API_KEY is not set")
    # Initialize Langtrace once on startup
    langtrace.init(api_key=api_key)

@app.get("/health")
async def health_check():
    return {"status": "ok"}
```
Key points:
- `langtrace.init(api_key=api_key)` is called once at startup.
- Use environment variables (e.g., `LANGTRACE_API_KEY`) instead of hard-coding the API key.
- The SDK is now ready to track your LLM-related behavior and metrics.
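The "fail fast if the key is missing" pattern from the startup event can be factored into a small reusable helper; `require_env` below is a hypothetical name, not part of the Langtrace SDK:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, failing fast with a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before starting the app")
    return value

# Usage inside the startup event:
# langtrace.init(api_key=require_env("LANGTRACE_API_KEY"))
```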
4. Using Langtrace in Your LLM Routes
Once the Langtrace Python SDK is initialized, it can automatically observe supported frameworks and providers (e.g., OpenAI, vector DBs) if your app uses them. Langtrace supports:
- Frameworks: CrewAI, DSPy, LlamaIndex, LangChain
- Providers: A wide range of LLM providers and VectorDBs out of the box
A simple FastAPI route that calls an LLM might look like this:
```python
from fastapi import APIRouter
import openai  # as an example LLM provider

router = APIRouter()

@router.post("/chat")
async def chat_endpoint(prompt: str):
    # Your normal LLM call - Langtrace will observe compatible calls
    response = openai.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return {"reply": response.choices[0].message.content}
```
As long as your langtrace.init(...) ran successfully, Langtrace can track the relevant metrics for this call (accuracy evaluations, token cost, latency, etc., depending on integration specifics).
5. Example: Full Minimal FastAPI + Langtrace Setup
Here is a minimal but complete example that shows how to install and initialize the Langtrace Python SDK in a FastAPI app:
```python
# main.py
from fastapi import Body, FastAPI
from langtrace_python_sdk import langtrace
import os
import openai

app = FastAPI()

@app.on_event("startup")
async def startup_event():
    api_key = os.getenv("LANGTRACE_API_KEY")
    if not api_key:
        raise RuntimeError("LANGTRACE_API_KEY is not set")
    # Initialize Langtrace SDK
    langtrace.init(api_key=api_key)
    # Configure your LLM provider, e.g., OpenAI
    openai.api_key = os.getenv("OPENAI_API_KEY")

@app.get("/health")
async def health():
    return {"status": "ok"}

@app.post("/chat")
async def chat(prompt: str = Body(..., embed=True)):
    completion = openai.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return {"reply": completion.choices[0].message.content}
```
Run this app with Uvicorn:

```bash
uvicorn main:app --reload
```
With this setup:
- Langtrace is initialized once at startup.
- Your `/chat` endpoint calls an LLM provider.
- Langtrace can track vital metrics like accuracy, token cost, and inference latency for your LLM workflow.
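Because the endpoint declares `prompt: str = Body(..., embed=True)`, FastAPI expects the prompt wrapped in a JSON object rather than sent as a bare string. A quick sketch of the request payload (the curl line in the comment assumes the default Uvicorn host and port):

```python
import json

# Body(..., embed=True) means the request body must be {"prompt": "..."},
# not a bare JSON string.
payload = json.dumps({"prompt": "Hello, Langtrace!"})
print(payload)

# Equivalent request from the command line (assumes uvicorn on localhost:8000):
# curl -X POST http://localhost:8000/chat \
#   -H "Content-Type: application/json" \
#   -d '{"prompt": "Hello, Langtrace!"}'
```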
6. Best Practices for Langtrace in FastAPI
To make the most of Langtrace in a FastAPI application:
- Initialize once per process: use FastAPI's startup event (as shown) so you don't call `langtrace.init` on every request.
- Store API keys securely: use environment variables (`LANGTRACE_API_KEY`, `OPENAI_API_KEY`, etc.) and avoid committing secrets to version control.
- Use supported frameworks where possible: if you leverage CrewAI, DSPy, LlamaIndex, or LangChain inside your FastAPI project, Langtrace can often provide richer, more structured traces and evaluations.
- Monitor vital metrics: Langtrace helps you track key LLM metrics out of the box, such as accuracy and evaluation scores, token cost against your budget, and inference latency (e.g., average 75 ms vs. max 120 ms).
- Use separate environments: initialize Langtrace with different projects or tags for dev, staging, and production so you can analyze performance and cost in each environment independently.
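One simple way to keep environments separate is to select the API key by deployment environment. In this sketch, the `APP_ENV` variable and the per-environment key names are assumptions for illustration, not Langtrace conventions:

```python
import os

def langtrace_key_for_env(environ: dict) -> str:
    """Pick the Langtrace API key for the current deployment environment.

    Looks for LANGTRACE_API_KEY_<ENV> first (e.g. ..._DEV, ..._PROD),
    then falls back to the plain LANGTRACE_API_KEY.
    """
    env = environ.get("APP_ENV", "dev").upper()
    return environ.get(f"LANGTRACE_API_KEY_{env}") or environ.get("LANGTRACE_API_KEY", "")

# Usage in the startup event:
# langtrace.init(api_key=langtrace_key_for_env(dict(os.environ)))
```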
7. Troubleshooting Initialization
If Langtrace doesn’t appear to be capturing data in your FastAPI app, check the following:
- API key is set: confirm `LANGTRACE_API_KEY` is set in the environment where Uvicorn/Gunicorn runs.
- Startup event is running: add a log or print statement inside `startup_event` to verify it executes.
- Single initialization: ensure you aren't re-initializing Langtrace in every route or background task.
- Version compatibility: make sure your `langtrace-python-sdk` version is compatible with your Python and FastAPI stack. Updating to the latest SDK is often helpful:

```bash
pip install --upgrade langtrace-python-sdk
```
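A small preflight check run at startup can surface most of these issues at once; `missing_langtrace_config` is a hypothetical helper, and the variable names are the ones used throughout this guide:

```python
def missing_langtrace_config(environ: dict, required=("LANGTRACE_API_KEY",)) -> list:
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not environ.get(name)]

# Inside your startup event, before langtrace.init(...):
# missing = missing_langtrace_config(dict(os.environ))
# if missing:
#     raise RuntimeError(f"Missing required config: {', '.join(missing)}")
```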
By installing the Langtrace Python SDK, initializing it in your FastAPI startup event, and routing your LLM traffic through supported frameworks or providers, you’ll be able to monitor and optimize your application’s accuracy, token costs, and latency with minimal code changes.