🔬 Lab 1: AI Instrumentation with OpenLLMetry
Duration: ~15 minutes
In this lab, you'll add OpenLLMetry instrumentation to the sample RAG application to send traces to Dynatrace.
🎯 Learning Objectives
- Understand how OpenLLMetry works with OpenTelemetry
- Add Traceloop instrumentation to a Python AI application
- Configure the exporter for Dynatrace OTLP ingestion
- Verify traces are being sent to Dynatrace
Step 1: Install OpenLLMetry Dependencies
First, we need to add the Traceloop SDK and the OpenTelemetry exporter packages.
1.1 Update requirements.txt
Open app/requirements.txt and uncomment the following lines (remove the leading #):

```text
# Before (commented out):
# traceloop-sdk==0.50.1
# opentelemetry-exporter-otlp==1.39.1

# After (uncommented):
traceloop-sdk==0.50.1
opentelemetry-exporter-otlp==1.39.1
```
1.2 Install the Dependencies
In the terminal, run:

```shell
pip install traceloop-sdk opentelemetry-exporter-otlp opentelemetry-instrumentation-fastapi
```

Expected output:

```text
Successfully installed traceloop-sdk-0.50.1 opentelemetry-exporter-otlp-1.39.1 ...
```
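As an optional sanity check, the small sketch below (not part of the lab code) reports whether the two pinned packages from requirements.txt are installed at the expected versions:

```python
from importlib.metadata import version, PackageNotFoundError

def check(pkg: str, expected: str) -> str:
    """Report whether a pinned dependency is installed at the expected version."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        return f"{pkg}: NOT INSTALLED - rerun the pip install command"
    status = "OK" if installed == expected else f"expected {expected}"
    return f"{pkg}: {installed} ({status})"

for pkg, pinned in [("traceloop-sdk", "0.50.1"),
                    ("opentelemetry-exporter-otlp", "1.39.1")]:
    print(check(pkg, pinned))
```

If either line reports NOT INSTALLED, repeat the pip command above before continuing.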
Step 2: Add Dynatrace Instrumentation
Now we'll add the instrumentation code to our application.
2.1 Open the Main Application File
Open app/main.py in VS Code.
2.2 Locate the Instrumentation Section
Find this comment block near the top of the file:

```python
# ┌──────────────────────────────────────────────────────────────────────┐
# │ 🔬 LAB 1: INSTRUMENTATION SECTION                                    │
# │                                                                      │
# │ TODO: Add Dynatrace OpenLLMetry instrumentation here                 │
# │ Follow the instructions in the workshop guide to add the             │
# │ Traceloop initialization code below this comment block.              │
# │                                                                      │
# └──────────────────────────────────────────────────────────────────────┘
# ---> ADD YOUR INSTRUMENTATION CODE HERE <---
```
2.3 Add the Instrumentation Code
Replace `# ---> ADD YOUR INSTRUMENTATION CODE HERE <---` with the following code:

```python
from traceloop.sdk import Traceloop

# Get Dynatrace configuration from environment
ATTENDEE_ID = os.getenv("ATTENDEE_ID", "workshop-attendee")
DT_ENDPOINT = os.getenv("DT_ENDPOINT")
DT_API_TOKEN = os.getenv("DT_API_TOKEN")

# ⚠️ IMPORTANT: Dynatrace requires Delta temporality for metrics.
# This MUST be set before Traceloop.init()
os.environ["OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE"] = "delta"

# Initialize Traceloop with Dynatrace endpoint
if DT_ENDPOINT and DT_API_TOKEN:
    headers = {"Authorization": f"Api-Token {DT_API_TOKEN}"}
    Traceloop.init(
        app_name=f"ai-chat-service-{ATTENDEE_ID}",
        api_endpoint=DT_ENDPOINT,
        headers=headers
    )
    print("✅ Traceloop initialized - sending traces to Dynatrace")
    print(f"   Service Name: ai-chat-service-{ATTENDEE_ID}")
    print(f"   Endpoint: {DT_ENDPOINT}")
else:
    print("⚠️ Dynatrace configuration not found. Traceloop not initialized.")
    print("   Please set DT_ENDPOINT and DT_API_TOKEN in your .env file")
```
Step 3: Understanding the Code
Let's break down what each part does:
3.1 Import Statement
```python
from traceloop.sdk import Traceloop
```
The Traceloop SDK provides automatic instrumentation for LLM frameworks like OpenAI, LangChain, and more.
3.2 Delta Temporality Setting
```python
os.environ["OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE"] = "delta"
```

⚠️ Critical for Dynatrace: Dynatrace expects metrics with Delta temporality, not Cumulative. This environment variable must be set before initializing Traceloop.
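The ordering matters because the OTLP exporters read this variable when they are constructed inside `Traceloop.init()`; setting it afterwards has no effect. If attendees might also export the variable from their shell, `os.environ.setdefault` is a slightly gentler variant of the same line (an optional sketch, not the lab's required code):

```python
import os

# Set Delta temporality only if nothing else (shell, .env) has already
# configured it; either way the value is in place before Traceloop.init()
# constructs the OTLP exporters.
os.environ.setdefault(
    "OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE", "delta")

print(os.environ["OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE"])
```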
3.3 Traceloop Initialization
```python
Traceloop.init(
    app_name=f"ai-chat-service-{ATTENDEE_ID}",  # Your unique service name
    api_endpoint=DT_ENDPOINT,                   # Dynatrace OTLP endpoint
    headers=headers                             # API Token authentication
)
```

| Parameter | Description |
|---|---|
| `app_name` | Service name that appears in Dynatrace (includes your attendee ID) |
| `api_endpoint` | The Dynatrace OTLP ingestion endpoint |
| `headers` | Authentication header with your API token |
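To see how the three parameters fit together, here is a small hypothetical helper (`build_traceloop_config` is not part of the lab code) that assembles them from the environment with the same fallback logic as the snippet above:

```python
import os

def build_traceloop_config():
    """Assemble Traceloop.init() arguments, or return None if config is missing."""
    attendee = os.getenv("ATTENDEE_ID", "workshop-attendee")
    endpoint = (os.getenv("DT_ENDPOINT") or "").strip()
    token = (os.getenv("DT_API_TOKEN") or "").strip()
    if not (endpoint and token):
        return None  # caller should skip Traceloop.init()
    return {
        "app_name": f"ai-chat-service-{attendee}",
        "api_endpoint": endpoint,
        "headers": {"Authorization": f"Api-Token {token}"},
    }

# Example with placeholder values (your real values come from .env):
os.environ["DT_ENDPOINT"] = "https://abc12345.live.dynatrace.com/api/v2/otlp"
os.environ["DT_API_TOKEN"] = "dt0c01.example-token"
cfg = build_traceloop_config()
print(cfg["app_name"])
```

Returning `None` when configuration is missing mirrors the lab code's `if/else`: the app still starts without tracing rather than crashing.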
Step 4: Restart the Application
Now let's restart the application with instrumentation enabled.
4.1 Start the Application
If the application is still running, stop it with Ctrl+C, then start it again:
```shell
python app/main.py
```
4.2 Verify Instrumentation
You should see the new instrumentation messages:

```text
✅ Traceloop initialized - sending traces to Dynatrace
   Service Name: ai-chat-service-{YOUR_ATTENDEE_ID}
   Endpoint: https://abc12345.live.dynatrace.com/api/v2/otlp
┌────────────────────────────────────────────────────────────────────────┐
│ 🚀 AI Chat Service Starting...                                         │
│                                                                        │
│ Attendee ID: {YOUR_ATTENDEE_ID}                                        │
│ Service: ai-chat-service-{YOUR_ATTENDEE_ID}                            │
└────────────────────────────────────────────────────────────────────────┘
✅ RAG initialized successfully for attendee: {YOUR_ATTENDEE_ID}
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```
Step 5: Generate Test Traffic
Let's create some traces by using the chat interface!
5.1 Open the Chat UI
Your application includes a chat interface. When you start the app, GitHub Codespaces will detect the port and show a popup; click "Open in Browser" to access it.
If you miss the popup, you can:
- Click the Ports tab in the VS Code terminal panel
- Find port `8000` and click the globe icon 🌐 to open it in the browser
5.2 Chat with the AI Assistant
Use the chat interface to send several messages and generate traces:
Try these example questions:
- "What is Dynatrace and what does it do?"
- "How does OpenTelemetry work with Dynatrace?"
- "Explain observability for AI applications"
- "What is the Dynatrace MCP?"
💡 Tip: The UI has quick-action buttons for common questions. Click them to send pre-written queries!
5.3 Toggle RAG Mode
The chat interface includes a toggle for "Use RAG (Knowledge Base)":
- ✅ Checked: Uses the vector store to find relevant context before answering
- ❌ Unchecked: Sends your question directly to the LLM without context
Try sending the same question with RAG on and off to see the difference!
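If you prefer the terminal over the UI, you can script the same comparison. The route and field names below (`/api/chat`, `message`, `use_rag`) are assumptions for illustration only; check `app/main.py` for the actual API shape before sending requests:

```python
import json

question = "What is Dynatrace and what does it do?"

# Two payloads differing only in the (assumed) use_rag flag:
with_rag = {"message": question, "use_rag": True}
without_rag = {"message": question, "use_rag": False}

for payload in (with_rag, without_rag):
    # e.g. POST each one to http://localhost:8000/api/chat (hypothetical route)
    print(json.dumps(payload))
```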
5.4 What Gets Traced?
Each request generates traces for:
- HTTP Request - The incoming FastAPI request
- Embedding Generation - Azure OpenAI embeddings for vector search
- Vector Store Query - ChromaDB retrieval
- LLM Completion - Azure OpenAI chat completion
- Token Usage - Input/output token counts
Step 6: Verify in Terminal Logs
As traces are sent, you may see OpenTelemetry log messages in the terminal indicating successful exports.
✅ Checkpoint
Before proceeding to Lab 2, verify:
- The `traceloop-sdk` and `opentelemetry-exporter-otlp` packages are installed
- The instrumentation code is added to `main.py`
- The application starts with the "✅ Traceloop initialized" message
- You've sent at least 3-4 chat requests
- No errors appear in the terminal
📝 Code Review: Your Updated main.py
The top of your `main.py` should now look like this:

```python
"""
Dynatrace AI Observability Workshop
"""
import os
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# ┌──────────────────────────────────────────────────────────────────────┐
# │ 🔬 LAB 1: INSTRUMENTATION SECTION                                    │
# └──────────────────────────────────────────────────────────────────────┘
from traceloop.sdk import Traceloop

# Get Dynatrace configuration from environment
ATTENDEE_ID = os.getenv("ATTENDEE_ID", "workshop-attendee")
DT_ENDPOINT = os.getenv("DT_ENDPOINT")
DT_API_TOKEN = os.getenv("DT_API_TOKEN")

# ⚠️ IMPORTANT: Dynatrace requires Delta temporality for metrics
os.environ["OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE"] = "delta"

# Initialize Traceloop with Dynatrace endpoint
if DT_ENDPOINT and DT_API_TOKEN:
    headers = {"Authorization": f"Api-Token {DT_API_TOKEN}"}
    Traceloop.init(
        app_name=f"ai-chat-service-{ATTENDEE_ID}",
        api_endpoint=DT_ENDPOINT,
        headers=headers
    )
    print("✅ Traceloop initialized - sending traces to Dynatrace")
    # ... rest of init logging
else:
    print("⚠️ Dynatrace configuration not found.")
# ────────────────────────────────────────────────────────────────────────

from fastapi import FastAPI, HTTPException
# ... rest of imports and application code
```
🔧 Troubleshooting

"Module traceloop not found"

Run the pip install command again:

```shell
pip install traceloop-sdk opentelemetry-exporter-otlp
```
"401 Unauthorized" in logs

Your API token may be incorrect. Double-check:
- The token in your `.env` file
- That it has the `openTelemetryTrace.ingest` permission
- No extra spaces or quotes around the token
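A common root cause of the 401 is a token pasted into `.env` with surrounding quotes or whitespace. This small hypothetical check (`clean_token` is not part of the lab code) shows the kind of cleanup to look for:

```python
def clean_token(raw: str) -> str:
    """Strip whitespace and stray surrounding quotes from a copied token."""
    return raw.strip().strip('"').strip("'")

# Typical copy-paste accidents that all reduce to the same token:
for raw in ['dt0c01.abc123', ' dt0c01.abc123 ', '"dt0c01.abc123"']:
    print(repr(clean_token(raw)))
```

Note that `.env` values generally should not be quoted at all; fixing the file is better than cleaning at runtime.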
"Traceloop not initialized" message

Check that your `.env` file has both:
- `DT_ENDPOINT` set correctly
- `DT_API_TOKEN` set correctly
Application crashes after adding instrumentation
- Check for syntax errors in your code
- Ensure imports are at the top of the file
- Verify that `load_dotenv()` is called before accessing environment variables
🎉 Excellent Work!

You've successfully instrumented an AI application for Dynatrace! Now let's explore the traces in the Dynatrace UI.