Traceloop has announced it is joining ServiceNow (March 2026): OpenLLMetry will remain open source, while the commercial platform becomes part of ServiceNow's AI Control Tower. If you chose Traceloop because it was OTel-native and open source, you probably care that your next tool stays that way.

Laminar is also built on OpenTelemetry, fully open source, and available self-hosted - so if you already emit OTel spans, the migration can be as small as changing your exporter endpoint. Traceloop combined OpenLLMetry instrumentation with a commercial platform for evaluators, monitors, experiments, and insights; this guide maps those features to their Laminar equivalents. It covers three paths: swap your OTel exporter (about 2 minutes), let a coding agent handle the full SDK swap, or do it manually.
## Fastest Path: Swap the OTLP Exporter (OTel Users)
If you're already emitting OpenTelemetry spans (the main reason many teams picked Traceloop), you can keep your span structure and just point your exporter at Laminar. Traceloop defaults to OTLP/HTTP, while Laminar defaults to OTLP/gRPC. You can stay on HTTP, but we recommend gRPC for stability. For HTTP, Authorization is fine; for gRPC metadata, use lowercase authorization.
```python
# OTLP/HTTP (Traceloop default)
import os

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://api.lmnr.ai/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"},
)
```
```typescript
// OTLP/HTTP (Traceloop default)
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const exporter = new OTLPTraceExporter({
  url: "https://api.lmnr.ai/v1/traces",
  headers: {
    Authorization: `Bearer ${process.env.LMNR_PROJECT_API_KEY}`,
  },
});
```
```typescript
// OTLP/gRPC (recommended for Laminar)
import { Metadata } from "@grpc/grpc-js";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc";

const metadata = new Metadata();
// gRPC metadata keys must be lowercase
metadata.set("authorization", `Bearer ${process.env.LMNR_PROJECT_API_KEY}`);

const exporter = new OTLPTraceExporter({
  url: "https://api.lmnr.ai:8443/v1/traces",
  metadata,
});
```
See OpenTelemetry for more context and Python gRPC examples.
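For completeness, here is a sketch of the equivalent gRPC exporter in Python, mirroring the TypeScript gRPC example above (same endpoint and lowercase authorization metadata; treat the exact endpoint as an assumption and confirm it against the Laminar docs):

```python
# OTLP/gRPC (recommended for Laminar)
import os

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://api.lmnr.ai:8443",
    # gRPC metadata keys must be lowercase
    headers=(("authorization", f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"),),
)
```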
## Fast Path: Let the Agent Migrate It
If you use Claude Code, Cursor, Codex, or another coding agent that supports skills, this is the fastest route. Run the laminar-instrument-codebase skill (see Skills setup) with the following prompt:
```text
Use the laminar-instrument-codebase skill to migrate this repo from Traceloop to Laminar.
Replace Traceloop initialization with Laminar.initialize, keep existing trace structure,
and verify traces in Laminar.
```
What it will do:
- Install the Laminar SDK in the right package
- Add Laminar.initialize(...) at the earliest safe startup point
- Remove Traceloop initialization and vendor-specific settings
- Keep existing spans and map context into Laminar tags/metadata
- Verify traces show up in the Laminar UI
## Manual Migration (Full Control)
- Create a Laminar project and set LMNR_PROJECT_API_KEY.
- Install the SDK (TypeScript: npm add @lmnr-ai/lmnr; Python: pip install lmnr). See Hosting Options if you are self-hosting.
- Replace Traceloop initialization with Laminar initialization.
```typescript
import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY!,
});
```

```python
import os

from lmnr import Laminar

Laminar.initialize(project_api_key=os.environ["LMNR_PROJECT_API_KEY"])
```
- If you're only using Traceloop for auto-instrumentation (no @workflow or @task decorators), you're done. This is a one-line migration: replace Traceloop.init() with Laminar.initialize().
- If you use decorators or manual spans, map them to Laminar equivalents. Laminar collapses workflow/task into a single @observe() decorator - a simpler mental model with the same trace fidelity. Use the mapping table below.
- Verify in the UI. Run a single request. You should see a full trace tree with LLM spans. Tool spans will appear if your tool layer is instrumented (e.g., LangChain/LlamaIndex) or if you wrap tool calls with observe({ spanType: 'TOOL' }) (TypeScript) or Laminar.start_as_current_span(..., span_type="TOOL") (Python). If you only see a root span, add observe(...) or @observe() around your agent entrypoint. See Viewing Traces.
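To make the decorator mapping concrete, here is a sketch in Python (the @observe name argument and the nesting behavior shown are assumptions to verify against the Laminar SDK docs; the function names are illustrative):

```python
# Before (Traceloop / OpenLLMetry):
#   from traceloop.sdk.decorators import workflow, task
#   @workflow(name="answer_question") / @task(name="retrieve_docs")

# After (Laminar): one decorator covers both workflows and tasks
from lmnr import observe

@observe()  # span named after the function
def retrieve_docs(query: str) -> list[str]:
    return ["doc1", "doc2"]

@observe(name="answer_question")  # optional explicit span name
def answer_question(question: str) -> str:
    # nested call produces a child span under answer_question
    docs = retrieve_docs(question)
    return f"answer based on {len(docs)} docs"
```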
## Traceloop to Laminar Mapping
| What You Used in Traceloop | Laminar Equivalent | Notes |
|---|---|---|
| Traceloop.init() auto-instrumentation | Laminar.initialize() | Same concept; both auto-instrument major providers. |
| @workflow / @task decorators | @observe() | Simpler mental model - one decorator for everything. |
| OpenLLMetry instrumentations | Laminar instrumentations | Both cover common providers like OpenAI, Anthropic, and LangChain. |
| OTel span export | OTel-native ingestion | Redirect your exporter and keep your spans. |
| Built-in evaluators (faithfulness, relevance) | Laminar evals SDK + CLI | Different approach - run evals locally or in CI, not as a managed service. |
| Custom evaluators | Laminar eval functions | Write eval functions in code and run via SDK or CLI. |
| Monitors (production quality gates) | Signals | Describe patterns in natural language; Laminar detects them across traces. |
| Experiments / A/B model testing | Playground + evals | Compare model outputs and run evals against datasets. |
| Hub (LLM gateway) | No equivalent | Use LiteLLM or provider SDKs directly. |
| Insights layer / drift detection | Signals + SQL + dashboards | Different mechanism - SQL queries + Signals vs automated insights. |
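To illustrate the evals rows above, here is a rough sketch of a code-defined evaluator run with the Laminar evals SDK (the evaluate signature and data shape shown here are assumptions; check the Laminar evals docs for the exact API):

```python
from lmnr import evaluate

def executor(data: dict) -> str:
    # Call your model or agent here; stubbed for this sketch
    return "Paris"

def exact_match(output: str, target: dict) -> int:
    # Score 1 if the output matches the expected answer, else 0
    return int(output == target["answer"])

evaluate(
    data=[{"data": {"question": "Capital of France?"}, "target": {"answer": "Paris"}}],
    executor=executor,
    evaluators={"exact_match": exact_match},
)
```

Run it directly with Python, or via the Laminar CLI in CI; results appear in the Laminar UI per the table above.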
## What You Gain with Laminar
- Browser session recordings synced to traces
- SQL editor for ad-hoc analysis
- Signals for natural language monitoring
- Evals you can run locally or in CI
- Fully open source and self-hostable, independent of enterprise platform roadmaps
Once you're set up, explore Signals, SQL analysis, and real-time agent tracing in the docs.
If anything doesn't map cleanly, drop into our Discord and we'll help you sort it out.