
If you’re running a self-hosted Laminar instance, point the SDK at it by passing baseUrl, httpPort, and grpcPort to evaluate(). Evaluations talk to Laminar over two channels: HTTP for evaluation metadata, datapoints, and scores; gRPC for the OpenTelemetry traces.

Configuration

import { evaluate } from '@lmnr-ai/lmnr';

evaluate({
  data: evaluationData,
  executor: runExecutor,
  evaluators: { accuracy },
  config: {
    projectApiKey: process.env.LMNR_PROJECT_API_KEY,
    baseUrl: 'http://localhost',
    httpPort: 8000,
    grpcPort: 8001,
  },
});
baseUrl / base_url is the scheme + host only. Do not include a port in it; the ports go in httpPort / grpcPort.
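A common mistake is to paste the full URL, port included, into baseUrl. A small guard like the following can catch that early; assertNoPort is a hypothetical helper for illustration, not part of the SDK:

```typescript
// Hypothetical helper (not in the SDK): reject a baseUrl that smuggles in a port.
function assertNoPort(baseUrl: string): string {
  const url = new URL(baseUrl);
  if (url.port !== '') {
    throw new Error(
      `baseUrl must be scheme + host only; move port ${url.port} to httpPort/grpcPort`,
    );
  }
  // Normalize to scheme + host, which is what the config expects.
  return `${url.protocol}//${url.hostname}`;
}
```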

evaluate parameters

Parameter | Description | Default
--- | --- | ---
data | List of datapoints or a LaminarDataset instance. | required
executor | (Optionally async) function that takes data and returns the output to score. | required
evaluators | Map of name → scoring function. Each function returns a number or a map of numbers. | required
name | Evaluation name shown in the UI. | random
groupName / group_name | Group identifier. Only runs sharing a group name can be compared. | default
metadata | Arbitrary JSON on the evaluation row for filtering. | none
projectApiKey / project_api_key | Overrides LMNR_PROJECT_API_KEY. | env
baseUrl / base_url | Laminar host (no port). | https://api.lmnr.ai
httpPort / http_port | HTTP port for evaluation metadata and datapoints. | 443
grpcPort / grpc_port | gRPC port for OTel traces. | 8443
concurrencyLimit / batch_size | Parallel executor invocations. | 5
instrumentModules / instrument_modules | Client modules to instrument (e.g. OpenAI, Anthropic). | none
See the SDK reference for the full TypeScript and Python signatures.
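As the evaluators row notes, a scoring function can return either a single number or a map of named scores. A minimal sketch of both shapes, assuming an (output, target) calling convention; check the SDK reference for the exact signature:

```typescript
// Single-score evaluator: 1 if the output matches the target exactly (ignoring
// surrounding whitespace), else 0.
const exactMatch = (output: string, target: string): number =>
  output.trim() === target.trim() ? 1 : 0;

// Multi-score evaluator: returns a map of named numeric scores.
const lengthStats = (output: string): Record<string, number> => ({
  chars: output.length,
  words: output.split(/\s+/).filter(Boolean).length,
});
```

These would be passed as `evaluators: { exactMatch, lengthStats }`, and each name becomes a score column on the evaluation run.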

Why both HTTP and gRPC

  • HTTP (httpPort) carries the bookkeeping: evaluation name, group, datapoints, scores, executor outputs.
  • gRPC (grpcPort) carries the OpenTelemetry spans: every LLM call, tool call, evaluator, and the wrapping EVALUATION / EXECUTOR / EVALUATOR spans.
Both ports must be reachable from the machine running the evaluation. If only the HTTP port is open, the evaluation run appears with datapoints and scores but no traces.
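Putting the two channels together, the SDK effectively derives two endpoints from one config. A sketch of that derivation (endpoints is an illustrative helper, not an SDK export; the exact URL shapes the SDK builds internally may differ):

```typescript
// Illustrative only: the two endpoints implied by a self-hosted config.
interface LaminarConnection {
  baseUrl: string;  // scheme + host, no port
  httpPort: number;
  grpcPort: number;
}

function endpoints({ baseUrl, httpPort, grpcPort }: LaminarConnection) {
  return {
    http: `${baseUrl}:${httpPort}`, // evaluation metadata, datapoints, scores
    grpc: `${baseUrl}:${grpcPort}`, // OpenTelemetry span export
  };
}
```

For the Configuration example above this yields http://localhost:8000 and http://localhost:8001, both of which must accept connections from the evaluation machine.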

Hosting options

For how to run a self-hosted Laminar instance (Docker Compose for single-node, Helm for Kubernetes, and the hybrid data-plane variant), see Hosting options.

Next steps

Quickstart

Write your first evaluation against your self-hosted instance.

Hosting options

Docker Compose, Helm, and hybrid data-plane deployment options.

Manual API

Use LaminarClient.evals for finer control; the same base-URL + port config applies.

SDK reference

Full parameters for evaluate and related SDK types.