This page describes a legacy method for tracing AI SDK runs. For a simpler, more general approach that requires no OTEL configuration, see the new guide.
You can use LangSmith to trace runs from the Vercel AI SDK with OpenTelemetry (OTEL). This guide walks through the full process with examples.
Many popular OpenTelemetry implementations in the JavaScript ecosystem are still experimental and may be unstable in production, especially when instrumenting for LangSmith alongside other providers at the same time. If you are using AI SDK 5, we strongly suggest using our recommended tracing approach instead.
0. Installation
First, install the Vercel AI SDK and the required OTEL dependencies. The examples below use its OpenAI integration, but you can use any other supported provider as well.
npm install ai @ai-sdk/openai zod
npm install @opentelemetry/sdk-trace-base @opentelemetry/exporter-trace-otlp-proto @opentelemetry/context-async-hooks
1. Configure your environment
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY=<your-api-key>
export LANGSMITH_OTEL_ENABLED=true
# This example uses OpenAI, but you can use any LLM provider of choice
export OPENAI_API_KEY=<your-openai-api-key>
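If you'd rather not set shell variables, a minimal sketch of setting the same values in code (the placeholder values are illustrative; run this before tracing is initialized):

// Set these before calling initializeOTEL so the LangSmith SDK picks them up
process.env.LANGSMITH_TRACING = "true";
process.env.LANGSMITH_API_KEY = "<your-api-key>";
process.env.LANGSMITH_OTEL_ENABLED = "true";
process.env.OPENAI_API_KEY = "<your-openai-api-key>";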
2. Log a trace
Node.js
To start tracing, import and call the initializeOTEL method at the entry point of your program:
import { initializeOTEL } from "langsmith/experimental/otel/setup";
const { DEFAULT_LANGSMITH_SPAN_PROCESSOR } = initializeOTEL();
Afterwards, add the experimental_telemetry argument to the AI SDK calls that you want to trace.
Don't forget to call await DEFAULT_LANGSMITH_SPAN_PROCESSOR.shutdown(); before your application exits in order to flush any remaining traces to LangSmith.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

let result;

try {
  result = await generateText({
    model: openai("gpt-4.1-nano"),
    prompt: "Write a vegetarian lasagna recipe for 4 people.",
    experimental_telemetry: {
      isEnabled: true,
    },
  });
} finally {
  // DEFAULT_LANGSMITH_SPAN_PROCESSOR comes from the initializeOTEL() call above
  await DEFAULT_LANGSMITH_SPAN_PROCESSOR.shutdown();
}
You should now see a trace like this one in your LangSmith dashboard.
You can also trace runs that include tool calls:
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

await generateText({
  model: openai("gpt-4.1-nano"),
  messages: [
    {
      role: "user",
      content: "What are my orders and where are they? My user ID is 123",
    },
  ],
  tools: {
    listOrders: tool({
      description: "list all orders",
      parameters: z.object({ userId: z.string() }),
      execute: async ({ userId }) =>
        `User ${userId} has the following orders: 1`,
    }),
    viewTrackingInformation: tool({
      description: "view tracking information for a specific order",
      parameters: z.object({ orderId: z.string() }),
      execute: async ({ orderId }) =>
        `Here is the tracking information for ${orderId}`,
    }),
  },
  experimental_telemetry: {
    isEnabled: true,
  },
  maxSteps: 10,
});
This will produce a trace like this one.
With traceable
You can wrap traceable around or within your AI SDK tool calls. If you do so, we suggest creating a LangSmith client instance and passing it into each traceable, then calling client.awaitPendingTraceBatches(); at the end to ensure all data is flushed. This way you don't need to manually call shutdown() or forceFlush() on DEFAULT_LANGSMITH_SPAN_PROCESSOR. Here's an example:
import { initializeOTEL } from "langsmith/experimental/otel/setup";

initializeOTEL();

import { Client } from "langsmith";
import { traceable } from "langsmith/traceable";
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const client = new Client();

const wrappedText = traceable(
  async (content: string) => {
    const { text } = await generateText({
      model: openai("gpt-4.1-nano"),
      messages: [{ role: "user", content }],
      tools: {
        listOrders: tool({
          description: "list all orders",
          parameters: z.object({ userId: z.string() }),
          execute: async ({ userId }) => {
            // Nested traceable calls appear as child runs in LangSmith
            const getOrderNumber = traceable(
              async () => {
                return "1234";
              },
              { name: "getOrderNumber" }
            );
            const orderNumber = await getOrderNumber();
            return `User ${userId} has the following order: ${orderNumber}`;
          },
        }),
      },
      experimental_telemetry: {
        isEnabled: true,
      },
      maxSteps: 10,
    });
    return { text };
  },
  { name: "parentTraceable", client }
);

let result;

try {
  result = await wrappedText("What are my orders?");
} finally {
  await client.awaitPendingTraceBatches();
}
The resulting trace will look something like this.
Next.js
First, install the @vercel/otel package:
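npm install @vercel/otel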
Next, create an instrumentation.ts file at the root of your project, call initializeOTEL there, and pass the returned DEFAULT_LANGSMITH_SPAN_PROCESSOR into the spanProcessors field of registerOTel(...), roughly like this:
import { registerOTel } from "@vercel/otel";
import { initializeOTEL } from "langsmith/experimental/otel/setup";

const { DEFAULT_LANGSMITH_SPAN_PROCESSOR } = initializeOTEL({});

export function register() {
  registerOTel({
    serviceName: "your-project-name",
    spanProcessors: [DEFAULT_LANGSMITH_SPAN_PROCESSOR],
  });
}
Finally, call initializeOTEL in your API routes as well, and add the experimental_telemetry field to your AI SDK calls:
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { initializeOTEL } from "langsmith/experimental/otel/setup";

initializeOTEL();

export async function GET() {
  const { text } = await generateText({
    model: openai("gpt-4.1-nano"),
    messages: [{ role: "user", content: "Why is the sky blue?" }],
    experimental_telemetry: {
      isEnabled: true,
    },
  });
  return new Response(text);
}
For more granular observability, you can continue to use traceable within your code, as shown in the sketch below.
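Here's a minimal sketch (the answerQuestion wrapper name and prompt are illustrative) that wraps the route's model call from above in a traceable so it appears as a named parent run:

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { traceable } from "langsmith/traceable";
import { initializeOTEL } from "langsmith/experimental/otel/setup";

initializeOTEL();

// Hypothetical wrapper; nests the AI SDK call under a named LangSmith run
const answerQuestion = traceable(
  async (question: string) => {
    const { text } = await generateText({
      model: openai("gpt-4.1-nano"),
      messages: [{ role: "user", content: question }],
      experimental_telemetry: {
        isEnabled: true,
      },
    });
    return text;
  },
  { name: "answerQuestion" }
);

export async function GET() {
  const text = await answerQuestion("Why is the sky blue?");
  return new Response(text);
}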
Sentry
If you're using Sentry, you can attach the LangSmith trace exporter to Sentry's default OpenTelemetry instrumentation as shown below.
At the time of this writing, Sentry only supports OTEL v1. LangSmith supports both v1 and v2, but you must install the OTEL v1 dependencies for instrumentation to work properly:
npm install @opentelemetry/sdk-trace-base@1.30.1 @opentelemetry/exporter-trace-otlp-proto@0.57.2 @opentelemetry/context-async-hooks@1.30.1
import { initializeOTEL } from "langsmith/experimental/otel/setup";
import { LangSmithOTLPTraceExporter } from "langsmith/experimental/otel/exporter";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { traceable } from "langsmith/traceable";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import * as Sentry from "@sentry/node";

const exporter = new LangSmithOTLPTraceExporter();
const spanProcessor = new BatchSpanProcessor(exporter);

// Register the LangSmith span processor with Sentry's OTEL setup
const sentry = Sentry.init({
  dsn: "...",
  tracesSampleRate: 1.0,
  openTelemetrySpanProcessors: [spanProcessor],
});

initializeOTEL({
  globalTracerProvider: sentry?.traceProvider,
});

const wrappedText = traceable(
  async (content: string) => {
    const { text } = await generateText({
      model: openai("gpt-4.1-nano"),
      messages: [{ role: "user", content }],
      experimental_telemetry: {
        isEnabled: true,
      },
      maxSteps: 10,
    });
    return { text };
  },
  { name: "parentTraceable" }
);

let result;

try {
  result = await wrappedText("What color is the sky?");
} finally {
  await sentry?.traceProvider?.shutdown();
}
Adding metadata
You can add metadata to your traces to help organize and filter them in the LangSmith UI:
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
await generateText({
  model: openai("gpt-4.1-nano"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: {
    isEnabled: true,
    metadata: { userId: "123", language: "english" },
  },
});
Metadata will be visible in your LangSmith dashboard and can be used to filter and search for specific traces. Note that the AI SDK also propagates this metadata to its internal child spans.
Customize run name
You can customize the run name by passing a metadata key named ls_run_name within experimental_telemetry:
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: {
    isEnabled: true,
    // highlight-start
    metadata: {
      ls_run_name: "my-custom-run-name",
    },
    // highlight-end
  },
});