Vercel AI SDK

How to integrate Humanloop with the Vercel AI SDK

Observability integration

The Vercel AI SDK supports tracing via OpenTelemetry. You can export these traces to Humanloop by enabling telemetry and configuring the OpenTelemetry Exporter.

The Vercel AI SDK tracing feature is experimental and subject to change. You must enable it with the experimental_telemetry parameter on each AI SDK function call that you want to trace.

Learn how to add tracing to your AI SDK application below.

Prerequisites

The following steps assume you’re already using the AI SDK in your application. If not, follow Vercel’s quickstarts to get started.

Next.js versions below 15 must set experimental.instrumentationHook in next.config.js, as sketched below. Learn more here.
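A minimal next.config.js sketch for those older versions (Next.js 15 and later enable instrumentation by default, so the flag is no longer needed there):

next.config.js
// Only needed on Next.js < 15
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    instrumentationHook: true,
  },
};

module.exports = nextConfig;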

You can find an example Next.js application that uses the AI SDK to stream chat responses here.

1. Set up OpenTelemetry

Install dependencies.

$ npm install @vercel/otel @opentelemetry/sdk-logs @opentelemetry/api-logs @opentelemetry/instrumentation @opentelemetry/exporter-trace-otlp-http

Create a file called instrumentation.ts in your root or /src directory and add the following:

instrumentation.ts
import { registerOTel } from '@vercel/otel';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

export function register() {
  registerOTel({
    serviceName: 'humanloop-vercel-ai-sdk',
    traceExporter: new OTLPTraceExporter(),
  });
}

2. Configure OpenTelemetry

Configure the OpenTelemetry exporter to forward logs to Humanloop.

.env.local
HUMANLOOP_API_KEY=<YOUR_HUMANLOOP_KEY>
# Configure the OpenTelemetry OTLP Exporter
OTEL_EXPORTER_OTLP_ENDPOINT=https://api.humanloop.com/v5/import/otel
OTEL_EXPORTER_OTLP_PROTOCOL=http/json
OTEL_EXPORTER_OTLP_HEADERS="X-API-KEY=<YOUR_HUMANLOOP_KEY>" # Humanloop API key

3. Trace AI SDK calls

Now add the experimental_telemetry parameter to your AI SDK function calls to trace them.

With a simple one-step generation, each call to streamText or generateText will be traced as a Prompt Log on Humanloop.

app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    messages,
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        humanloopPromptPath: 'path/to/prompt',
      },
    },
  });

  // Respond with the stream
  return result.toDataStreamResponse();
}
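The same telemetry options apply to non-streaming calls. A minimal generateText sketch, assuming a hypothetical answerQuestion helper and the same placeholder prompt path:

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

export async function answerQuestion(question: string) {
  // Non-streaming generation; traced as a Prompt Log just like streamText
  const { text } = await generateText({
    model: openai('gpt-4o'),
    prompt: question,
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        humanloopPromptPath: 'path/to/prompt',
      },
    },
  });
  return text;
}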

You can also group each step of a multi-step generation into a Flow by passing the humanloopFlowPath metadata value.

app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { z } from 'zod';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    messages,
    maxSteps: 3,
    toolCallStreaming: true,
    system:
      'You are a helpful assistant that answers questions about the weather in a given city.',
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        humanloopPromptPath: 'path/to/prompt',
        humanloopFlowPath: 'path/to/flow',
      },
    },
    tools: {
      getWeatherInformation: {
        description: 'show the weather in a given city to the user',
        parameters: z.object({ city: z.string() }),
        execute: async ({ city }: { city: string }) => {
          // Return mock weather data for the requested city
          const weatherOptions = ['sunny', 'cloudy', 'rainy', 'snowy', 'windy'];
          return {
            city,
            weather:
              weatherOptions[Math.floor(Math.random() * weatherOptions.length)],
            temperature: Math.floor(Math.random() * 50 - 10),
          };
        },
      },
    },
  });

  // Respond with the stream
  return result.toDataStreamResponse();
}

Metadata parameters

Humanloop’s AI SDK OpenTelemetry Receiver will automatically extract the following metadata parameters from the experimental_telemetry metadata object:

  • humanloopPromptPath: [Required] The path to the prompt on Humanloop. Generation spans will create Logs for this Prompt on Humanloop.
  • humanloopFlowPath: [Optional] The path to the flow on Humanloop. Set this on a multi-step generation to group the steps into a single Flow Log on Humanloop.
  • humanloopTraceId: [Optional] The ID of a Flow Log on Humanloop. Set this to group multiple calls to the AI SDK into a single Flow Log on Humanloop (see the sketch after this list).
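For example, you can stitch otherwise independent AI SDK calls into one Flow Log by sharing a trace ID. A hypothetical sketch, where flowLogId stands in for the ID of a Flow Log you have already created on Humanloop:

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Hypothetical: the ID of an existing Flow Log on Humanloop
const flowLogId = '<YOUR_FLOW_LOG_ID>';

async function runStep(prompt: string) {
  return generateText({
    model: openai('gpt-4o'),
    prompt,
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        humanloopPromptPath: 'path/to/prompt',
        // Both calls below land in the same Flow Log
        humanloopTraceId: flowLogId,
      },
    },
  });
}

const outline = await runStep('Outline a short post about OpenTelemetry.');
await runStep(`Write an intro for this outline: ${outline.text}`);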

Learn more

To see the integration in action, check out our Vercel AI SDK observability guide.
