Logging from Vercel AI SDK

Instrument your Vercel AI SDK project with Humanloop.

The Vercel AI SDK is the TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, Node.js, and more.

The AI SDK supports OpenTelemetry tracing, and you can collect your traces in Humanloop with a few simple steps.

This guide extends Vercel AI SDK’s Node.js example, adding Humanloop logging to a chat agent with tool calling.

Prerequisites

Create a Humanloop Account

If you haven’t already, create an account or log in to Humanloop.

Add an OpenAI API Key

If you’re the first person in your organization, you’ll need to add an API key to a model provider.

  1. Go to OpenAI and grab an API key.
  2. In Humanloop Organization Settings set up OpenAI as a model provider.

Using the Prompt Editor will use your OpenAI credits in the same way that the OpenAI playground does. Keep your API keys for Humanloop and the model providers private.

To follow this guide, you’ll also need Node.js 18+ installed on your machine.

Install humanloop, ai, and @ai-sdk/openai, the AI SDK’s OpenAI provider, along with other necessary dependencies.

$ npm install humanloop ai @ai-sdk/openai zod dotenv
$ npm install -D @types/node tsx typescript

Add a .env file to your project with your Humanloop and OpenAI API keys.

$ touch .env
.env
HUMANLOOP_API_KEY=xxxxxx
OPENAI_API_KEY=xxxxxx

Create the chat agent

We start with a simple chat agent that talks back to the user and uses a tool to get the weather in a given city. The full code is available at the bottom of this page.

agent.ts
import { openai } from "@ai-sdk/openai";
import { CoreMessage, streamText, tool } from "ai";
import dotenv from "dotenv";
import { z } from "zod";
import * as readline from "node:readline/promises";

dotenv.config();

const terminal = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

async function exit() {
  console.log('Shutting down...');
  process.exit(0);
}

const messages: CoreMessage[] = [{
  role: "system",
  content: "You are a helpful assistant. If the user asks you to exit, you should exit the program."
}];

async function main() {
  while (true) {
    const userInput = await terminal.question("You: ");

    messages.push({ role: "user", content: userInput });

    const result = streamText({
      model: openai("gpt-4o"),
      messages,
      maxSteps: 5,
      tools: {
        weather: tool({
          description: "Get the weather in a location (in Celsius)",
          parameters: z.object({
            location: z
              .string()
              .describe("The location to get the weather for"),
          }),
          execute: async ({ location }) => ({
            location,
            temperature: Math.round((Math.random() * 30 + 5) * 10) / 10, // Random temp between 5°C and 35°C
          }),
        }),
        convertCelsiusToFahrenheit: tool({
          description: "Convert a temperature from Celsius to Fahrenheit",
          parameters: z.object({
            celsius: z
              .number()
              .describe("The temperature in Celsius to convert"),
          }),
          execute: async ({ celsius }) => {
            const fahrenheit = (celsius * 9) / 5 + 32;
            return { fahrenheit: Math.round(fahrenheit * 100) / 100 };
          },
        }),
        exit: tool({
          description: "Exit the program",
          parameters: z.object({}),
          execute: async () => {
            await exit();
            return { success: true };
          },
        }),
      },
    });

    let fullResponse = "";
    process.stdout.write("\nAssistant: ");
    for await (const delta of result.textStream) {
      fullResponse += delta;
      process.stdout.write(delta);
    }
    process.stdout.write("\n\n");

    messages.push({ role: "assistant", content: fullResponse });
  }
}

main().catch(console.error);

This agent can call functions to get the weather in a given city, convert to Fahrenheit, and exit the conversation.

The agent can call functions.
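Because each tool's `execute` body is an ordinary function, its logic is easy to check in isolation before wiring it into the agent. A sketch of the Celsius-to-Fahrenheit conversion used by the `convertCelsiusToFahrenheit` tool, extracted here as a standalone function for illustration:

```typescript
// Same formula and rounding as the tool's execute handler:
// F = C * 9/5 + 32, rounded to two decimal places.
export function celsiusToFahrenheit(celsius: number): { fahrenheit: number } {
  const fahrenheit = (celsius * 9) / 5 + 32;
  return { fahrenheit: Math.round(fahrenheit * 100) / 100 };
}

// e.g. celsiusToFahrenheit(20) → { fahrenheit: 68 }
```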

Log to Humanloop

The agent works and is capable of function calling. However, we can only reason about its behavior from the raw inputs and outputs. Logging to Humanloop lets you observe each step the agent takes, as we demonstrate below.

We’ll use Vercel AI SDK’s built-in OpenTelemetry tracing to log to Humanloop.

1. Install the OpenTelemetry SDK and relevant dependencies

$ npm install @opentelemetry/sdk-node @opentelemetry/auto-instrumentations-node @opentelemetry/exporter-trace-otlp-http

2. Configure the OpenTelemetry OTLP Exporter options

Add the following lines to your .env file to configure the OpenTelemetry OTLP Exporter.

.env
HUMANLOOP_API_KEY=xxxxxx
OPENAI_API_KEY=xxxxxx
OTEL_EXPORTER_OTLP_ENDPOINT=https://api.humanloop.com/v5/import/otel
OTEL_EXPORTER_OTLP_HEADERS="X-API-KEY=${HUMANLOOP_API_KEY}"

3. Register the Exporter in Node

OpenTelemetry traces from the Vercel AI SDK will now be exported to Humanloop. The http/json exporter is necessary for Humanloop to properly ingest the traces.

agent.ts
...

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

dotenv.config();

...

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

async function exit() {
  console.log('Shutting down...');
  await sdk.shutdown();
  process.exit(0);
}

...

main().catch(console.error);
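If you prefer not to rely on the `OTEL_EXPORTER_OTLP_*` environment variables, the endpoint and headers can also be passed to the exporter's constructor. A sketch of a small helper (our own, not part of any SDK) that builds those options; note that, in our understanding of the http/json exporter, a `url` passed directly is used as-is, so the per-signal `/v1/traces` suffix must be included explicitly, unlike with the endpoint environment variable:

```typescript
// Hypothetical helper: builds the options that would otherwise come from
// OTEL_EXPORTER_OTLP_ENDPOINT / OTEL_EXPORTER_OTLP_HEADERS in .env.
export function humanloopExporterOptions(apiKey: string): {
  url: string;
  headers: Record<string, string>;
} {
  return {
    // Endpoint from the .env step above, plus the traces signal path.
    url: "https://api.humanloop.com/v5/import/otel/v1/traces",
    headers: { "X-API-KEY": apiKey },
  };
}

// Usage:
//   new OTLPTraceExporter(humanloopExporterOptions(process.env.HUMANLOOP_API_KEY!))
```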
4. Add Humanloop metadata to your OpenTelemetry traces

Logs will be added to your files on Humanloop. The humanloopPromptPath provides the path for a Prompt on Humanloop, and the humanloopFlowPath provides the path for a Flow to group the related Prompt Logs together.

agent.ts
...

async function main() {
  while (true) {
    ...

    const result = streamText({
      ...

      experimental_telemetry: {
        isEnabled: true,
        metadata: {
          humanloopFlowPath: "Vercel AI/Weather Agent",
          humanloopPromptPath: "Vercel AI/Call Assistant",
        },
      },
    });

    ...
  }
}

main().catch(console.error);
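If you instrument several `streamText` calls, the telemetry block can be factored into a small helper so the paths live in one place. A sketch; the `humanloopTelemetry` helper name is ours, not part of the AI SDK or Humanloop:

```typescript
// Hypothetical helper for building the experimental_telemetry option.
// humanloopFlowPath groups related logs into a Flow on Humanloop;
// humanloopPromptPath is the path of the Prompt the logs attach to.
export function humanloopTelemetry(flowPath: string, promptPath: string) {
  return {
    isEnabled: true,
    metadata: {
      humanloopFlowPath: flowPath,
      humanloopPromptPath: promptPath,
    },
  };
}

// Usage inside streamText:
//   experimental_telemetry: humanloopTelemetry(
//     "Vercel AI/Weather Agent",
//     "Vercel AI/Call Assistant",
//   ),
```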

Run the code

$ npx tsx agent.ts

Have a conversation with the agent, and try asking about the weather in a city (in Celsius or Fahrenheit). When you’re done, type exit to close the program.

Explore your logs on Humanloop

Now you can explore your logs on the Humanloop platform, and see the steps taken by the agent during your conversation.

You can see below the full trace of prompts and tool calls that were made.

The trace captures the agent calling its tools to answer the user's question.

Next steps

Logging is the first step to observing your AI product. Read the Humanloop guides on evaluations to learn more about evals on Humanloop.

Full code

import { openai } from "@ai-sdk/openai";
import { CoreMessage, streamText, tool } from "ai";
import dotenv from "dotenv";
import { z } from "zod";
import * as readline from "node:readline/promises";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

dotenv.config();

const terminal = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

async function exit() {
  console.log('Shutting down...');
  await sdk.shutdown();
  process.exit(0);
}

const messages: CoreMessage[] = [{
  role: "system",
  content: "You are a helpful assistant. If the user asks you to exit, you should exit the program."
}];

async function main() {
  while (true) {
    const userInput = await terminal.question("You: ");

    messages.push({ role: "user", content: userInput });

    const result = streamText({
      model: openai("gpt-4o"),
      messages,
      maxSteps: 5,
      tools: {
        weather: tool({
          description: "Get the weather in a location (in Celsius)",
          parameters: z.object({
            location: z
              .string()
              .describe("The location to get the weather for"),
          }),
          execute: async ({ location }) => ({
            location,
            temperature: Math.round((Math.random() * 30 + 5) * 10) / 10, // Random temp between 5°C and 35°C
          }),
        }),
        convertCelsiusToFahrenheit: tool({
          description: "Convert a temperature from Celsius to Fahrenheit",
          parameters: z.object({
            celsius: z
              .number()
              .describe("The temperature in Celsius to convert"),
          }),
          execute: async ({ celsius }) => {
            const fahrenheit = (celsius * 9) / 5 + 32;
            return { fahrenheit: Math.round(fahrenheit * 100) / 100 };
          },
        }),
        exit: tool({
          description: "Exit the program",
          parameters: z.object({}),
          execute: async () => {
            await exit();
            return { success: true };
          },
        }),
      },
      experimental_telemetry: {
        isEnabled: true,
        metadata: {
          humanloopFlowPath: "Vercel AI/Weather Agent",
          humanloopPromptPath: "Vercel AI/Call Assistant",
        },
      },
    });

    let fullResponse = "";
    process.stdout.write("\nAssistant: ");
    for await (const delta of result.textStream) {
      fullResponse += delta;
      process.stdout.write(delta);
    }
    process.stdout.write("\n\n");

    messages.push({ role: "assistant", content: fullResponse });
  }
}

main().catch(console.error);