Vercel AI SDK
How to integrate Humanloop with the Vercel AI SDK
Observability integration
The Vercel AI SDK supports tracing via OpenTelemetry. You can export these traces to Humanloop by enabling telemetry and configuring the OpenTelemetry Exporter.
The Vercel AI SDK tracing feature is experimental and subject to change. You must enable it with the experimental_telemetry parameter on each AI SDK function call that you want to trace.
Learn how to add tracing to your AI SDK application below.
Prerequisites
The following steps assume you're already using the AI SDK in your application. If not, follow Vercel's quickstarts (Next.js or Node.js) to get started.
Versions of Next.js < 15 must set experimental.instrumentationHook in next.config.js. Learn more here.
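For those older Next.js versions, the flag can be enabled with a config like the following (a minimal sketch; merge it into your existing config):

```javascript
// next.config.js — only needed on Next.js versions below 15,
// where the instrumentation hook is still behind an experimental flag.
module.exports = {
  experimental: {
    instrumentationHook: true,
  },
};
```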
You can find an example Next.js application that uses the AI SDK to stream chat responses here.
Set up OpenTelemetry
Install dependencies.
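For a Next.js project, a typical install might look like the following (package names assume npm and the @vercel/otel helper; adjust for your package manager and runtime):

```shell
# AI SDK core plus Vercel's OpenTelemetry registration helper
npm install ai @vercel/otel @opentelemetry/api
```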
Create a file called instrumentation.ts in your root or /src directory and add the following:
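A sketch of what instrumentation.ts could contain, using @vercel/otel to export traces to Humanloop. The endpoint URL and API-key header name below are assumptions; confirm both against the Humanloop documentation before use:

```typescript
// instrumentation.ts — registers an OpenTelemetry trace exporter that
// forwards AI SDK spans to Humanloop. Next.js calls register() on startup.
import { registerOTel, OTLPHttpJsonTraceExporter } from "@vercel/otel";

export function register() {
  registerOTel({
    serviceName: "my-app", // any name identifying your service
    traceExporter: new OTLPHttpJsonTraceExporter({
      // Assumed Humanloop OTLP endpoint and auth header — verify in the docs.
      url: "https://api.humanloop.com/v5/import/otel/v1/traces",
      headers: {
        "X-API-KEY": process.env.HUMANLOOP_API_KEY ?? "",
      },
    }),
  });
}
```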
Trace AI SDK calls
Now add the experimental_telemetry parameter to your AI SDK function calls to trace them.
With a simple one-step generation, each call to streamText or generateText will be traced as a Prompt Log on Humanloop.
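A minimal sketch of a traced one-step generation (the model, prompt, and humanloopPromptPath values are illustrative placeholders):

```typescript
// One-step generation traced as a Prompt Log on Humanloop.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Write a haiku about observability.",
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      // Path of the Prompt on Humanloop that should receive the Log
      humanloopPromptPath: "My App/Haiku Generator",
    },
  },
});
```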
You can also group each step of a multi-step generation into a Flow by passing the humanloopFlowPath metadata value.
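For example, a hypothetical two-step pipeline where both calls share the same humanloopFlowPath, so their Logs are grouped into a single Flow Log (paths and prompts are placeholders):

```typescript
// Multi-step generation: both calls share humanloopFlowPath, so
// Humanloop groups their Prompt Logs under one Flow Log.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const outline = await generateText({
  model: openai("gpt-4o"),
  prompt: "Outline a blog post about OpenTelemetry.",
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      humanloopPromptPath: "Blog/Outline",
      humanloopFlowPath: "Blog/Write Post", // shared Flow path
    },
  },
});

const draft = await generateText({
  model: openai("gpt-4o"),
  prompt: `Write the post for this outline:\n${outline.text}`,
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      humanloopPromptPath: "Blog/Draft",
      humanloopFlowPath: "Blog/Write Post", // same Flow path as above
    },
  },
});
```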
Metadata parameters
Humanloop's AI SDK OpenTelemetry Receiver will automatically extract the following metadata parameters from the experimental_telemetry metadata object:
- humanloopPromptPath: [Required] The path to the Prompt on Humanloop. Generation spans will create Logs for this Prompt on Humanloop.
- humanloopFlowPath: [Optional] The path to the Flow on Humanloop. Set this on a multi-step generation to group the steps into a single Flow Log on Humanloop.
- humanloopTraceId: [Optional] The ID of a Flow Log on Humanloop. Set this to group multiple calls to the AI SDK into a single Flow Log on Humanloop.
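As an illustration of the last parameter, the sketch below attaches an AI SDK call to an existing Flow Log by its ID. How you obtain flowLogId is out of scope here; it is assumed to come from a Flow Log created earlier (for example via the Humanloop SDK), and all paths are placeholders:

```typescript
// Hypothetical: group this call under an existing Flow Log via its ID.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function summarize(input: string, flowLogId: string) {
  return generateText({
    model: openai("gpt-4o"),
    prompt: `Summarize: ${input}`,
    experimental_telemetry: {
      isEnabled: true,
      metadata: {
        humanloopPromptPath: "Pipeline/Summarize",
        humanloopTraceId: flowLogId, // ID of the target Flow Log
      },
    },
  });
}
```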
Learn more
To see the integration in action, check out our Vercel AI SDK observability guide.