Overview

The Vercel AI SDK provides a consistent API for interacting with various AI providers. Gentrace integrates seamlessly to provide automatic tracing and error analysis for your AI application.

Prerequisites

This guide assumes you have already set up an interaction as shown in our quickstart.

Instrumentation

The AI SDK natively exports telemetry via OpenTelemetry, and Gentrace consumes this telemetry automatically. To enable it, set isEnabled to true in the experimental_telemetry option of your AI SDK calls. Note that the AI SDK call must run inside the call stack of an interaction for Gentrace to capture the trace. For example:
const { text } = await generateText({
  // Enable telemetry
  experimental_telemetry: {
    isEnabled: true,
  },
  model: openai("gpt-4.1-nano"),
  prompt: `Write a haiku about ${topic}`,
});
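Beyond isEnabled, the AI SDK's telemetry settings also accept a functionId and arbitrary metadata, which flow through to the emitted spans. The sketch below builds such a settings object as a plain value so its shape is easy to inspect; the functionId value and the metadata keys shown here are illustrative, not required by Gentrace.

```typescript
// Build the experimental_telemetry options for an AI SDK call.
// functionId and metadata are optional AI SDK telemetry settings;
// the specific values here are example placeholders.
const telemetryOptions = (functionId: string) => ({
  experimental_telemetry: {
    isEnabled: true,
    functionId, // labels the spans emitted for this call
    metadata: { source: "docs-example" }, // arbitrary key/value pairs
  },
});

// Spread into a generateText / streamText call, e.g.:
// const { text } = await generateText({
//   model: openai("gpt-4.1-nano"),
//   prompt: `Write a haiku about ${topic}`,
//   ...telemetryOptions("write-poem"),
// });
console.log(telemetryOptions("write-poem").experimental_telemetry.functionId);
```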

Full example

.env
GENTRACE_API_KEY=your-gentrace-api-key
GENTRACE_PIPELINE_ID=your-pipeline-id
OPENAI_API_KEY=your-openai-api-key

index.ts
import { init, interaction } from "gentrace";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

init({
  apiKey: process.env.GENTRACE_API_KEY,
});

const writePoem = interaction(
  "write-poem",
  async (topic: string) => {
    const { text } = await generateText({
      model: openai("gpt-4.1-nano"),
      prompt: `Write a haiku about ${topic}`,
      experimental_telemetry: {
        isEnabled: true,
      }
    });
    return text;
  }
);

const main = async () => {
  const result = await writePoem("quantum computing");
  console.log(result);
};

main();

For an example in our SDK, see the OpenAI AI SDK example on GitHub.