Want to see the complete code? Check out our Next.js + Gentrace example on GitHub.
Next.js applications require special setup to integrate Gentrace because the framework ships with its own OpenTelemetry support. This guide covers the Next.js-specific configuration.

Prerequisites

Installation

npm install @vercel/otel @opentelemetry/sdk-trace-node@^1 @opentelemetry/[email protected] @opentelemetry/[email protected] @opentelemetry/[email protected] gentrace
It’s critical to use @opentelemetry/sdk-trace-node v1 (not v2) due to functionality and type compatibility issues with v2.

Configuration

Next.js requires a two-part setup to avoid conflicts between OpenTelemetry configurations.

Step 1: Create instrumentation.ts

Create an instrumentation.ts file in your project root:
instrumentation.ts
import {
  ConsoleSpanExporter,
  SimpleSpanProcessor,
} from "@opentelemetry/sdk-trace-node";
import { OTLPHttpJsonTraceExporter, registerOTel } from "@vercel/otel";

const traceExporter = new OTLPHttpJsonTraceExporter({
  url: "https://gentrace.ai/api/otel/v1/traces",
  headers: {
    authorization: `Bearer ${process.env.GENTRACE_API_KEY}`,
  },
});

export function register() {
  registerOTel({
    serviceName: "your-app-name",
    spanProcessors: [
      new SimpleSpanProcessor(traceExporter),
      new SimpleSpanProcessor(new ConsoleSpanExporter()),
    ],
  });
}
Next.js runs instrumentation.ts automatically during server startup; on Next.js 15 and later, no additional configuration is needed.
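One caveat that comes from Next.js itself rather than this guide: on Next.js 13 and 14, the instrumentation file is still behind an experimental flag and must be enabled explicitly in next.config.js:

```javascript
// next.config.js — only needed on Next.js < 15, where the
// instrumentation hook is still experimental.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
```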

Step 2: Initialize Gentrace SDK

In your API routes, initialize Gentrace with otelSetup: false:
app/api/your-route/route.ts
import { init } from "gentrace";

// Initialize once at the top of your route file
init({
  apiKey: process.env.GENTRACE_API_KEY!,
  otelSetup: false, // Critical: Prevents conflicts with Vercel's OTEL
});
Always set otelSetup: false in Next.js to avoid conflicts between Gentrace’s OpenTelemetry setup and Vercel’s.
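Rather than repeating the init call at the top of every route file, one option (a sketch — the module path and import alias are illustrative, not part of the Gentrace SDK) is to move it into a shared module and import that module for its side effect. Node's module cache guarantees init() runs only once per server process:

```typescript
// lib/gentrace.ts — hypothetical shared module (path is illustrative).
// Importing this file anywhere runs init() exactly once,
// because Node caches the module after first evaluation.
import { init } from "gentrace";

init({
  apiKey: process.env.GENTRACE_API_KEY!,
  otelSetup: false, // prevent conflicts with Vercel's OTEL setup
});
```

Each route then needs only a bare side-effect import, e.g. `import "@/lib/gentrace";`, at the top of the file.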

Usage Example

Here’s a complete example of an API route with Gentrace tracing:
app/api/completion/route.ts
import { init, interaction } from "gentrace";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { NextRequest } from "next/server";

// Initialize Gentrace
init({
  apiKey: process.env.GENTRACE_API_KEY!,
  otelSetup: false,
});

export async function POST(req: NextRequest) {
  const { prompt } = await req.json();

  // Wrap your AI calls with interaction()
  const response = await interaction(
    "completion-request",
    async () => {
      return streamText({
        model: openai("gpt-4o-mini"),
        prompt,
      });
    },
    {
      pipelineId: process.env.GENTRACE_PIPELINE_ID!,
    }
  );

  return response.toDataStreamResponse();
}

Environment Variables

Add these to your .env.local:
.env.local
GENTRACE_API_KEY=your-api-key-here
GENTRACE_PIPELINE_ID=your-pipeline-id-here
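Because the route code reads both values with non-null assertions (`!`), a missing variable only surfaces at request time. A small fail-fast check can catch this at startup instead; this helper is entirely illustrative (the file name and function are not part of the Gentrace SDK):

```typescript
// lib/env.ts — hypothetical startup check for required Gentrace variables.
const required = ["GENTRACE_API_KEY", "GENTRACE_PIPELINE_ID"] as const;

// Throws with a descriptive message listing every missing variable,
// rather than failing later with an opaque runtime error.
export function assertGentraceEnv(env: NodeJS.ProcessEnv = process.env): void {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
}
```

Calling `assertGentraceEnv()` once in instrumentation.ts (or in the shared init module) makes misconfiguration visible on boot.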

Streaming Support

Next.js with the Vercel AI SDK supports streaming responses, and Gentrace automatically traces streaming interactions:
const response = await interaction(
  "streaming-conversation",
  async () => {
    return streamText({
      model: openai("gpt-4o-mini"),
      messages: conversation,
    });
  },
  {
    pipelineId: process.env.GENTRACE_PIPELINE_ID!,
  }
);

// Return streaming response to client
return response.toDataStreamResponse();
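On the client, the response body arrives as a standard ReadableStream. A minimal helper for draining it into a string is sketched below; this is illustrative only (a real chat UI would typically consume the stream through the AI SDK's client hooks, which also decode the data-stream protocol frames):

```typescript
// Drain a streaming Response body into a single string.
// Works in browsers and Node 18+, where ReadableStream and
// TextDecoder are available globally.
export async function readStream(
  body: ReadableStream<Uint8Array>
): Promise<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters split across
    // chunks from being decoded incorrectly.
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered bytes
}
```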