Version: 4.7.28

PipelineRun class - OpenAI handler

The PipelineRun class instance exposes an OpenAI handler that simplifies capturing generative output in Gentrace.

For a guided walkthrough of the Gentrace OpenAI integration, visit our docs here.

Usage

Once a PipelineRun instance is created, there are two ways to create a handle for communicating with OpenAI.

Simple

typescript
const openai = new OpenAI({
  apiKey: process.env.OPENAI_KEY,
});

const chatCompletionResponse = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Hello! What's the capital of Maine?" }],
  model: "gpt-3.5-turbo",
  stream: true,
});

Advanced

The advanced method allows you to attach multiple steps to a single PipelineRun instance.

typescript
const runner = pipeline.start();

const openai = runner.openai;

const chatCompletionResponse = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Hello! What's the capital of Maine?" }],
  model: "gpt-3.5-turbo",
  stream: true,
});

await runner.submit();

Chat completion

The openai.chat.completions.create() method wraps the equivalent OpenAI Node.js chat completion API.

Arguments

All of the original method parameters are supported. The key-value pairs below augment the defaults.

pipelineSlug?: string

For the Simple SDK, you can specify the pipeline slug here.
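For illustration, a Simple SDK request that includes pipelineSlug might be shaped as follows. This is a sketch of the payload only; the slug value is a placeholder, not a real pipeline.

```typescript
// Sketch of a Simple SDK chat completion request: the usual OpenAI
// parameters plus the Gentrace-specific pipelineSlug. The slug value
// is a placeholder.
const request = {
  pipelineSlug: "example-pipeline", // placeholder; use your pipeline's slug
  messages: [
    { role: "user" as const, content: "Hello! What's the capital of Maine?" },
  ],
  model: "gpt-3.5-turbo",
};

// With the Simple SDK handle, this would be passed directly:
// const response = await openai.chat.completions.create(request);
```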

messages: { role: string, content: string, contentTemplate?: string, contentInputs?: object }[]

Unlike the original, each element of the messages array optionally accepts templated values via the contentTemplate and contentInputs keys.

typescript
const runner = pipeline.start();

const openai = runner.openai;

const chatCompletionResponse = await openai.chat.completions.create({
  messages: [
    {
      role: "user",
      contentTemplate: "Hello {{ name }}!",
      contentInputs: { name: "Vivek" },
    },
  ],
  model: "gpt-3.5-turbo",
  stream: true,
});
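As a sketch of what the templating amounts to, {{ name }}-style placeholders in contentTemplate are filled from contentInputs before the message is sent. The renderTemplate helper below is our own illustrative stand-in, not the SDK's actual implementation.

```typescript
// Illustrative Mustache-style substitution: replaces {{ key }} placeholders
// in the template with matching values from the inputs object. This helper
// is an assumption for demonstration, not the SDK's internal code.
function renderTemplate(
  template: string,
  inputs: Record<string, string>
): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, key) =>
    key in inputs ? inputs[key] : match
  );
}

const content = renderTemplate("Hello {{ name }}!", { name: "Vivek" });
// content === "Hello Vivek!"
```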

gentrace?: object

This object contains Gentrace context. Learn about context here.
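As an illustration of the shape, here is a gentrace context object carrying metadata. The { type, value } entry structure follows the structured-outputs example later on this page; the specific key and value are placeholders.

```typescript
// Illustrative gentrace context object. The metadata entry shape
// ({ type, value }) mirrors the structured-outputs example on this page;
// the promptVersion key and its value are placeholders.
const gentrace = {
  metadata: {
    promptVersion: {
      type: "string",
      value: "v2",
    },
  },
};

// Passed alongside the other arguments, e.g.:
// await openai.chat.completions.create({ ...params, gentrace });
```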

Return value

Resolves to the original OpenAI response. If you're using the Simple SDK, the response has an additional pipelineRunId.

pipelineRunId?: string (UUID)

Only available if you're using the Simple SDK.

Embedding

The openai.embeddings.create() method wraps the equivalent OpenAI Node.js embedding API.

Arguments

All of the original method parameters are supported. The key-value pairs below augment the defaults.

pipelineSlug?: string

If you're using the Simple SDK, you can specify the pipeline slug here.

typescript
const runner = pipeline.start();

const openai = runner.openai;

const embeddingResponse = await openai.embeddings.create({
  model: "text-embedding-ada-002",
  input: "The capital of Maine is Augusta",
});

await runner.submit();

gentrace?: object

This object contains Gentrace context. Learn about context here.

Return value

Resolves to the original OpenAI response. If you're using the Simple SDK, the response has an additional pipelineRunId.

pipelineRunId?: string (UUID)

Only available if you're using the Simple SDK.

Structured Outputs

The openai.beta.chat.completions.parse() method wraps the equivalent OpenAI structured output API for chat completions.

info

Structured outputs are currently in beta with OpenAI and may change as OpenAI continues to develop and refine the feature.

This allows you to define a specific response structure, making it easier to use the generated content. Gentrace's OpenAI integration fully supports this feature. For more details, see the OpenAI documentation on structured outputs.

Arguments

All of the original method parameters are supported. The key-value pairs below augment the defaults.

pipelineSlug?: string

If you're using the Simple SDK, you can specify the pipeline slug here.

gentrace?: object

This object contains Gentrace context. Learn about context here.

Return value

Resolves to the parsed response according to the specified response_format. If you're using the Simple SDK, the response has an additional pipelineRunId.

pipelineRunId?: string (UUID)

Only available if you're using the Simple SDK.

Example

typescript
import { z } from "zod";
import { zodResponseFormat } from "openai/helpers/zod";

const Step = z.object({
  explanation: z.string(),
  output: z.string(),
});

const MathReasoning = z.object({
  steps: z.array(Step),
  final_answer: z.string(),
});

// Omit Gentrace pipeline initialization...
const runner = pipeline.start();

const completion = await runner.openai.beta.chat.completions.parse({
  model: "gpt-4o-2024-08-06",
  messages: [
    {
      role: "system",
      content:
        "You are a helpful math tutor. Guide the user through the solution step by step.",
    },
    { role: "user", content: "how can I solve 8x + 7 = -23" },
  ],
  response_format: zodResponseFormat(MathReasoning, "math_reasoning"),
  gentrace: {
    metadata: {
      problemType: {
        type: "string",
        value: "linear_equation",
      },
    },
  },
});

const mathReasoning = completion.choices[0].message.parsed;

await runner.submit();