Version: 3.0.3

Getting Started

Welcome to the documentation for Gentrace. These resources walk through using our API and SDKs to evaluate and observe your generative data.

Evaluate

Gentrace Evaluate provides features to evaluate your generative AI output using AI, heuristics, and manual human grading. Our evaluation product was designed to be agnostic to the generative pipeline that you've built.

Observe

Gentrace Observe provides features to observe your production generative data. At this time, we offer custom SDK integrations for OpenAI and Pinecone; support for additional foundation model providers and vector stores is planned.

API consumption

The Gentrace API follows the REST architecture. It uses resource-oriented URLs, accepts JSON-encoded request bodies, returns JSON-encoded responses, and adheres to standard HTTP response codes, authentication, and verbs.
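
For illustration, here is a minimal sketch of calling the API directly over HTTP from TypeScript. The endpoint path and request fields are assumptions made for this example rather than documented routes; the pattern of Bearer authentication, JSON request bodies, and JSON responses is the part that carries over.

```typescript
// Minimal sketch of a direct REST call. The base URL, path, and payload
// fields are assumptions for illustration; see the API reference for the
// actual routes and schemas.
const response = await fetch("https://gentrace.ai/api/v1/pipeline-run", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Standard Bearer authentication with your Gentrace API key
    Authorization: `Bearer ${process.env.GENTRACE_API_KEY}`,
  },
  // JSON-encoded request body
  body: JSON.stringify({
    name: "my-generative-pipeline",
    metadata: { environment: "production" },
  }),
});

// Standard HTTP response codes signal success or failure
if (!response.ok) {
  throw new Error(`Gentrace API error: ${response.status}`);
}

// JSON-encoded response body
const result = await response.json();
```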

We recommend using our TypeScript (Node.js) or Python packages to instrument your application, since the integration is much simpler than calling the REST API directly.

Integration philosophy

For our observability features, our backend service does not route requests to external services like OpenAI on your behalf. Your servers dispatch requests directly to the LLM or vector store provider, which minimizes latency and maximizes privacy.

Our API and SDKs are designed to send performance and feedback data to our servers asynchronously, after your core logic completes.
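
The sketch below shows that flow, assuming the official OpenAI Node client and a hypothetical sendTelemetry helper that stands in for the SDK's submission call. The helper's name, endpoint, and payload shape are illustrative only.

```typescript
import OpenAI from "openai";

const openai = new OpenAI();

// Hypothetical helper standing in for the SDK's submission call; the
// endpoint and payload shape are assumptions for illustration.
async function sendTelemetry(run: Record<string, unknown>): Promise<void> {
  await fetch("https://gentrace.ai/api/v1/pipeline-run", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.GENTRACE_API_KEY}`,
    },
    body: JSON.stringify(run),
  });
}

async function generate(prompt: string): Promise<string | null> {
  const start = Date.now();

  // Your server calls the model provider directly; Gentrace is not in the
  // request path, so no extra latency is added.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: prompt }],
  });

  // Performance data is sent asynchronously after the core logic completes;
  // the call is deliberately not awaited so it never blocks the response.
  void sendTelemetry({
    pipeline: "chat-summarizer",
    elapsedMs: Date.now() - start,
    output: completion.choices[0].message.content,
  });

  return completion.choices[0].message.content;
}
```

Because the telemetry call is fire-and-forget, a failure to reach Gentrace never delays or breaks the user-facing response.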

Developer philosophy

We designed our SDK to require as few lines of code as possible, minimizing integration effort.

For our observability SDK, the instrumentation is a near type-match of the services it monitors. Keeping developer friction low is important to us.
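
As a hypothetical illustration of what a near type-match integration looks like: the import path, init call, and constructor below are stand-ins rather than the documented SDK exports, but the completion call itself is written exactly as it would be against the vanilla OpenAI client.

```typescript
// Hypothetical sketch only: the import path and init() signature are
// assumptions standing in for the real SDK exports.
import { init, OpenAI } from "@gentrace/openai";

init({ apiKey: process.env.GENTRACE_API_KEY });

// Constructed like the vanilla OpenAI client...
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// ...and called with the same shape, so existing code needs few changes.
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Summarize this support ticket." }],
});
```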

Although our core service is closed source, our SDKs and OpenAPI specification are MIT licensed. We welcome pull requests for fixes and new language support.