
Human evaluators

Some generative pipelines require manual review of outputs before you deploy changes. Human evaluators provide manual annotation workflows so that your team can grade those outputs by hand.

When creating an evaluator, write grading instructions in Markdown that your team can follow when reviewing results.
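
For example, a minimal set of grading instructions might look like the following. The rubric, labels, and criteria below are purely illustrative; write whatever rubric fits your pipeline.

```md
# Grading instructions

Read the model output alongside the original prompt, then assign a grade:

- **Pass** — the output is factually accurate, on topic, and fully answers the prompt.
- **Fail** — the output contains errors or hallucinations, or omits key information.

Leave a short comment explaining every **Fail** grade.
```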

Create human evaluator

Your team can then manually grade each generative output by following the Markdown instructions that you provide.