Human evaluators
Some generative pipelines may require manual review before you deploy changes. Human evaluators provide manual annotation workflows that let your team grade generative outputs by hand.
When creating an evaluator, write grading instructions in markdown. Your team then follows those instructions to manually grade each generative output.
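For example, grading instructions for a summarization pipeline might look like the sketch below. The rubric, scale, and wording are purely illustrative, not a required format:

```md
## Grading instructions: summary quality

Read the source document and the generated summary, then assign a score:

- **1 (Poor):** The summary contradicts the source or omits its main point.
- **2 (Acceptable):** The summary covers the main point but misses key details.
- **3 (Excellent):** The summary is accurate, complete, and concise.

Flag any output that contains hallucinated facts, regardless of score.
```

Concrete anchors like a numbered scale and an explicit escalation rule (the "flag" step above) tend to keep grades consistent across different reviewers.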