Evaluation Overview

Evaluation sets let you quantify extraction accuracy against labeled ground truth before promoting changes to production.

Create a set

  1. Visit https://app.algorythmos.fr/evaluations and click New evaluation.
  2. Upload labeled documents (JSON, CSV, or via the labeling UI).
  3. Choose processors or workflows to run against the ground truth.
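The exact JSON schema the upload endpoint accepts is not documented on this page, so as a sketch, a labeled document might pair a document identifier with the expected value for each field. The field names below are illustrative assumptions:

```python
import json

# Hypothetical labeled-document record for an invoice processor.
# The schema (document_id / labels) is an assumption, not the
# documented upload format.
record = {
    "document_id": "inv-0001",
    "labels": {
        "invoice_number": "INV-2024-0042",
        "total_amount": "1250.00",
        "currency": "EUR",
    },
}

print(json.dumps(record, indent=2))
```

A CSV upload would typically flatten the same information into one column per field, with one row per document.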

Metrics

  • Precision / recall per field.
  • Table accuracy with row/column matching.
  • Latency percentiles to monitor performance regressions.
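To make per-field precision and recall concrete, here is a minimal sketch of how they can be computed from predictions and ground truth. The exact-string matching rule and counting conventions are assumptions; the platform's scoring may normalize values differently:

```python
# Per-field precision/recall sketch. Exact string equality as the match
# rule is an illustrative assumption.
def field_metrics(predictions, ground_truth):
    """Compare predicted field values to labeled values across documents."""
    tp = fp = fn = 0
    for pred, truth in zip(predictions, ground_truth):
        for field, true_value in truth.items():
            pred_value = pred.get(field)
            if pred_value is None:
                fn += 1          # labeled field missing from the prediction
            elif pred_value == true_value:
                tp += 1          # exact match
            else:
                fp += 1          # extracted, but wrong
                fn += 1          # the true value was still missed
        # fields predicted but absent from the labels count as false positives
        fp += sum(1 for f in pred if f not in truth)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

preds = [{"total": "100.00", "currency": "EUR"}]
truth = [{"total": "100.00", "currency": "USD"}]
print(field_metrics(preds, truth))  # → (0.5, 0.5): one match of two fields
```

Table accuracy works the same way at the cell level once rows and columns have been aligned between prediction and ground truth.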

Automate via API

curl -X POST https://api.algorythmos.fr/evaluations \
  -H "x-api-key: $ALG_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name":"invoice-regression","processor_id":"invoice-extractor","dataset_id":"ds_92F"}'

Track progress and results with GET /evaluations/{evaluation_id}, then promote the processor version once the metrics meet your targets.
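In a CI pipeline, tracking results usually means polling the evaluation until it finishes. The sketch below assumes the response carries a status field and a metrics object; neither shape is documented on this page. The fetch function is injected so the example runs with a stub:

```python
import time

# Polling sketch for GET /evaluations/{evaluation_id}. The response shape
# ({"status": ..., "metrics": ...}) is an assumption about the API.
def wait_for_evaluation(fetch, evaluation_id, poll_seconds=5, timeout=300):
    """Poll until the evaluation reaches a terminal state, then return it."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = fetch(f"/evaluations/{evaluation_id}")
        if result["status"] in ("completed", "failed"):
            return result
        time.sleep(poll_seconds)
    raise TimeoutError(f"evaluation {evaluation_id} did not finish in {timeout}s")

# Stubbed fetch for illustration; a real caller would issue the GET
# request with the x-api-key header shown above.
responses = iter([
    {"status": "running"},
    {"status": "completed", "metrics": {"precision": 0.97}},
])
result = wait_for_evaluation(lambda path: next(responses), "ev_123", poll_seconds=0)
print(result["metrics"]["precision"])
```

Gating the promotion step on the returned metrics (for example, failing the pipeline if precision drops below a threshold) turns the evaluation into a regression check.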