sacrebleu Playground

BLEU score computation for machine translation evaluation

Getting started with sacrebleu
Install
pip install sacrebleu
Python Code

sacrebleu is a third-party package for BLEU score computation in machine translation evaluation.

Challenge

Try modifying the code above to explore different behaviors. For example, can you score a single sentence instead of a whole corpus, or add a second reference translation per hypothesis?