---
title: Srt Eval
emoji: 🌍
colorFrom: green
colorTo: indigo
sdk: gradio
sdk_version: 5.4.0
app_file: app.py
pinned: false
license: mit
short_description: Visualize CER / WER for SRT subtitles
---
|

# SRT Evaluation Tool

This Gradio app compares two SRT files and calculates Character Error Rate (CER) and Word Error Rate (WER), both with and without punctuation, and provides a detailed visualization of the differences between the files.
|

## Features

- Upload and compare two SRT files
- Calculate CER / WER metrics
- Visualize text differences
- Download the visualization as PNG or PDF
- Example files included for testing
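Before any metric can be computed, the subtitle text has to be separated from the cue numbers and timestamps in each SRT block. A rough sketch of that extraction step, assuming standard SRT formatting (a hypothetical helper, not necessarily how `app.py` implements it):

```python
import re

def srt_to_text(srt_content):
    """Extract subtitle text from SRT content, dropping cue numbers and timestamps."""
    lines = []
    # SRT cues are separated by blank lines.
    for block in re.split(r"\n\s*\n", srt_content.strip()):
        for line in block.splitlines():
            line = line.strip()
            # Skip the cue number and the "00:00:01,000 --> 00:00:04,000" timing line.
            if line.isdigit() or "-->" in line:
                continue
            if line:
                lines.append(line)
    return " ".join(lines)
```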
|

## Usage

1. Upload a reference (golden) SRT file
2. Upload a target SRT file for comparison
3. Click "Process Files" to see the results
4. Or use "Load Example" to try the tool with the sample files
|

## About

This tool is particularly useful for evaluating machine-generated subtitles against human-created references, supporting both Chinese and English text.
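Word-level metrics are ill-defined for Chinese, which is written without spaces, so a common workaround is to score CJK text at the character level. One way to tokenize mixed Chinese/English input along those lines (an assumption about the general approach, not code taken from the app):

```python
import re

def tokenize_mixed(text):
    """Tokenize mixed text: each CJK character is its own token,
    while runs of non-CJK, non-space characters form word tokens."""
    return re.findall(r"[\u4e00-\u9fff]|[^\s\u4e00-\u9fff]+", text)
```

Feeding these tokens into a word-level edit distance makes WER behave like CER on the Chinese portions while still counting English words whole.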
|