---
title: WER Evaluation Tool
emoji: 🎯
colorFrom: blue
colorTo: red
sdk: gradio
sdk_version: 5.16.0
app_file: app.py
pinned: false
---

# WER Evaluation Tool

This Gradio app provides a user-friendly interface for calculating the Word Error Rate (WER) and related metrics between a reference text and a hypothesis text. It is particularly useful for evaluating speech recognition or machine translation output.

## Features

- Calculate WER (Word Error Rate), MER (Match Error Rate), WIL (Word Information Lost), and WIP (Word Information Preserved)
- Text normalization options
- Custom word filtering
- Detailed error analysis
- Example inputs for testing
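All four metrics derive from a word-level alignment that counts hits (H), substitutions (S), deletions (D), and insertions (I). A minimal self-contained sketch of the standard formulas, not the app's actual implementation in `app.py`:

```python
def word_counts(ref_words, hyp_words):
    """Count hits (H), substitutions (S), deletions (D), and insertions (I)
    from a minimum-edit-distance alignment of two word sequences."""
    m, n = len(ref_words), len(hyp_words)
    # dp[i][j] = minimum edits to align ref_words[:i] with hyp_words[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = dp[i - 1][j - 1] + (ref_words[i - 1] != hyp_words[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    # Backtrack through the table to classify each alignment step.
    H = S = D = I = 0
    i, j = m, n
    while i > 0 or j > 0:
        if i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + (
            ref_words[i - 1] != hyp_words[j - 1]
        ):
            if ref_words[i - 1] == hyp_words[j - 1]:
                H += 1
            else:
                S += 1
            i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            D += 1
            i -= 1
        else:
            I += 1
            j -= 1
    return H, S, D, I

def metrics(reference, hypothesis):
    """Compute WER, MER, WIL, and WIP from two whitespace-split strings."""
    ref, hyp = reference.split(), hypothesis.split()
    H, S, D, I = word_counts(ref, hyp)
    wer = (S + D + I) / max(H + S + D, 1)          # errors per reference word
    mer = (S + D + I) / max(H + S + D + I, 1)      # errors per alignment step
    wip = (H / (H + S + D)) * (H / (H + S + I)) if H else 0.0
    wil = 1.0 - wip
    return {"wer": wer, "mer": mer, "wil": wil, "wip": wip}
```

For example, `metrics("the quick brown fox", "the quick brown dog")` has one substitution in four reference words, giving a WER of 0.25.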

## How to Use

  1. Enter or paste your reference text
  2. Enter or paste your hypothesis text
  3. Configure options (normalization, word filtering)
  4. Click "Calculate WER" to see results
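When normalization and word filtering are enabled, both texts are cleaned before scoring. A rough sketch of what such options typically do; the `normalize` helper here is illustrative, not the app's actual API:

```python
import re

def normalize(text, lowercase=True, strip_punct=True, filter_words=None):
    """Illustrative text normalization: the option names mirror typical
    WER-tool settings, but the exact behavior in app.py may differ."""
    if lowercase:
        text = text.lower()
    if strip_punct:
        # Drop anything that is not a word character or whitespace.
        text = re.sub(r"[^\w\s]", "", text)
    words = text.split()
    if filter_words:
        # Remove unwanted tokens, e.g. filler words like "um".
        words = [w for w in words if w not in filter_words]
    return " ".join(words)
```

With this, `normalize("Hello, World!")` yields `"hello world"`, so punctuation and casing differences no longer count as word errors.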

## Local Development

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/wer-evaluation-tool.git
   cd wer-evaluation-tool
   ```

2. Create and activate a virtual environment using uv:

   ```bash
   uv venv
   source .venv/bin/activate   # On Unix/macOS
   # or
   .venv\Scripts\activate      # On Windows
   ```

3. Install dependencies:

   ```bash
   uv pip install -r requirements.txt
   ```

4. Run the app locally:

   ```bash
   uv run python app_gradio.py
   ```

## Installation

You can install the package directly from PyPI:

```bash
uv pip install wer-evaluation-tool
```

## Testing

Run the test suite using pytest:

```bash
uv run pytest tests/
```
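Tests follow the usual pytest conventions. A sketch of what a test in this style might look like; the module and function names here are hypothetical, not the repo's actual test suite:

```python
# tests/test_wer_example.py -- illustrative only; the actual tests in
# tests/ may be named and structured differently.
from math import isclose

def simple_wer(reference: str, hypothesis: str) -> float:
    """Toy WER for equal-length word sequences (substitutions only)."""
    ref, hyp = reference.split(), hypothesis.split()
    assert len(ref) == len(hyp), "toy version assumes equal lengths"
    return sum(r != h for r, h in zip(ref, hyp)) / len(ref)

def test_identical_texts_score_zero():
    assert simple_wer("hello world", "hello world") == 0.0

def test_one_substitution_in_four_words():
    assert isclose(simple_wer("the quick brown fox", "the quick brown dog"), 0.25)
```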

## Contributing

1. Fork the repository
2. Create a new branch (`git checkout -b feature/improvement`)
3. Make your changes
4. Run tests to ensure everything works
5. Commit your changes (`git commit -am 'Add new feature'`)
6. Push to the branch (`git push origin feature/improvement`)
7. Create a Pull Request

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments

- Thanks to all contributors who have helped with the development
- Inspired by the need for better speech recognition evaluation tools
- Built with Gradio

## Contact

For questions or feedback, please:

- Open an issue in the GitHub repository
- Contact the maintainers at [email/contact information]

## Citation

If you use this tool in your research, please cite:

```bibtex
@software{wer_evaluation_tool,
  title = {WER Evaluation Tool},
  author = {Your Name},
  year = {2024},
  url = {https://github.com/yourusername/wer-evaluation-tool}
}
```