---
title: TransformersPipelinePlayground
emoji: 💻
colorFrom: green
colorTo: indigo
sdk: gradio
sdk_version: 5.19.0
app_file: app.py
pinned: false
license: mit
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

## Transformers Pipeline Playground 🎡🤖

**Welcome to the Transformers Pipeline Playground!** This project provides an interactive interface for exploring and experimenting with transformer models via Hugging Face's `transformers` library. Whether you are a seasoned NLP practitioner or just getting started, the playground offers hands-on experience with state-of-the-art models.

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python 3.9+](https://img.shields.io/badge/python-%3E=3.9-blue.svg)](https://www.python.org/downloads)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Python CI](https://github.com/canstralian/transformers-pipeline-playground/actions/workflows/ci.yml/badge.svg)](https://github.com/canstralian/transformers-pipeline-playground/actions/workflows/ci.yml)

## Features ✨

- **Interactive model exploration:** Load and test different transformer models directly in your browser.
- **User-friendly interface:** Uses Gradio to provide an accessible web-based UI.
- **Flexible pipeline selection:** Choose from a variety of pipelines, such as text generation, sentiment analysis, and more.

## Installation 🛠️

To set up the Transformers Pipeline Playground locally, follow these steps:

1. Clone the repository:

   ```bash
   git clone https://github.com/canstralian/transformers-pipeline-playground.git
   cd transformers-pipeline-playground
   ```

2. Install the dependencies. It is recommended to use a virtual environment:

   ```bash
   python3 -m venv env
   source env/bin/activate  # On Windows, use `env\Scripts\activate`
   ```

   Then install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

## Usage 🚀

After installing the dependencies, launch the application with:

```bash
python app.py
```

This starts a local server. Open your browser and navigate to the displayed URL to access the interface.

## How It Works 🧠

The application uses Hugging Face's `transformers` library to load pre-trained models and build pipelines for various NLP tasks. The user interface is built with Gradio, providing an easy way to interact with the models. A minimal sketch of this pattern is shown at the end of this README.

## Contributing 🤝

Contributions are welcome! If you have ideas for improvements or new features, feel free to open an issue or submit a pull request.

## License 📄

This project is licensed under the MIT License. See the LICENSE file for details.

**Note:** With great transformer power comes great responsibility. Use the models ethically and consider the implications of their outputs.
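
## Minimal Pipeline Example 🧪

The sketch below illustrates the pattern described in "How It Works": load a pre-trained `pipeline` from `transformers` and expose it through a Gradio interface. It is an illustrative example only; the task, function names, and UI layout are assumptions and may differ from the actual `app.py`.

```python
import gradio as gr
from transformers import pipeline

# Load a pre-trained sentiment-analysis pipeline
# (the default model is downloaded on first use).
classifier = pipeline("sentiment-analysis")

def classify(text: str) -> str:
    # The pipeline returns a list of {"label", "score"} dicts; format the top result.
    result = classifier(text)[0]
    return f"{result['label']} ({result['score']:.3f})"

# Wrap the function in a simple Gradio UI: one text input, one text output.
demo = gr.Interface(
    fn=classify,
    inputs="text",
    outputs="text",
    title="Transformers Pipeline Playground (demo)",
)

if __name__ == "__main__":
    demo.launch()
```

Running this file with `python` starts a local Gradio server, mirroring the `python app.py` workflow described in the Usage section.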