Update README.md
README.md (CHANGED)

````diff
@@ -14,7 +14,7 @@ The model categorizes images into two classes:
 - **Class 0:** "Unsafe Content" – indicating that the image contains vulgarity, nudity, or explicit content.
 - **Class 1:** "Safe Content" – indicating that the image is appropriate and does not contain any unsafe elements.
 
-# **Run with Transformers
+# **Run with Transformers🤗**
 
 ```python
 !pip install -q transformers torch pillow gradio
````
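For context on what the installed dependencies are used for, here is a minimal, hypothetical inference sketch built on the same `transformers` and `pillow` packages. The model ID `your-username/safe-content-classifier` is a placeholder (the actual repository name is not shown in this diff), and the standard `image-classification` pipeline is assumed to be compatible with the checkpoint.

```python
from PIL import Image
from transformers import pipeline

# Placeholder model ID; replace with the actual Hugging Face repository for this classifier.
MODEL_ID = "your-username/safe-content-classifier"

# Assumes the checkpoint works with the standard image-classification pipeline.
classifier = pipeline("image-classification", model=MODEL_ID)

# Classify a local image; labels correspond to the two classes described in the README
# ("Unsafe Content" vs. "Safe Content"), each with a confidence score.
image = Image.open("example.jpg").convert("RGB")
for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```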