🔥 Fine-Tuned BERT on GoEmotions Dataset

📖 Model Overview

This model is a fine-tuned version of BERT (bert-base-uncased) on the GoEmotions dataset for multi-label emotion classification. Given an input text, it predicts one or more of 28 labels (27 emotion categories plus Neutral).

📊 Performance

Metric        Score
Accuracy      46.57%
F1 Score      56.41%
Hamming Loss   3.39%

(Accuracy and F1 are higher-is-better; Hamming loss is lower-is-better.)
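
A minimal sketch of how such metrics can be computed with scikit-learn. The 0.5 decision threshold, micro-averaged F1, and exact-match accuracy are assumptions; the card does not state which variants were used:

import numpy as np
from sklearn.metrics import accuracy_score, f1_score, hamming_loss

# Toy gold labels and sigmoid outputs for 2 samples x 3 classes.
# The 0.5 threshold, micro F1, and exact-match accuracy are assumptions.
y_true = np.array([[1, 0, 1], [0, 1, 0]])
y_prob = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.3]])
y_pred = (y_prob >= 0.5).astype(int)

print("Accuracy:", accuracy_score(y_true, y_pred))            # exact-match (subset) accuracy
print("F1 Score:", f1_score(y_true, y_pred, average="micro"))
print("Hamming Loss:", hamming_loss(y_true, y_pred))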

📂 Model Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load model and tokenizer
model_name = "codewithdark/bert-Gomotions"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# GoEmotions label names (28 classes), in the model's output order
emotion_labels = [
    "Admiration", "Amusement", "Anger", "Annoyance", "Approval", "Caring", "Confusion",
    "Curiosity", "Desire", "Disappointment", "Disapproval", "Disgust", "Embarrassment",
    "Excitement", "Fear", "Gratitude", "Grief", "Joy", "Love", "Nervousness", "Optimism",
    "Pride", "Realization", "Relief", "Remorse", "Sadness", "Surprise", "Neutral"
]

# Example text
text = "I'm so happy today!"
inputs = tokenizer(text, return_tensors="pt")

# Predict
with torch.no_grad():
    outputs = model(**inputs)
    probs = torch.sigmoid(outputs.logits).squeeze(0)  # Convert logits to probabilities

# Get top 5 predictions
top5_indices = torch.argsort(probs, descending=True)[:5]  # Get indices of top 5 labels
top5_labels = [emotion_labels[i] for i in top5_indices]
top5_probs = [probs[i].item() for i in top5_indices]

# Print results
print("Top 5 Predicted Emotions:")
for label, prob in zip(top5_labels, top5_probs):
    print(f"{label}: {prob:.4f}")

'''
Example output:
Top 5 Predicted Emotions:
Joy: 0.9478
Love: 0.7854
Optimism: 0.6342
Admiration: 0.5678
Excitement: 0.5231
'''
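
Since the model is multi-label, a fixed probability cutoff is often more useful than a top-5 list. Building on the snippet above (the 0.5 cutoff is an assumption; tune it on validation data):

# Keep every emotion whose probability clears the cutoff.
threshold = 0.5  # assumed value, not tuned
for i, p in enumerate(probs.tolist()):
    if p >= threshold:
        print(f"{emotion_labels[i]}: {p:.4f}")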

πŸ‹οΈβ€β™‚οΈ Training Details

  • Model: bert-base-uncased
  • Dataset: GoEmotions
  • Optimizer: AdamW
  • Loss Function: BCEWithLogitsLoss (Binary Cross-Entropy for multi-label classification)
  • Batch Size: 16
  • Epochs: 3
  • Evaluation Metrics: Accuracy, F1 Score, Hamming Loss

📌 How to Use with the Hugging Face Pipeline

from transformers import pipeline

classifier = pipeline("text-classification", model="codewithdark/bert-Gomotions", top_k=None)
classifier("I'm so excited about the trip!")

πŸ› οΈ Citation

If you use this model, please cite:

@misc{codewithdark2025goemotions,
  author = {codewithdark},
  title = {Fine-tuned BERT on GoEmotions},
  year = {2025},
  url = {https://huggingface.co/codewithdark/bert-Gomotions}
}