# XLM-RoBERTa Base for Topic Classification

XLM-RoBERTa base fine-tuned for topic classification of Russian social-media posts.

  • Trained on: 512_posts_24_topics.csv
  • Epochs: 17
  • Learning rate: 2e-05
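The hyperparameters above can be expressed as a `transformers` training configuration. This is a hypothetical sketch, not the script actually used: only the learning rate and epoch count come from this card; the output directory and evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

# Hypothetical setup — only learning_rate and num_train_epochs are from the card.
training_args = TrainingArguments(
    output_dir="xlmr-base-topics",   # assumed name, not from the card
    learning_rate=2e-5,              # from the card
    num_train_epochs=17,             # from the card
    evaluation_strategy="epoch",     # assumed, to match the per-epoch metrics below
)
```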
| Epoch | Training Loss | Validation Loss | Accuracy | F1 Macro | F1 Weighted |
|------:|--------------:|----------------:|---------:|---------:|------------:|
| 1  | No log | 3.026107 | 0.107527 | 0.018696 | 0.033548 |
| 2  | No log | 2.870733 | 0.182796 | 0.052066 | 0.116982 |
| 3  | No log | 2.696235 | 0.225806 | 0.062794 | 0.135901 |
| 4  | No log | 2.468899 | 0.301075 | 0.117249 | 0.195119 |
| 5  | No log | 2.275375 | 0.365591 | 0.164985 | 0.264671 |
| 6  | No log | 2.103708 | 0.430108 | 0.246894 | 0.348294 |
| 7  | No log | 1.953564 | 0.494624 | 0.297129 | 0.416058 |
| 8  | No log | 1.905395 | 0.537634 | 0.356591 | 0.465515 |
| 9  | No log | 1.783440 | 0.516129 | 0.349648 | 0.457348 |
| 10 | No log | 1.733219 | 0.559140 | 0.385719 | 0.496481 |
| 11 | No log | 1.668128 | 0.537634 | 0.361045 | 0.470044 |
| 12 | No log | 1.633103 | 0.559140 | 0.387321 | 0.498477 |
| 13 | No log | 1.588825 | 0.569892 | 0.392370 | 0.508810 |
| 14 | No log | 1.581068 | 0.591398 | 0.410717 | 0.533581 |
| 15 | No log | 1.560641 | 0.591398 | 0.413310 | 0.533612 |
| 16 | No log | 1.545080 | 0.580645 | 0.403768 | 0.525466 |
| 17 | No log | 1.550952 | 0.591398 | 0.410717 | 0.533581 |
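A small sketch of picking the best checkpoint from these numbers: the metric values are copied from the table above, and the selection criterion (macro F1, which is the fairer choice for 24 imbalanced topics) is illustrative.

```python
# Per-epoch validation metrics from the table: epoch -> (accuracy, macro F1).
metrics = {
    1:  (0.107527, 0.018696),
    2:  (0.182796, 0.052066),
    3:  (0.225806, 0.062794),
    4:  (0.301075, 0.117249),
    5:  (0.365591, 0.164985),
    6:  (0.430108, 0.246894),
    7:  (0.494624, 0.297129),
    8:  (0.537634, 0.356591),
    9:  (0.516129, 0.349648),
    10: (0.559140, 0.385719),
    11: (0.537634, 0.361045),
    12: (0.559140, 0.387321),
    13: (0.569892, 0.392370),
    14: (0.591398, 0.410717),
    15: (0.591398, 0.413310),
    16: (0.580645, 0.403768),
    17: (0.591398, 0.410717),
}

# Select the epoch with the highest macro F1 (index 1 of each tuple).
best_epoch = max(metrics, key=lambda e: metrics[e][1])
print(best_epoch, metrics[best_epoch])  # epoch 15 has the highest macro F1
```

Note that accuracy plateaus at 0.591398 from epoch 14 onward, so macro F1 is what breaks the tie here.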
  • Model size: 278M params (Safetensors)
  • Tensor type: F32