ALBERT XLarge Spanish

This is an ALBERT XLarge model trained on a large Spanish corpus. The model was trained on a single TPU v3-8 with the following hyperparameters and steps/time:

  • LR: 0.0003125
  • Batch Size: 128
  • Warmup ratio: 0.00078125
  • Warmup steps: 6250
  • Goal steps: 8000000
  • Total steps: 2775000
  • Total training time (approx.): 64.2 days
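The listed values are internally consistent, which a few lines of arithmetic can confirm. The sketch below (not part of the original training code) checks that the warmup ratio equals warmup steps divided by goal steps, and derives the fraction of goal steps completed and the approximate training throughput from the figures above.

```python
# Sanity checks on the reported hyperparameters (values copied from the list above).
goal_steps = 8_000_000
warmup_steps = 6_250
total_steps = 2_775_000
training_days = 64.2

# Warmup ratio = warmup steps / goal steps.
warmup_ratio = warmup_steps / goal_steps
print(f"Warmup ratio: {warmup_ratio}")  # 0.00078125, matching the listed value

# Training stopped well short of the goal step count.
progress = total_steps / goal_steps
print(f"Fraction of goal steps reached: {progress:.1%}")  # ~34.7%

# Implied throughput over the reported wall-clock time.
steps_per_second = total_steps / (training_days * 86_400)
print(f"Approx. throughput: {steps_per_second:.2f} steps/s")
```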

Training loss

The training loss curve is available at: https://drive.google.com/uc?export=view&id=1rw0vvqZY9LZAzRUACLjmP18Fc6D1fv7x

