DebertaV2 Base pretrained on Malaysian text with a 512-token masked-language-modelling context length

Special thanks to https://github.com/aisyahrzk for pretraining DebertaV2 Base.

WandB logs at https://wandb.ai/aisyahrazak/deberta-base?nw=nwuseraisyahrazak
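The pretraining objective above is standard BERT-style masked language modelling over 512-token sequences. The sketch below illustrates how that masking is typically applied (the `[MASK]` token id, vocabulary size, and token ids here are illustrative placeholders, not the values from this model's actual tokenizer):

```python
import random

MASK_ID = 4        # hypothetical [MASK] token id, not this model's real one
VOCAB_SIZE = 32000  # placeholder vocabulary size
SEQ_LEN = 512       # pretraining context length

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """BERT-style MLM masking: select ~15% of positions as prediction
    targets; of those, 80% become [MASK], 10% a random token, and 10%
    stay unchanged. Labels are -100 (ignored by the loss) elsewhere."""
    rng = rng or random.Random(0)
    inputs = list(token_ids)
    labels = [-100] * len(inputs)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok          # model must predict the original token
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID              # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: leave the input token unchanged
    return inputs, labels

tokens = list(range(100, 100 + SEQ_LEN))
masked, labels = mask_tokens(tokens)
```

The `labels` list uses `-100` because that is the ignore index of the cross-entropy loss in common training frameworks, so only the selected positions contribute to the MLM loss.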

Model size: 114M parameters
Tensor type: BF16 (safetensors)

Model: mesolitica/malaysian-debertav2-base (1 finetuned model derives from this base)