STD-BPE-CEREBRAS

A standard CerebrasGPT-111M model using a pretrained Byte-Pair-Encoding (BPE) tokenizer. This model is used as a baseline for understanding how pretrained tokenizers perform on Danish text.
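As context for the baseline comparison, a minimal sketch of why a pretrained, largely English-centric byte-level BPE tokenizer can fragment Danish text. The example word and counts below are illustrative assumptions, not output from this model's actual tokenizer; in practice one would load it with `AutoTokenizer.from_pretrained("meelu/STD-BPE-CEREBRAS")` from the `transformers` library.

```python
# Byte-level BPE operates on UTF-8 bytes, so Danish-specific letters
# (æ, ø, å) each occupy two bytes and tend to be split into more
# tokens than ASCII text of the same character length.
text = "blåbærgrød"  # a Danish word containing å, æ, and ø

chars = len(text)                        # 10 characters
utf8_bytes = len(text.encode("utf-8"))   # 13 bytes: å, æ, ø take 2 bytes each

print(f"{chars} characters -> {utf8_bytes} UTF-8 bytes")
```

A tokenizer whose merges were learned mostly on English sees these multi-byte sequences rarely, so Danish words typically decompose into more tokens per word, which is what this baseline is meant to measure.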

Model size: 111M parameters (Safetensors)
Tensor type: FP16
