AULE_GPT - Fine-Tuned GPT-Neo 2.7B

This model is based on GPT-Neo 2.7B and has been fine-tuned for Aula Eléctrica.

Usage

You can load it with transformers:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
modelo = AutoModelForCausalLM.from_pretrained("AulaElectrica/AULE_GPT")
tokenizador = AutoTokenizer.from_pretrained("AulaElectrica/AULE_GPT")
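
Once loaded, you can generate text in the usual way. The snippet below is a minimal sketch reusing modelo and tokenizador from above and the standard transformers generate API; the prompt is a placeholder and max_new_tokens is an arbitrary choice, not a setting tuned for this model:

# Placeholder prompt; replace with your own text
entrada = tokenizador("Your prompt here", return_tensors="pt")
# Generate up to 50 new tokens (an arbitrary limit, not a recommended value)
salida = modelo.generate(**entrada, max_new_tokens=50)
print(tokenizador.decode(salida[0], skip_special_tokens=True))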
Model size: 2.72B parameters (Safetensors; tensor types F32 and U8)