
not loading from checkpoint

#18
by tcapelle - opened

This model is not packaged correctly, and I am not even sure it loads the model checkpoint shared in this repo.
It could inherit from T5 and also store the tokenizer here.

Can you provide more details, such as the code you tried and the error message?

I feel that the model should be packaged as a plain T5 model, with the prediction logic in the pipeline (not pulling a tokenizer from another repo in the model's init).
I will try to put together a PR with the changes.

OK, I did a repack here:

https://huggingface.co/tcapelle/hallu_scorer

The issue I was facing is that we shouldn't load a tokenizer during model init; that's what the pipeline should be doing.
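Something like this is what I have in mind. It's just a sketch of the separation, not the repo's actual API: the pipeline receives whatever model and tokenizer the caller already loaded (e.g. via `AutoModel`/`AutoTokenizer`), and the model's init never touches the Hub. The class name, method names, and template string are all illustrative:

```python
class ScoringPipeline:
    """Sketch: the pipeline owns the tokenizer and the prompt formatting;
    the model stays a plain T5 and never loads anything in its __init__.
    All names here are illustrative, not the repo's real API."""

    def __init__(self, model, tokenizer, template="{premise} </s> {hypothesis}"):
        self.model = model          # any callable that scores a batch
        self.tokenizer = tokenizer  # injected by the caller, not fetched here
        self.template = template    # prompt formatting lives in the pipeline

    def preprocess(self, pairs):
        """Render (premise, hypothesis) pairs into model inputs."""
        texts = [self.template.format(premise=p, hypothesis=h) for p, h in pairs]
        return self.tokenizer(texts)

    def __call__(self, pairs):
        return self.model(self.preprocess(pairs))
```

The point of the dependency injection is that the model checkpoint can be shipped on its own, and anyone can swap the tokenizer or template without touching custom model code.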

Also, the underlying model is just a flan-t5-base for token classification, so there's no need to subclass. We could actually put the prompt template in the tokenizer.
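For example, the prompt template can be plain preprocessing applied before tokenization, rather than logic baked into a custom model class. The template string below is a placeholder, not the one this repo actually uses:

```python
# Placeholder template for a premise/hypothesis classification prompt;
# the real wording would come from the repo's own template.
PROMPT_TEMPLATE = (
    "Determine if the hypothesis is true given the premise.\n\n"
    "Premise: {premise}\n\nHypothesis: {hypothesis}"
)

def build_prompts(pairs):
    """Render (premise, hypothesis) pairs into text the tokenizer can consume."""
    return [PROMPT_TEMPLATE.format(premise=p, hypothesis=h) for p, h in pairs]
```

With this in place, the model class itself has nothing custom left in it, and the checkpoint can load with the stock T5 classes.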
