## KoBART-base-v2

With the addition of chat data, the model is trained to handle the semantics of longer sequences than the original KoBART.

```python
from transformers import PreTrainedTokenizerFast, BartModel

# Load the fast tokenizer and the pretrained BART encoder-decoder weights.
tokenizer = PreTrainedTokenizerFast.from_pretrained('hyunwoongko/kobart')
model = BartModel.from_pretrained('hyunwoongko/kobart')
```
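
A minimal usage sketch (the input sentence is an arbitrary example, not from the card): a single forward pass through the loaded pair returns the decoder's hidden states.

```python
# Quick sanity check: tokenize a Korean sentence and run one forward pass.
inputs = tokenizer("안녕하세요.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, hidden_size)
```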

### Performance

NSMC (Naver Sentiment Movie Corpus, binary sentiment classification)

- Accuracy: 0.901
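
The card does not say how this score was obtained; below is a hypothetical fine-tuning setup using the standard `BartForSequenceClassification` head. The head is randomly initialized, so it would need to be trained on NSMC before approaching the number above.

```python
from transformers import BartForSequenceClassification, PreTrainedTokenizerFast

# Hypothetical NSMC setup: a 2-way classification head on top of the
# pretrained encoder-decoder. The head starts untrained.
tokenizer = PreTrainedTokenizerFast.from_pretrained('hyunwoongko/kobart')
model = BartForSequenceClassification.from_pretrained(
    'hyunwoongko/kobart', num_labels=2  # NSMC labels: negative / positive
)

batch = tokenizer(["영화 정말 재밌어요"], return_tensors="pt")
logits = model(**batch).logits  # shape: (1, 2); argmax gives the predicted label
```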

### hyunwoongko/kobart

- Added a bos/eos post processor (illustrated below)
- Removed token_type_ids
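
An illustrative check of both changes (the expected output is an assumption, not something the card shows):

```python
from transformers import PreTrainedTokenizerFast

tokenizer = PreTrainedTokenizerFast.from_pretrained('hyunwoongko/kobart')
enc = tokenizer("안녕하세요.")

# The post processor should wrap the sequence in <s> ... </s> automatically.
print(tokenizer.convert_ids_to_tokens(enc.input_ids))
# token_type_ids were removed, so the encoding should not contain them.
print('token_type_ids' in enc)  # expected: False
```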