StevenTang committed
Commit fa08e14 · 1 Parent(s): 752a3ec

Update README

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -29,7 +29,7 @@ The MVP-multi-task model was proposed in [**MVP: Multi-task Supervised Pre-train
 The detailed information and instructions can be found [https://github.com/RUCAIBox/MVP](https://github.com/RUCAIBox/MVP).
 
 ## Model Description
-MVP-multi-task is a prompt-based model that MVP is further equipped with prompts pre-trained using labeled open dialogue system datasets. It is a variant (MVP+M) of our main [MVP](https://huggingface.co/RUCAIBox/mvp) model. It follows a Transformer encoder-decoder architecture with layer-wise prompts.
+MVP-multi-task is a prompt-based model that MVP is further equipped with prompts pre-trained using a mixture of labeled datasets. It is a variant (MVP+M) of our main [MVP](https://huggingface.co/RUCAIBox/mvp) model. It follows a Transformer encoder-decoder architecture with layer-wise prompts.
 
 MVP is specially designed for natural language generation and can be adapted to a wide range of generation tasks, including but not limited to summarization, data-to-text generation, open-ended dialogue system, story generation, question answering, question generation, task-oriented dialogue system, commonsense generation, paraphrase generation, text style transfer, and text simplification. Our model can also be adapted to natural language understanding tasks such as sequence classification and (extractive) question answering.
 
@@ -39,7 +39,7 @@ For summarization:
 >>> from transformers import MvpTokenizer, MvpForConditionalGeneration
 
 >>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
->>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")
+>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-multi-task")
 
 >>> inputs = tokenizer(
 ...     "Summarize: You may want to stick it to your boss and leave your job, but don't do it if these are your reasons.",
@@ -55,7 +55,7 @@ For data-to-text generation:
 >>> from transformers import MvpTokenizerFast, MvpForConditionalGeneration
 
 >>> tokenizer = MvpTokenizerFast.from_pretrained("RUCAIBox/mvp")
->>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")
+>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-multi-task")
 
 >>> inputs = tokenizer(
 ...     "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man",