dahara1 committed
Commit 336d13a · verified · 1 Parent(s): cad50b1

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
```diff
@@ -47,12 +47,12 @@ Benchmarks show significantly better English-Japanese and Japanese-English trans
 ![image/png](c3tr-version3.png)
 翻訳タスクに関しては、より大きなモデルに負けない性能を発揮します
 元の画像クレジット Sebastian Ruder(@seb_ruder)
-(※FloRES実行時はwriting_style: journalistic、WMT23実行時はwriting_style: casualを指定。wmt23.ja-en時は一行だけ改行不揃いを手修正)
 
 For translation tasks, it performs as well as larger models.
 Original image credit: Sebastian Ruder (@seb_ruder)
-(*When running FloRES, specify writing_style: journalistic, and when running WMT23, specify writing_style: casual. When running wmt23.ja-en, one line was manually corrected for line breaks.)
 
-
+翻訳ベンチマークの実行方法やその他のベンチマーク結果については[JTransBench](https://github.com/webbigdata-jp/JTransBench)を参考にしてください。
+For instructions on how to run the translation benchmark and other benchmark results, please refer to [JTransBench](https://github.com/webbigdata-jp/JTransBench).
+
 GoogleのウェブサービスColabを使うと無料でC3TR-Adapterを試す事が出来ます。リンク先でOpen In Colabボタンを押して起動してください。
 You can try C3TR-Adapter for free using Google's web service Colab. Please press the Open In Colab button on the link to activate it.
@@ -93,7 +93,7 @@ import json
 from transformers import AutoModelForCausalLM, AutoTokenizer
 from peft import PeftModel
 
-model_id = "unsloth/gemma-7b-bnb-4bit"
+model_id = "unsloth/gemma-2-9b-it-bnb-4bit"
 peft_model_id = "webbigdata/C3TR-Adapter"
 
 model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")
```
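For context, the changed snippet only loads the base model; the `peft` import shown in the hunk implies the adapter is attached afterwards. A minimal sketch of the full loading sequence might look like the following (the tokenizer line and the `PeftModel.from_pretrained` call are assumptions based on the imports above, not part of this commit; running it requires downloading the model weights and a suitable GPU):

```python
# Sketch (assumed usage, not from the diff): attach the C3TR-Adapter
# LoRA weights to the updated gemma-2 base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

model_id = "unsloth/gemma-2-9b-it-bnb-4bit"   # base model, as changed in this commit
peft_model_id = "webbigdata/C3TR-Adapter"     # translation adapter

# Tokenizer for the base model (assumed; not shown in the hunk).
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Load the 4-bit base model, then wrap it with the PEFT adapter.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(model, peft_model_id)
```

Because the adapter was trained against a specific base checkpoint, the `model_id` change in this commit matters: loading the adapter on the old `gemma-7b` base would mismatch the weights it was fine-tuned for.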