kenhktsui committed
Commit 458dbdf · verified · 1 Parent(s): 38a0dc2

Update README.md

Files changed (1)
  1. README.md +12 -2
README.md CHANGED
@@ -39,6 +39,10 @@ My research goals are that:
 
 P.S.: this is an initial release of the model, and it is not expected to perform very well. But as we gather more data, we will see whether it can compete with an MCTS-based engine like [Leela Zero](https://github.com/leela-zero/leela-zero).
 
+## How to Play against GoFormer?
+I've written a UI; please visit [https://github.com/kenhktsui/goformer](https://github.com/kenhktsui/goformer).
+
+
 ## Data Preprocessing
 We take the leftmost variation of the game tree in SGF format and translate it into PGN.
 
@@ -85,5 +89,11 @@ This model achieves an eval_loss of 0.419 at step 7,600 (approximately 10.90 epochs
 [10] Radford, Alec et al. “Language Models are Unsupervised Multitask Learners.” (2019).
 
 ## Citation
-
-<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+If you find this work useful, please cite:
+```
+@misc{ktsui2024goformer,
+  title={GoFormer - Language Model That Plays Go},
+  author={Ken Tsui},
+  year={2024},
+}
+```
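
The Data Preprocessing step in the diff above (taking the leftmost variation of the SGF game tree and translating it to a PGN-like move text) can be sketched in Python. This is a minimal, hypothetical sketch, not the repository's actual preprocessing code: the function names and the simple numbered-move output format are illustrative assumptions, and the real GoFormer move encoding may differ.

```python
import re

def leftmost_variation(sgf_text: str) -> str:
    """Keep only the leftmost variation of an SGF game tree.

    Whenever a variation closes with ')', any sibling '(...)' subtrees
    at the same level are skipped, so only the first child is followed.
    """
    out = []
    i, n = 0, len(sgf_text)
    while i < n:
        c = sgf_text[i]
        if c == ')':
            i += 1
            # Skip sibling variations until the enclosing ')' (or end of input).
            while i < n and sgf_text[i] != ')':
                if sgf_text[i] == '(':
                    depth = 1
                    i += 1
                    while i < n and depth:
                        if sgf_text[i] == '(':
                            depth += 1
                        elif sgf_text[i] == ')':
                            depth -= 1
                        i += 1
                else:
                    i += 1
        else:
            out.append(c)
            i += 1
    return ''.join(out)

def sgf_to_moves(sgf_text: str):
    """Extract (color, coordinate) move pairs from the leftmost variation."""
    main = leftmost_variation(sgf_text)
    # SGF moves look like ';B[pd]' / ';W[dd]'; an empty '[]' is a pass.
    return re.findall(r';([BW])\[([a-t]{2})?\]', main)

def moves_to_pgn(moves) -> str:
    """Render moves in a numbered, PGN-like text format (illustrative only)."""
    out = []
    for k in range(0, len(moves), 2):
        pair = ' '.join(coord for _color, coord in moves[k:k + 2])
        out.append(f"{k // 2 + 1}. {pair}")
    return ' '.join(out)
```

For example, `moves_to_pgn(sgf_to_moves("(;GM[1];B[pd](;W[dd];B[qq])(;W[dp]))"))` follows only the first branch after `B[pd]`, dropping the sibling `W[dp]` line.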