Tags: Text Generation · Transformers · Safetensors · English · rwkv7 · custom_code
ZhangRC committed · verified · Commit 0477182 · 1 Parent(s): 77244ad

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -37,7 +37,7 @@ This is RWKV-7 model under flash-linear attention format.
  <!-- Provide the basic links for the model. -->
 
  - **Repository:** https://github.com/fla-org/flash-linear-attention ; https://github.com/BlinkDL/RWKV-LM
- - **Paper:** With in Progress
+ - **Paper:** https://arxiv.org/abs/2503.14456
 
  ## Uses