xiaol committed on
Commit 3c37a5c · verified · 1 Parent(s): 57af23c

Create README.md

Files changed (1)
  1. README.md +122 -0
README.md ADDED
@@ -0,0 +1,122 @@
---
license: apache-2.0
language:
- en
- zh
base_model:
- deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
- BlinkDL/rwkv-7-world
pipeline_tag: text-generation
library_name: transformers
---

<div align="center">
<img src="https://huggingface.co/RWKV-Red-Team/ARWKV-7B-Preview-0.1/resolve/main/figures/banner-1.png" style="border-radius: 10px; width: 100%; height: 100%; object-fit: cover; box-shadow: 10px 10px 20px rgba(0, 0, 0, 0.5); border: 2px solid white;" alt="ARWKV" />
</div>

<h1 align="center">ARWKV🪿</h1>

<p align="center">
<a href="https://arxiv.org/abs/2501.15570"><b>Paper Link</b>👁️</a> | <a href="https://github.com/yynil/RWKVInside"><b>Github</b>✅</a>
</p>

# ARWKV-R1-7B (Preview 0.1)

<img src="https://huggingface.co/RWKV-Red-Team/ARWKV-7B-Preview-0.1/resolve/main/figures/architecture.png" alt="ARWKV Hybrid Architecture" width="30%">

*Preview version with **RWKV-7** time mixing and Transformer MLP*

## 📌 Overview

**ALL YOU NEED IS RWKV**

This is an **early preview** of our 7B-parameter hybrid RNN-Transformer model, trained at a 2k context length **(only stage-2 applied, without SFT or DPO)** through 3-stage knowledge distillation from DeepSeek-R1-Distill-Qwen-7B. Although this is a foundational version, it already demonstrates:

- ✅ RWKV-7's efficient recurrence mechanism (see the sketch below)
- ✅ No self-attention, fully O(n)
- ✅ Constant VRAM usage
- ✅ Single-GPU trainability

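For intuition on why recurrence keeps VRAM constant, here is a minimal, deliberately simplified sketch of a linear-recurrence time-mixing step. It is **not** the exact RWKV-7 update from the paper (Eq. 3); the tensor names, shapes, and decay rule are illustrative assumptions. The point is that the per-layer state `S` has a fixed size, so per-token cost and memory do not grow with sequence length the way an attention KV cache does.

```python
import torch

def recurrent_time_mix_step(S, q, k, v, w):
    """One simplified linear-recurrence step (illustrative; NOT the exact RWKV-7 rule).

    S    : fixed-size recurrent state, shape (d_k, d_v)
    q, k : query/key vectors, shape (d_k,)
    v    : value vector, shape (d_v,)
    w    : per-channel decay in (0, 1), shape (d_k,)
    """
    # Decay the old state, then write the new key/value outer product into it.
    S = S * w.unsqueeze(-1) + torch.outer(k, v)
    # Read out with the query. Memory stays O(d_k * d_v) no matter how long the
    # sequence is, unlike a KV cache, which grows with every generated token.
    y = q @ S
    return S, y

d_k, d_v = 64, 64
S = torch.zeros(d_k, d_v)
for _ in range(1024):  # 1k tokens or 1M tokens: the state never grows
    q, k, w = torch.randn(d_k), torch.randn(d_k), torch.rand(d_k)
    v = torch.randn(d_v)
    S, y = recurrent_time_mix_step(S, q, k, v, w)
```
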
**Roadmap Notice**: We will soon open-source several enhanced versions with:
- 🚀 16k+ context capability
- 🧮 Math-specific improvements
- 📚 RL-enhanced reasoning model

## How to use

```bash
pip3 install --upgrade rwkv-fla transformers
```

```python
import threading

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

model = AutoModelForCausalLM.from_pretrained(
    "RWKV-Red-Team/ARWKV-R1-7B",
    device_map="auto",
    torch_dtype=torch.float16,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(
    "RWKV-Red-Team/ARWKV-R1-7B"
)

system_prompt = "You are a world class trivia AI - provide accurate, succinct responses. "
prompt = "The world's largest rainforest, home to approximately three million species of plants and animals, is named after which river?"
messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": prompt},
]

# Build the chat prompt and open a <think> block so the model starts with its reasoning trace.
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
text = text + "<think>"
print(text)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Stream tokens as they are generated; generation runs in a background thread.
streamer = TextIteratorStreamer(tokenizer, skip_prompt=False, skip_special_tokens=False)
generation_kwargs = dict(
    model_inputs,
    streamer=streamer,
    max_new_tokens=8192,
    do_sample=True,
    tokenizer=tokenizer,
    stop_strings=["<|end▁of▁sentence|>"],
)
thread = threading.Thread(target=model.generate, kwargs=generation_kwargs)
thread.start()

print("Streaming output:")
for new_text in streamer:
    print(new_text, end="", flush=True)

thread.join()
```

The output looks like:
```bash
<|begin▁of▁sentence|>You are a world class trivia AI - provide accurate, succinct responses. <|User|>The world's largest rainforest, home to approximately three million species of plants and animals, is named after which river?<|Assistant|><think>
Okay, so I'm trying to solve this question about the world's largest rainforest and which river it's named after. Hmm, first, I think rainforest names often have links related to the region it's in. The most famous rainforest in the world is the Amazon. I remember hearing a lot about it being called that because rainforests are connected to specific river systems.

Now, I'm trying to recall which river is named after the Amazon. I think it's the Amazon River. But I want to be sure. Let me see... the Amazon is a major rainforest located in South America. The Amazon River flows through it, which is why it's named after it. That makes sense because it's a very important river. I recall reading somewhere that all the rainforests are named after rivers related to their regions. So if the Amazon is named after its River, then the name would naturally be related to its source.

I wonder if it's the Amazon itself that's named after it, or another river named after it. But the official name for the Amazon is the Amazon Rainforest. The most significant rainforest in the world is the Amazon, and its name probably started with river-sounding names.
</think>

The largest rainforest located in South America is the Amazon. It is named after the river named after it, which is the Amazon River. Therefore, the Amazon River is the name given to the Amazon Rain Forest.
```
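
Because the prompt ends with `<think>`, the generation is a reasoning trace terminated by `</think>`, followed by the final answer. A small post-processing helper (an illustrative sketch, not part of the transformers API) can separate the two:

```python
def split_reasoning(generated: str) -> tuple[str, str]:
    """Split '<reasoning></think><answer>' text into (reasoning, answer)."""
    reasoning, _, answer = generated.partition("</think>")
    answer = answer.replace("<|end▁of▁sentence|>", "").strip()
    return reasoning.strip(), answer

# In practice, collect the `new_text` chunks from the streaming loop above into one string.
full_text = "Okay, the rainforest is named after the Amazon River.</think>The Amazon River.<|end▁of▁sentence|>"
reasoning, answer = split_reasoning(full_text)
print(answer)  # -> "The Amazon River."
```
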

## 🔑 Key Features
| Component | Specification | Note |
|-----------|---------------|------|
| Architecture | RWKV-7 TimeMix + SwiGLU | Hybrid design |
| Context Window | 2048 training CTX | *Preview limitation* |
| Training Tokens | 40M | Distillation-focused |
| Precision | FP16 inference recommended (16 GB VRAM required) | 15%↑ vs. BF16 |

## 🏗️ Architecture Highlights
### Core Modification Flow
```diff
Transformer Decoder Layer:
- Multi-head Latent Attention (MLA)
+ RWKV-7 Time Mixing (Eq. 3)
- RoPE Positional Encoding
+ State Recurrence
= Hybrid Layer Output
```
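
To make the modification flow concrete, here is a minimal structural sketch of such a hybrid decoder layer. It is illustrative only, not the repository's implementation (see the Github link above): `HybridDecoderLayer`, its module names, and the `nn.GRU` stand-in for the RWKV-7 time-mixing block are assumptions for demonstration. It shows the swap the diff describes: the attention sub-layer is replaced by a recurrent time-mixing module with a fixed-size state, RoPE is dropped because position is carried by that state, and the SwiGLU Transformer MLP is kept.

```python
import torch
import torch.nn as nn

class SwiGLU(nn.Module):
    """SwiGLU feed-forward block, the Transformer MLP kept in the hybrid layer."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.gate = nn.Linear(d_model, d_ff, bias=False)
        self.up = nn.Linear(d_model, d_ff, bias=False)
        self.down = nn.Linear(d_ff, d_model, bias=False)

    def forward(self, x):
        return self.down(nn.functional.silu(self.gate(x)) * self.up(x))

class HybridDecoderLayer(nn.Module):
    """Illustrative hybrid decoder layer: self-attention swapped for recurrent time mixing.

    `time_mix` is any module mapping (hidden_states, state) -> (hidden_states, state);
    in ARWKV that role is played by the RWKV-7 time-mixing block.
    """
    def __init__(self, d_model: int, d_ff: int, time_mix: nn.Module):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)   # RMSNorm in the actual model
        self.time_mix = time_mix             # replaces multi-head attention; no RoPE needed
        self.norm2 = nn.LayerNorm(d_model)
        self.mlp = SwiGLU(d_model, d_ff)     # channel mixing kept as a Transformer MLP

    def forward(self, x, state=None):
        h, state = self.time_mix(self.norm1(x), state)  # O(n), fixed-size state
        x = x + h                                        # residual around time mixing
        x = x + self.mlp(self.norm2(x))                  # residual around the MLP
        return x, state

# Quick shape check with a stand-in recurrent mixer (an nn.GRU, purely for demonstration).
layer = HybridDecoderLayer(d_model=512, d_ff=2048,
                           time_mix=nn.GRU(512, 512, batch_first=True))
x = torch.randn(1, 16, 512)                              # (batch, seq_len, d_model)
y, state = layer(x)
print(y.shape)                                           # torch.Size([1, 16, 512])
```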