OpenMOSE/HRWKV7-Reka-Flash3-Preview
Text Generation · Transformers · causal-lm · linear-attention · rwkv · reka · knowledge-distillation · multilingual · arXiv: 2505.03005 · License: apache-2.0
Remove library name and Transformers code snippet #2
by nielsr (HF Staff), opened Jul 29
base: refs/heads/main ← from: refs/pr/2
Files changed: +0 −26
nielsr (Jul 29): No description provided.
Commit b96cdb80: Remove library name and Transformers code snippet
Ready to merge: this branch can be merged automatically.