OpenMOSE / HRWKV7-Reka-Flash3.1-Preview

Text Generation · Transformers · rwkv · linear-attention · reka · distillation · knowledge-distillation · hybrid-architecture · language-model
arXiv: 2505.03005
License: apache-2.0
Files and versions
2 contributors · History: 2 commits

Latest commit: cceb8d3 (verified) by OpenMOSE, "Upload hxa079.png", about 2 months ago
.gitattributes    1.57 kB     Upload hxa079.png    about 2 months ago
README.md         31 Bytes    initial commit       about 2 months ago
hxa079.png        1.15 MB     Upload hxa079.png    about 2 months ago