OpenMOSE/HRWKV7-Reka-Flash3.1-Preview
Pipeline: Text Generation
Library: Transformers
Tags: rwkv, linear-attention, reka, distillation, knowledge-distillation, hybrid-architecture, language-model
Paper: arXiv:2505.03005
License: apache-2.0
Files and versions (revision: refs/pr/1)
2 contributors; history: 5 commits
Latest commit a10fae9 (verified) by nielsr (HF Staff), about 1 month ago: "Improve model card: Add metadata, paper abstract, links & transformers usage"
.gitattributes                            1.57 kB   "Upload hxa079.png"                                                              about 2 months ago
README.md                                 7.62 kB   "Improve model card: Add metadata, paper abstract, links & transformers usage"   about 1 month ago
hxa079-reka-flash3.1-stage2-hybrid.pth    42.7 GB   "Upload hxa079-reka-flash3.1-stage2-hybrid.pth with huggingface_hub"             about 2 months ago
hxa079.png                                1.15 MB   "Upload hxa079.png"                                                              about 2 months ago