OpenMOSE / HRWKV7-Reka-Flash3.1-Preview
Tags: Text Generation · Transformers · rwkv · linear-attention · reka · distillation · knowledge-distillation · hybrid-architecture · language-model
arXiv: 2505.03005
License: apache-2.0
main / HRWKV7-Reka-Flash3.1-Preview / hxa079.png
Commit History
Upload hxa079.png · cceb8d3 (verified) · OpenMOSE committed on Jul 18