Improve dataset card for DCLM-10B-Qwen2-binidx with paper, code, and usage

#1 opened by nielsr

This PR enhances the dataset card for DCLM-10B-Qwen2-binidx by:

  • Updating metadata with text-generation as the task category and adding relevant tags (linear-attention, distillation, language-modeling, llm, rwkv, qwen).
  • Linking the dataset to its associated paper, RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale.
  • Including a link to the official RADLADS GitHub repository, which contains the training code and further details.
  • Providing clear sample usage instructions for downloading the dataset files.
  • Adding the BibTeX citation for the associated paper.
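The usage instructions mentioned above would cover fetching the `.bin`/`.idx` files from the Hub. As a minimal sketch of what downloading a file by direct URL could look like (the repo id `recursal/DCLM-10B-Qwen2-binidx` and filename are assumptions here; the card itself may recommend `huggingface_hub.snapshot_download` instead):

```python
# Sketch: build the direct "resolve" download URL for a file in a
# Hugging Face *dataset* repository. Repo id and filename below are
# hypothetical examples, not taken from the dataset card.
REPO_ID = "recursal/DCLM-10B-Qwen2-binidx"


def hf_resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Return the direct download URL for a file in a HF dataset repo."""
    return f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"


if __name__ == "__main__":
    # One could then pass this URL to urllib, requests, wget, etc.
    print(hf_resolve_url(REPO_ID, "data.bin"))
```

In practice the `huggingface_hub` library handles caching and authentication, so `snapshot_download(repo_id=..., repo_type="dataset")` is the more robust route for pulling the full dataset.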

Merged.

KaraKaraWitch changed pull request status to merged
