Text Generation · Transformers · Safetensors · English · stripedhyena · custom_code
Zymrael committed · Commit cabde32 · 1 Parent(s): e4713f6

chore: update readme

Files changed (1):
  1. README.md +8 -4
README.md CHANGED
@@ -6,18 +6,22 @@ language:
 
 ## StripedHyena-Hessian-7B (SH-7B)
 
+<p align="center">
+<img src="https://cdn-uploads.huggingface.co/production/uploads/62a1306bbe7fa896d2c8de44/Bfjh77emDsWOY-VmfvU9C.png" width="30%" />
+</p>
+
 ### About
 
-One of the focus areas at Together Research is new architectures for long context, improved training, and inference performance over the Transformer architecture. Spinning out of a research program from our team and academic collaborators, with roots in signal processing-inspired sequence models, we are excited to introduce the StripedHyena models. StripedHyena is the first alternative model competitive with the best open-source Transformers of similar sizes in short and long-context evaluations.
+One of the focus areas at Together Research is new architectures for long context, improved training, and inference performance over the Transformer architecture. Spinning out of a research program from our team and academic collaborators, with roots in **signal processing-inspired sequence models**, we are excited to introduce the **StripedHyena** models. StripedHyena is the **first alternative model competitive with the best open-source Transformers** of similar sizes in short and long-context evaluations.
 
-- Read more here in [our blog](https://together-ai.webflow.io/blog/stripedhyena-7b)
+- Read more in [our blog](https://www.together.ai/blog/stripedhyena-7b)
 - Play with the model on our playground!
-- Dive into the details of our [Standalone implementation](https://github.com/togethercomputer/stripedhyena)
+- Dive into the details of our [standalone implementation](https://github.com/togethercomputer/stripedhyena) and our related research: [1](https://arxiv.org/abs/2302.10866), [2](https://arxiv.org/abs/2310.18780), [3](https://arxiv.org/abs/2311.05908).
 
 ### Model Architecture
 
 StripedHyena is a hybrid architecture, different from traditional decoder-only Transformers, composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks.
 - Constant-memory decoding in Hyena blocks via representation of convolutions as state-space models (modal or canonical form), or as truncated filters.
 - Lower latency, faster decoding, and higher throughput than Transformers.
-- Improvement to training and inference-optimal scaling laws, compared to Transformers.
+- Improved training- and inference-optimal scaling laws compared to optimized Transformer architectures such as Llama.
 - Trained on sequences of up to 32k tokens, allowing it to process longer prompts.
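
Since the card is tagged `custom_code`, loading SH-7B through `transformers` requires `trust_remote_code=True`. Below is a minimal usage sketch, assuming the repo id `togethercomputer/StripedHyena-Hessian-7B` and illustrative generation settings (neither is confirmed by the diff above):

```python
# Hedged usage sketch: the repo id and generation settings are assumptions
# for illustration, not taken from the model card itself.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "togethercomputer/StripedHyena-Hessian-7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # reduced precision so a 7B model fits on one GPU
    trust_remote_code=True,       # stripedhyena is a custom architecture
    device_map="auto",            # requires `accelerate` to be installed
)

prompt = "StripedHyena is a hybrid architecture that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```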
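
The constant-memory decoding bullet follows from representing each long convolution in modal form: a filter h[k] = Σᵢ cᵢ λᵢᵏ can be evaluated as a fixed-size state-space recurrence, so autoregressive decoding never needs to cache the growing input history. Here is a minimal NumPy sketch of that idea (not the authors' implementation; the state size, pole values, and variable names are illustrative):

```python
# Sketch: a causal convolution whose filter is in modal (diagonal state-space)
# form can be decoded with O(d_state) memory via a recurrence, instead of
# materializing the filter and caching the full input history.
import numpy as np

rng = np.random.default_rng(0)
d_state = 8                                            # number of modes (illustrative)
lam = 0.9 * np.exp(2j * np.pi * rng.random(d_state))   # stable complex poles, |lam| < 1
b = rng.standard_normal(d_state)                       # input projection
c = rng.standard_normal(d_state)                       # output projection

def decode(u):
    """y[t] = sum_{s<=t} h[t-s] * u[s], with h[k] = Re(sum_i c_i * b_i * lam_i**k)."""
    x = np.zeros(d_state, dtype=complex)  # fixed-size state, independent of t
    ys = []
    for u_t in u:                         # one token at a time, constant memory
        x = lam * x + b * u_t             # state update
        ys.append((c @ x).real)           # readout
    return np.array(ys)

# Cross-check against the explicit (memory-hungry) convolution.
T = 32
u = rng.standard_normal(T)
h = np.array([(c @ (b * lam**t)).real for t in range(T)])
y_conv = np.array([sum(h[t - s] * u[s] for s in range(t + 1)) for t in range(T)])
assert np.allclose(decode(u), y_conv)
```

The recurrence and the explicit convolution produce the same output, which is why Hyena blocks can decode with memory that does not grow with prompt length.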