Text Generation
Transformers
Safetensors
English
stripedhyena
custom_code
Zymrael committed
Commit af9abca
1 Parent(s): 6be2da5

chore: info

Files changed (1)
  1. README.md +1 -3
README.md CHANGED
@@ -9,9 +9,7 @@ language:
 
 ### Model Architecture
 
-The architecture of StripedHyena-Hessian-7B is different from traditional decoder-only Transformers.
-
-StripedHyena is a hybrid architecture composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks.
+StripedHyena is a hybrid architecture composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks, different from traditional decoder-only Transformers.
 - Constant memory decoding in Hyena blocks via representation of convolutions as state-space models (modal or canonical form), or as truncated filters.
 - Lower latency to preprocess long prompts.
 - Improvements to training and inference compute-optimal scaling laws, compared to Transformers.
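The constant-memory decoding mentioned above relies on a standard identity: a convolution whose filter is a sum of decaying exponentials can be evaluated token-by-token with a fixed-size recurrent state instead of the full prompt history. The sketch below illustrates that equivalence with a toy diagonal ("modal") state-space model; all dimensions, variable names, and the random filter are illustrative assumptions, not the StripedHyena implementation.

```python
import numpy as np

# Toy modal-form filter: h[t] = sum_i residues[i] * poles[i] ** t.
# State dimension and values are illustrative, not from StripedHyena.
d = 4
rng = np.random.default_rng(0)
poles = rng.uniform(0.5, 0.9, size=d)   # stable modes, |a_i| < 1
residues = rng.normal(size=d)           # per-mode readout weights

def filter_taps(L):
    """Materialize the explicit convolution filter of length L."""
    t = np.arange(L)
    return (residues[:, None] * poles[:, None] ** t).sum(axis=0)

def decode_conv(u):
    """Explicit causal convolution: y[t] = sum_{s<=t} h[t-s] * u[s].
    Needs the whole history of u at every step (memory grows with t)."""
    L = len(u)
    h = filter_taps(L)
    return np.array([(h[: t + 1][::-1] * u[: t + 1]).sum() for t in range(L)])

def decode_ssm(u):
    """Same output via the state-space recurrence x <- a*x + u_t,
    y_t = r @ x. Only an O(d) state is kept per decoded token."""
    x = np.zeros(d)
    ys = []
    for ut in u:
        x = poles * x + ut       # one recurrent update per token
        ys.append(residues @ x)  # readout
    return np.array(ys)

u = rng.normal(size=32)
assert np.allclose(decode_conv(u), decode_ssm(u))
```

The two decoders produce identical outputs, but the recurrent form keeps only a `d`-dimensional state per step, which is what makes per-token decoding cost independent of sequence length.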