Transformers : https://github.com/huggingface/transformers/blob/47fedc1665f14a60260b1b8357e682669300093a/src/transformers/models/llama/modeling_llama.py
Training Script: https://github.com/AnswerDotAI/fsdp_qlora/blob/3f7c583e985ff35e37a7b7497a7d4fedb77df695/experiments/cla/train.sh
This model shares KV activations every 3 layers. For example, layers 1 and 2 use layer 0's KV activations, layers 4 and 5 use layer 3's KV activations, etc.
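A minimal sketch of this kind of cross-layer KV sharing is shown below. This is not the actual code from the linked Transformers fork or training script; the class and function names (`SharedKVAttention`, `kv_source_layer`, etc.) are illustrative, and it uses a toy single-head attention just to show how layers 1 and 2 could reuse the KV tensors produced by layer 0, layers 4 and 5 those of layer 3, and so on.

```python
# Illustrative sketch of KV sharing with a factor of 3 (not the repo's actual implementation).
import torch
import torch.nn as nn


def kv_source_layer(layer_idx: int, share_every: int = 3) -> int:
    """Index of the layer whose KV activations `layer_idx` uses (0 for 1 and 2, 3 for 4 and 5, ...)."""
    return (layer_idx // share_every) * share_every


class SharedKVAttention(nn.Module):
    """Toy single-head attention that either produces or reuses KV activations."""

    def __init__(self, hidden_size: int, layer_idx: int, share_every: int = 3):
        super().__init__()
        self.produces_kv = layer_idx % share_every == 0
        self.q_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        if self.produces_kv:
            # Only every third layer needs K/V projections; the others reuse its output.
            self.k_proj = nn.Linear(hidden_size, hidden_size, bias=False)
            self.v_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.o_proj = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, hidden_states, shared_kv=None):
        q = self.q_proj(hidden_states)
        if self.produces_kv:
            k = self.k_proj(hidden_states)
            v = self.v_proj(hidden_states)
            shared_kv = (k, v)   # handed to the next share_every - 1 layers
        else:
            k, v = shared_kv     # reuse KV from the producing layer
        attn = torch.softmax(q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5, dim=-1)
        return self.o_proj(attn @ v), shared_kv


class SharedKVStack(nn.Module):
    """Stack of layers whose KV activations are shared every `share_every` layers."""

    def __init__(self, hidden_size: int = 64, num_layers: int = 6, share_every: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(
            SharedKVAttention(hidden_size, i, share_every) for i in range(num_layers)
        )

    def forward(self, hidden_states):
        shared_kv = None
        for layer in self.layers:
            hidden_states, shared_kv = layer(hidden_states, shared_kv)
        return hidden_states


if __name__ == "__main__":
    x = torch.randn(1, 8, 64)        # (batch, seq_len, hidden_size)
    print(SharedKVStack()(x).shape)  # torch.Size([1, 8, 64])
```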