bird-of-paradise committed
Commit 99e0105 · verified · 1 Parent(s): 743f7a1

Added cross-references to more advanced architecture implementations

Files changed (1)
  1. README.md +14 -0
README.md CHANGED
@@ -6,6 +6,8 @@ license: mit

This repository provides a detailed guide and implementation of the Transformer architecture from the ["Attention Is All You Need"](https://arxiv.org/abs/1706.03762) paper. The implementation focuses on understanding each component through clear code, comprehensive testing, and visual aids.

+ For implementations of more recent architectural innovations from DeepSeek, see the **Related Implementations** section.
+
## Table of Contents
1. [Summary and Key Insights](#summary-and-key-insights)
2. [Implementation Details](#implementation-details)
@@ -213,3 +215,15 @@ The implementation includes visualizations of:
These visualizations help understand the inner workings of the transformer and verify correct implementation.

For detailed code and interactive examples, please refer to the complete implementation notebook.
+
+ ## Related Implementations
+
+ This repository is part of a series implementing the key architectural innovations from the DeepSeek paper:
+
+ 1. **[Transformer Implementation Tutorial](https://huggingface.co/datasets/bird-of-paradise/transformer-from-scratch-tutorial)** (this tutorial): A detailed tutorial on implementing the Transformer architecture, with explanations of key components.
+
+ 2. **[DeepSeek Multi-head Latent Attention](https://huggingface.co/bird-of-paradise/deepseek-mla)**: Implementation of DeepSeek's MLA mechanism for efficient KV cache usage during inference.
+
+ 3. **[DeepSeek MoE](https://huggingface.co/bird-of-paradise/deepseek-moe)**: Implementation of DeepSeek's Mixture-of-Experts architecture, which enables efficient scaling of model parameters.
+
+ Together, these implementations cover the core innovations that power DeepSeek's state-of-the-art performance. By combining the MoE architecture with Multi-head Latent Attention, you can build a complete DeepSeek-style model with improved training efficiency and inference performance.
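
To make the MLA entry above concrete, here is a minimal, illustrative PyTorch sketch of the core idea: instead of caching full per-head keys and values, each token is compressed to one small latent vector, which is cached and up-projected back to K/V at attention time. All names and dimensions (`LatentKVAttention`, `d_latent=64`, and so on) are assumptions of this sketch, not code from the linked repository, and it omits MLA's decoupled rotary position embeddings and attention masking for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentKVAttention(nn.Module):
    """Sketch of the MLA idea: cache one small latent per token instead of
    full per-head K/V. Dimensions are illustrative, not DeepSeek's."""

    def __init__(self, d_model=256, n_heads=8, d_latent=64):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.kv_down = nn.Linear(d_model, d_latent)  # compress; this is what gets cached
        self.k_up = nn.Linear(d_latent, d_model)     # reconstruct keys from the latent
        self.v_up = nn.Linear(d_latent, d_model)     # reconstruct values from the latent
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x, latent_cache=None):
        B, T, _ = x.shape
        latent = self.kv_down(x)                     # (B, T, d_latent)
        if latent_cache is not None:                 # extend the running cache
            latent = torch.cat([latent_cache, latent], dim=1)
        q = self.q_proj(x).view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = self.k_up(latent).view(B, -1, self.n_heads, self.d_head).transpose(1, 2)
        v = self.v_up(latent).view(B, -1, self.n_heads, self.d_head).transpose(1, 2)
        out = F.scaled_dot_product_attention(q, k, v)  # mask omitted for brevity
        out = out.transpose(1, 2).reshape(B, T, -1)
        return self.out_proj(out), latent            # latent is the new cache

attn = LatentKVAttention()
y, cache = attn(torch.randn(2, 5, 256))              # prefill: cache is (2, 5, 64)
y, cache = attn(torch.randn(2, 1, 256), cache)       # decode step reuses the cache
```

In this configuration the cache stores 64 floats per token instead of the 2 × 256 a standard multi-head KV cache would, an 8× reduction, which is the inference efficiency the MLA entry refers to.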
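Similarly, here is a minimal, illustrative sketch of the top-k routed Mixture-of-Experts idea behind the MoE entry: a learned router scores experts per token, each token is processed by its top-k expert FFNs, and a shared expert runs on every token. The class name and sizes (`TopKMoE`, `n_experts=8`, `k=2`) are assumptions of this sketch rather than code from the linked repository, and the per-expert Python loop favors readability over speed.

```python
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    """Sketch of a top-k routed Mixture-of-Experts layer with one shared
    expert. Sizes are illustrative; the dense loop is for clarity only."""

    def __init__(self, d_model=256, d_ff=512, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        ffn = lambda: nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )
        self.experts = nn.ModuleList(ffn() for _ in range(n_experts))
        self.shared = ffn()                          # applied to every token

    def forward(self, x):                            # x: (B, T, d_model)
        weights, idx = self.router(x).softmax(dim=-1).topk(self.k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalise over chosen k
        routed = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e           # tokens sent to expert e in this slot
                if mask.any():
                    routed[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return self.shared(x) + routed

moe = TopKMoE()
print(moe(torch.randn(2, 5, 256)).shape)             # torch.Size([2, 5, 256])
```

Only k of the n_experts FFNs run for each token, so total parameter count grows with the number of experts while per-token compute stays roughly constant, which is the "efficient scaling" the MoE entry describes.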