ssmits committed
Commit
95f7b40
1 Parent(s): 63034bd

Update README.md

Files changed (1): README.md +8 -0
README.md CHANGED
@@ -10,6 +10,14 @@ license: apache-2.0
  language:
  - fr
  ---
+ ## Why prune?
+
+ Falcon-11B is still undertrained, as can be seen in this graph:
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/660c0a02cf274b3ab77dd6b7/QeaL9bOrPskustzFpjMUP.png)
+ This is why the choice was made to prune 50% of the layers.
+ Note that \~1B tokens of continued pre-training (\~1M rows of 1k tokens) is still required to restore the perplexity of this model in the desired language.
+ I'm planning on doing that for certain languages, depending on how much compute will be available.
+
  # sliced

  This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
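
For context on the slicing step: a 50% layer prune like this is typically expressed as a mergekit passthrough config. The sketch below is an assumption rather than the config actually used here; the model id and `layer_range` are illustrative, taking Falcon-11B to have 60 decoder layers and keeping the bottom half:

```yaml
# Hypothetical mergekit passthrough config for a 50% layer prune.
# Assumes tiiuae/falcon-11B with 60 decoder layers; layer_range [0, 30]
# keeps the first 30 layers and drops the rest.
slices:
  - sources:
      - model: tiiuae/falcon-11B
        layer_range: [0, 30]
merge_method: passthrough
dtype: bfloat16
```

With mergekit installed, running `mergekit-yaml config.yml ./sliced` would then materialize the pruned checkpoint (the config path and output directory are placeholders).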