PY007 committed
Commit 47ae335
1 Parent(s): bf1a0ed

Update README.md

Files changed (1)
  1. README.md +0 -15
README.md CHANGED
@@ -24,21 +24,6 @@ We adopted exactly the same architecture and tokenizer as Llama 2. This means Ti
 #### This Model
 This is an intermediate checkpoint with 240K steps and 503B tokens.
 
-#### Releases Schedule
-We will be rolling out intermediate checkpoints following the below schedule. We also include some baseline models for comparison.
-
-| Date | HF Checkpoint | Tokens | Step | HellaSwag Acc_norm |
-|------------|-------------------------------------------------|--------|------|---------------------|
-| Baseline | [StableLM-Alpha-3B](https://huggingface.co/stabilityai/stablelm-base-alpha-3b) | 800B | -- | 38.31 |
-| Baseline | [Pythia-1B-intermediate-step-50k-105b](https://huggingface.co/EleutherAI/pythia-1b/tree/step50000) | 105B | 50k | 42.04 |
-| Baseline | [Pythia-1B](https://huggingface.co/EleutherAI/pythia-1b) | 300B | 143k | 47.16 |
-| 2023-09-04 | [TinyLlama-1.1B-intermediate-step-50k-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b) | 105B | 50k | 43.50 |
-| 2023-09-16 | -- | 500B | -- | -- |
-| 2023-10-01 | -- | 1T | -- | -- |
-| 2023-10-16 | -- | 1.5T | -- | -- |
-| 2023-10-31 | -- | 2T | -- | -- |
-| 2023-11-15 | -- | 2.5T | -- | -- |
-| 2023-12-01 | -- | 3T | -- | -- |
 
 #### How to use
 You will need transformers>=4.31
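The "How to use" section kept by this diff only states that transformers>=4.31 is required. As a minimal, hedged sketch (not part of this model card), the checkpoint could be loaded with the standard `transformers` auto classes. The repo id below is the 50K-step checkpoint linked in the removed schedule table and is only a stand-in; the 240K-step checkpoint this card describes may live under a different repo id.

```python
# Minimal sketch: loading a TinyLlama intermediate checkpoint with transformers>=4.31.
# The repo id is taken from the removed schedule table; substitute the checkpoint you want.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PY007/TinyLlama-1.1B-step-50K-105b"  # assumption: stand-in repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to sanity-check the checkpoint.
inputs = tokenizer("The TinyLlama project aims to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```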
 