shanearora committed
Commit cd28930
Parent(s): 501c498
Update README.md
README.md
CHANGED
@@ -197,7 +197,7 @@ AdamW optimizer parameters are shown below.
 Optimizer settings comparison with peer models.
 
 | | **OLMo 7B July 2024** | [OLMo 1.0 7B](https://huggingface.co/allenai/OLMo-7B-hf) | [Llama 2 7B](https://huggingface.co/meta-llama/Llama-2-7b) | [OpenLM 7B](https://laion.ai/blog/open-lm/) | [Falcon 7B](https://huggingface.co/tiiuae/falcon-7b) |
-
+|-----------------------|------------------|------------------|---------------------|--------------------|--------------------|
 | warmup steps | 2500 | 5000 | 2000 | 2000 | 1000 |
 | peak LR | 3.0E-04 | 3.0E-04 | 3.0E-04 | 3.0E-04 | 6.0E-04 |
 | minimum LR | 3.0E-05 | 3.0E-05 | 3.0E-05 | 3.0E-05 | 1.2E-05 |
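For reference, a minimal sketch of how the "OLMo 7B July 2024" column of the table above (2500 warmup steps, peak LR 3.0E-04, minimum LR 3.0E-05) could be expressed as an AdamW learning-rate schedule in PyTorch. The linear-warmup-plus-cosine-decay shape, the total step count, and the toy model are assumptions for illustration only, not values taken from the OLMo training configuration.

```python
# Sketch only: not the OLMo training code. It wires the table's
# "OLMo 7B July 2024" settings (2500 warmup steps, peak LR 3.0e-4,
# minimum LR 3.0e-5) into AdamW with an assumed warmup + cosine decay.
import math
import torch

PEAK_LR = 3.0e-4
MIN_LR = 3.0e-5
WARMUP_STEPS = 2500
TOTAL_STEPS = 100_000  # assumed decay horizon, not an OLMo value

model = torch.nn.Linear(16, 16)  # stand-in for the real 7B model
optimizer = torch.optim.AdamW(model.parameters(), lr=PEAK_LR)

def lr_scale(step: int) -> float:
    """Multiplier applied to PEAK_LR at a given optimizer step."""
    if step < WARMUP_STEPS:
        # Linear warmup from 0 up to the peak learning rate.
        return step / WARMUP_STEPS
    # Cosine decay from the peak learning rate down to the minimum.
    progress = min(1.0, (step - WARMUP_STEPS) / max(1, TOTAL_STEPS - WARMUP_STEPS))
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    return (MIN_LR + (PEAK_LR - MIN_LR) * cosine) / PEAK_LR

# In training, scheduler.step() would be called once after each optimizer.step().
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_scale)

# Effective learning rate at a few points along the schedule.
for step in (0, 1250, 2500, 50_000, 100_000):
    print(step, PEAK_LR * lr_scale(step))
```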