Commit History

feat(train): overhead from 70% to 1% 🥳
2b7f5f1

boris committed

feat(pjit): follow t5x style
7b5868f

boris committed

fix(train): grads spec
00710bc

boris committed

feat(train): improve pjit speed
f254058

boris committed

fix(train): consider correct batch size
b7c7458

boris committed

feat(train): custom start_preconditioning_step
8149924

boris committed

feat(train): handle distributed_shampoo in pjit
032f623

boris committed

feat: update distributed_shampoo + fix None spec
8a9e367

boris committed

feat(train): distributed_shampoo with pjit
cc34d07

boris committed

fix style
f044cb8

boris committed

feat(train): restore opt_state efficiently
1bfc1b5

boris committed

feat(model): clean way to load on cpu
12f323d

boris committed

feat(train): load model on CPU
3d43591

boris committed
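
Loading the weights on the host CPU first (as in the two commits above) typically relies on explicit device placement; a minimal sketch, assuming a placeholder parameter pytree rather than the repository's actual checkpoint-loading code:

    import jax
    import numpy as np

    cpu = jax.devices("cpu")[0]

    # Placeholder pytree standing in for the real model parameters.
    params = {"w": np.ones((1024, 1024), dtype=np.float32)}

    # Pin the weights to the host CPU device so accelerator memory is not
    # consumed before pjit shards the parameters across devices.
    params = jax.device_put(params, cpu)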

feat(train): different rng per node
2d212d8

boris committed
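
Giving each node its own random stream (as in the commit above) is commonly done by folding the process index into a shared seed; a minimal sketch with an arbitrary example seed:

    import jax

    seed = 42  # arbitrary example seed, shared by all hosts
    key = jax.random.PRNGKey(seed)

    # Fold in the host index so every node draws different dropout masks and
    # shuffles while remaining reproducible from the single shared seed.
    node_key = jax.random.fold_in(key, jax.process_index())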

feat(train): no batch dimension with pjit
df1fe19

boris committed

feat(train): progress on pjit
49597a2

boris committed

feat(train): start pjit support
0081723

boris committed
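
pjit support here means compiling the train step against a device mesh with explicit partition specs; a minimal sketch using the jax.experimental.pjit API of that era (the mesh shape, axis name, and specs are illustrative assumptions, not the repository's configuration; newer JAX releases moved these under jax.sharding):

    import numpy as np
    import jax
    import jax.numpy as jnp
    from jax.experimental import maps
    from jax.experimental import PartitionSpec as P
    from jax.experimental.pjit import pjit

    def train_step(params, batch):
        # Stand-in for the real forward/backward pass and optimizer update.
        return jnp.mean((batch["x"] @ params["w"] - batch["y"]) ** 2)

    # One-dimensional mesh over all devices, used here for data parallelism.
    mesh = maps.Mesh(np.asarray(jax.devices()), ("dp",))

    p_train_step = pjit(
        train_step,
        in_axis_resources=(None, P("dp")),  # replicate params, shard the batch on "dp"
        out_axis_resources=None,            # the scalar loss is replicated
    )

    with mesh:
        params = {"w": jnp.ones((8, 1))}
        batch = {"x": jnp.ones((16, 8)), "y": jnp.ones((16, 1))}
        loss = p_train_step(params, batch)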

Load from wandb artifact (#121)
f69b21b

boris committed

feat(train): update sweep config
bbbf7c8

boris committed

Use DalleBartTokenizer. State restoration reverted to previous method:
ae983d7

Pedro Cuenca committed

fix(train): variable not defined
4c87adf

boris committed

feat(train): cleanup args
a2bf605

boris committed

refactor(train): cleanup
274ba73

boris committed

feat: custom gradient accumulation
2d07559

boris committed

fix: style
df01fa8

boris committed

feat(train): use MultiSteps for gradient accumulation
4fa53a5

boris committed
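
optax.MultiSteps wraps an inner optimizer so that an update is only applied every k micro-batches, which is what the commit above uses for gradient accumulation (later replaced by the custom accumulation commit further up); a minimal sketch with placeholder hyper-parameters:

    import jax.numpy as jnp
    import optax

    params = {"w": jnp.zeros((4, 4))}   # placeholder parameter pytree
    grads = {"w": jnp.ones((4, 4))}     # placeholder gradients from one micro-batch

    inner = optax.adamw(learning_rate=1e-4)                   # placeholder inner optimizer
    optimizer = optax.MultiSteps(inner, every_k_schedule=4)   # accumulate 4 micro-batches

    opt_state = optimizer.init(params)
    # Gradients accumulate inside opt_state; only every 4th call emits a non-zero update.
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)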

Accept changes suggested by linter.
9f522b8

Pedro Cuenca committed

Update help string for `model_name_or_path`.
290e443

Pedro Cuenca committed

Update `resume_from_checkpoint` to use `from_pretrained`.
bb3f53e

Pedro Cuenca committed

Load tokenizer associated to the model checkpoint, if possible.
a77c0d4

Pedro Cuenca committed

Use model configuration unless a specific one is supplied.
5ec61cc

Pedro Cuenca committed

fix: style
25862e8

boris committed

feat: add more config of distributed_shampoo
89cf9ea

boris committed

feat(train): refactor learning rate params
e2781bc

boris committed

fix(train): handle seed_dataset
8b72ed8

boris committed

feat: refactor TrainingArguments
adbdff9

boris committed

fix: push_to_hub deprecated
23389f6

boris committed

style: isort
531cd78

boris committed

style: apply to distributed_shampoo
e669c1b

boris committed

feat: add best_effort_memory_usage_reduction
4d518c7

boris committed

feat: update distributed_shampoo
b90198c

boris committed

fix: weight decay Adam + speed logging
7143593

boris committed

feat: add micro config
e501f71

boris committed

fix: shampoo -> distributed shampoo
edae62d

boris committed

feat: update params
604a65d

boris committed

feat: add shampoo optimizer
0b87452

boris committed

feat: update sweep
e1555d4

boris committed

feat: create config files
dc5c024

boris committed

feat: allow abstract_init
772415c

boris committed
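
Abstract init here plausibly means computing parameter shapes without materializing the weights, which jax.eval_shape provides; a small sketch where init_params stands in for the model's real initializer:

    import jax

    def init_params(rng):
        # Stand-in for the model's real parameter initializer.
        return {"w": jax.random.normal(rng, (1024, 1024))}

    # eval_shape traces init_params without allocating memory, returning a pytree
    # of jax.ShapeDtypeStruct leaves (shapes and dtypes only).
    abstract_params = jax.eval_shape(init_params, jax.random.PRNGKey(0))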

fix: typo
5c84978

boris committed