Commit History

a5ed112  feat: use_artifact if run existing (boris)
f69b21b  Load from wandb artifact (#121) (boris, unverified)
f9d51f7  Style (isort). (Pedro Cuenca)
bbbf7c8  feat(train): update sweep config (boris)
ae983d7  Use DalleBartTokenizer. State restoration reverted to previous method: (Pedro Cuenca)
7e48337  Tokenizer, config, model can be loaded from wandb. (Pedro Cuenca)
4c87adf  fix(train): variable not defined (boris)
a2bf605  feat(train): cleanup args (boris)
c91ceb7  Merge pull request #122 from borisdayma/feat-acccum (boris, unverified)

88c8e06  feat(data): support accumulation in non-streaming (boris)
274ba73  refactor(train): cleanup (boris)
2d07559  feat: custom gradient accumulation (boris)
df01fa8  fix: style (boris)
4fa53a5  feat(train): use MultiSteps for gradient accumulation (boris)
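
The two gradient-accumulation commits above (custom accumulation, then optax's `MultiSteps` wrapper) boil down to one idea: collect k micro-batch gradients and apply a single averaged update. A minimal pure-Python sketch of that idea, with illustrative names not taken from the repo:

```python
# Sketch of the gradient-accumulation pattern wrapped by optax.MultiSteps:
# accumulate gradients over `every_k` micro-batches, then emit one
# averaged SGD-style update. Pure-Python illustration only.

def accumulate_updates(grads, every_k=4, lr=0.1):
    """Fold a stream of per-micro-batch gradients into the updates
    that would actually be applied every `every_k` steps."""
    updates = []
    acc, count = 0.0, 0
    for g in grads:
        acc += g
        count += 1
        if count == every_k:
            updates.append(-lr * acc / every_k)  # one averaged step
            acc, count = 0.0, 0                  # reset the accumulator
    return updates
```

With `optax.MultiSteps(inner_optimizer, every_k_schedule=k)` the same effect is obtained inside the JAX update step, which appears to be what the `4fa53a5` commit switches to.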

2b2be9b  Change import order again. (Pedro Cuenca)
64d99b2  Fix import order to make isort happy. (Pedro Cuenca)
9f522b8  Accept changes suggested by linter. (Pedro Cuenca)
290e443  Update help string for `model_name_or_path`. (Pedro Cuenca)
bb3f53e  Update `resume_from_checkpoint` to use `from_pretrained`. (Pedro Cuenca)
08dd098  Never consider local dirs as remote wandb references. (Pedro Cuenca)
a77c0d4  Load tokenizer associated to the model checkpoint, if possible. (Pedro Cuenca)
55a631d  Store resolved path after loading model. (Pedro Cuenca)
5ec61cc  Use model configuration unless a specific one is supplied. (Pedro Cuenca)
1023afa  Override from_pretrained to support wandb artifacts. (Pedro Cuenca)
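
Several of these commits (`08dd098`, `1023afa`) concern deciding when a checkpoint path should be treated as a wandb artifact reference rather than a local directory. A hedged sketch of that resolution logic, with a hypothetical helper name and regex (the repo's actual implementation may differ):

```python
import os
import re

# wandb artifact references usually take the shape
# "entity/project/artifact-name:alias". Per commit 08dd098, an
# existing local directory must never be mistaken for one.
_ARTIFACT_RE = re.compile(r"^[\w.-]+/[\w.-]+/[\w.-]+:[\w.-]+$")

def is_wandb_artifact_reference(path: str) -> bool:
    """Return True only for paths that look like remote artifact refs."""
    if os.path.isdir(path):  # local dirs are never remote refs
        return False
    return _ARTIFACT_RE.match(path) is not None
```

A `from_pretrained` override could then download matching references (e.g. via `wandb.Api().artifact(ref).download()`) and delegate everything else to the stock loader.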

193c88c  Merge pull request #118 from borisdayma/feat-optim (boris, unverified)
25862e8  fix: style (boris)
89cf9ea  feat: add more config of distributed_shampoo (boris)
ddcbc6a  fix(data): no shuffling of validation data (boris)
e2781bc  feat(train): refactor learning rate params (boris)
8b72ed8  fix(train): handle seed_dataset (boris)
adbdff9  feat: refactor TrainingArguments (boris)
23389f6  fix: push_to_hub deprecated (boris)
f5dba1e  feat: support pypi (boris)
e3b1b56  doc: update contributions (boris)
ef985be  Merge pull request #117 from borisdayma/fix-inference (boris, unverified)

71c4de3  fix(inference): use float32 + flatten logits (boris)
3a3d375  Merge pull request #115 from borisdayma/feat-shampoo (boris, unverified)
af807f7  feat: update inference pipeline (boris)
531cd78  style: isort (boris)
e669c1b  style: apply to distributed_shampoo (boris)
4d518c7  feat: add best_effort_memory_usage_reduction (boris)
db882b8  doc: add reference to Distributed Shampoo (boris)
b90198c  feat: update distributed_shampoo (boris)
7143593  fix: weight decay Adam + speed logging (boris)
e501f71  feat: add micro config (boris)
edae62d  fix: shampoo -> distributed shampoo (boris)
604a65d  feat: update params (boris)
0b87452  feat: add shampoo optimizer (boris)
e1555d4  feat: update sweep (boris)
dc5c024  feat: create config files (boris)