Commit History

update with Colab Badge
889b443

Dean committed on

Merge branch 'simplifying-savta-depth' of OperationSavta/SavtaDepth into pipeline-setup
737e99e

Dean committed on

Merge branch 'pipeline-setup' into simplifying-savta-depth
9296b0b

Dean committed on

Update '.dvc/config'
a005e08

Dean committed on

Update 'README.md'
07a1ed8

Dean committed on

Added a simplified version of the colab notebook, which doesn't have a clean env setup. This simplifies the setup process greatly at the cost of hurting reproducibility. Further research into ways to get a clean env on colab is required.
912d076

Dean committed on

train model on colab after fixing normalization bug
9c03436

dean committed on

remove secondary requirements (i.e. packages not explicitly installed by the user), fix normalization problem, and use tqdm for the image processing progress bar
068408a

Dean committed on

Add hosted storage remote and make it default
c6368bf

Dean committed on

Update 'README.md'
13584d1

Dean committed on

Merge branch 'pipeline-setup' of https://dagshub.com/OperationSavta/SavtaDepth into pipeline-setup
68475c1

Dean committed on

Fixing colab after feedback
a2c34a6

Dean committed on

Update 'README.md'
86f80fd

Dean committed on

Update 'README.md'
04605e5

Dean committed on

Update readme to include google colab setup + remove problematic packages from requirements.txt
715606b

Dean committed on

adding more setup options to readme
ee797b2

Dean committed on

removed problematic requirements
9cbee66

Dean committed on

Added dvc pull instruction as dvc checkout only works locally.
42fa488

Dean committed on

Added escaping slash to run_dev_env.sh so that it works on Windows as well
9293d40

Dean committed on

Successfully configured the dataloader and trained for one epoch. Results are not so good, but it's something. Still, the fastai v1 version looked better qualitatively
13f0309

Dean committed on

Finished training the model, saving before qualitative testing. Seems the model has actually learned something. Need to add metrics and params to the pipeline.
479e632

Dean committed on

Fixed a bug in the training stage where the model was not saved, committing before training on colab
79fd7d0

Dean committed on

Training stage seems to work, creating a non-run commit to use colab as an orchestration machine
0b86a0a

Dean committed on

Seems like we are now using the correct format for fastai2. Still, there is a strange bug where training gets killed by a signal
34a1202

Dean committed on

Merge branch 'split_commands_in_readme' of OperationSavta/SavtaDepth into pipeline-setup
fec2513

Dean committed on

Split commands for preparing the environment in the README file; added -y for conda env creation for fewer user interactions
c8e7168

galbraun committed on

Migrated to fastai2. Creating the DataLoader now works, but I'm stuck on not being able to change batch_size or num_workers, as the interface seems to have changed
eeb74de

Dean committed on

Hopefully finished with the requirements debacle, now using conda but freezing requirements with pip as usual
f24654e

Dean committed on

Fixed requirements.txt to comply with conda format
50b2019

Dean committed on

Update readme to include aws cli
92de89b

Dean committed on

Finished data import and processing setup, bug in training step
3c0c5aa

Dean committed on

Transition to MLWorkspace docker and setup makefile with environment commands
9cd8f4a

Dean committed on

Merge branch 'master' of https://dagshub.com/OperationSavta/SavtaDepth
cdbc70e

Dean committed on

Added notebook for baseline - starting conversion to python modules + pipeline
15a285a

Dean committed on

Fix to-do in readme
581a725

Dean committed on

Initial commit for setting up DS environment
47cfa83

Dean committed on

added BTS demo as a qualitative baseline for SavtaDepth
f7a3443

Dean committed on