Commit History

Training stage seems to work; creating a non-run commit to use Colab as an orchestration machine
0b86a0a

Dean committed on

Seems like we are now using the correct format for fastai2. Still, there is a strange bug where the training process is killed by a signal
34a1202

Dean committed on

Merge branch 'split_commands_in_readme' of OperationSavta/SavtaDepth into pipeline-setup
fec2513

Dean committed on

* Split the environment-preparation commands in the README file
* Added -y to conda env creation for fewer user interactions
c8e7168

galbraun committed on

Migrated to fastai2. Creating the DataLoader now works, but I'm stuck on not being able to change the batch_size or num_workers, as the interface seems to have changed
eeb74de

Dean committed on

Hopefully finished with the requirements debacle; now using conda but freezing requirements with pip as usual
f24654e

Dean committed on

Fixed requirements.txt to comply with conda format
50b2019

Dean committed on

Update README to include the AWS CLI
92de89b

Dean committed on

Finished data import and processing setup; bug in training step
3c0c5aa

Dean committed on

Transition to MLWorkspace Docker and set up Makefile with environment commands
9cd8f4a

Dean committed on

Merge branch 'master' of https://dagshub.com/OperationSavta/SavtaDepth
cdbc70e

Dean committed on

Added notebook for baseline - starting conversion to Python modules + pipeline
15a285a

Dean committed on

Fix to-do in README
581a725

Dean committed on

Initial commit for setting up DS environment
47cfa83

Dean committed on

Added BTS demo as a qualitative baseline for SavtaDepth
f7a3443

Dean committed on