GPU usage for model training

#6
by Li9889 - opened

Hi there,

What GPU did you use for training your MedFinder model, or what is the minimum GPU requirement to train it? And how long does the training process take?

kind regards,
Li

Hi Li,

Thank you for reaching out!

For training our MedFinder model, we resize the entire 3D volume to 224×224×96. We also enable AMP (automatic mixed precision), which helps optimize memory usage and computational efficiency. With these settings, a single batch requires around 20GB of VRAM.

Given this, a single NVIDIA RTX 3090 or 4090 GPU should be sufficient for training. If you need more headroom or faster training, you can adjust the resize dimensions and the window size accordingly.

Let us know if you need further details!

Best regards,
Yinda
