GPU requirements for finetuning

#42
by hiiamsid - opened

I have 8×A100 (80GB) GPUs but I am still not able to finetune in the bf16 setting. How much memory is required to finetune it in mixed precision? I want to try full finetuning without PEFT/LoRA.

At bf16, the model and the optimizer alone would require roughly 34B parameters × 20 bytes/parameter = 680GB of memory, while an 8×A100 (80GB) node provides only 640GB. You will need two such nodes to fully finetune.
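As a rough sanity check of that arithmetic, here is a minimal sketch. The ~20 bytes/parameter figure is the common heuristic for bf16 mixed-precision training with Adam (bf16 weights and gradients, fp32 master weights, fp32 optimizer moments, plus some headroom); actual usage also depends on activations, batch size, and sharding strategy, so treat the function name and numbers as illustrative assumptions, not exact requirements.

```python
def full_finetune_memory_gb(n_params_billion: float, bytes_per_param: float = 20.0) -> float:
    """Rough training-memory estimate for full finetuning in bf16 mixed precision.

    ~20 bytes/param breaks down roughly as:
      2  bf16 weights
      2  bf16 gradients
      4  fp32 master weights
      8  Adam optimizer states (m and v, in fp32)
      +  headroom for activations and framework overhead
    Billions of parameters * bytes/param conveniently equals gigabytes.
    """
    return n_params_billion * bytes_per_param

needed = full_finetune_memory_gb(34)   # ~680 GB for a 34B-parameter model
available = 8 * 80                     # one node of 8x A100-80GB = 640 GB
print(f"~{needed:.0f} GB needed vs {available} GB available on one node")
# -> ~680 GB needed vs 640 GB available, so a single node falls short
```

Under this estimate a single 8×A100 node falls just short of the requirement, which matches the two-node recommendation above.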

As your question seems to have been answered, and if there is nothing else we can help you with on this matter, we will be closing this discussion for now.

If you have any further questions, feel free to reopen this discussion or start a new one.

Thank you for your contribution to this community!

richardllin changed discussion status to closed
