Fine-tuning question about openai-community/gpt2
#102 opened by the-zooid
I have an Nvidia GeForce RTX 4060 with 16 GB of VRAM. Is that enough to fine-tune and train a "spawn" of GPT-2 so it can understand and generate code in a new, custom scripting programming language?
If the answer is yes, where can I find sample boilerplate code for fine-tuning, with sensible hyperparameters and an example of a custom dataset class? Preferably in Python with PyTorch.
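
For reference, here is a minimal sketch of what such a fine-tuning script might look like with the Hugging Face Transformers `Trainer`. The file name `my_language_corpus.txt`, the `ScriptDataset` class, and the hyperparameter values are illustrative assumptions, not recommendations; they would need tuning for a real dataset.

```python
# Minimal GPT-2 fine-tuning sketch for a custom scripting language.
# Assumes a plain-text file "my_language_corpus.txt" with one code snippet per line (hypothetical).
import torch
from torch.utils.data import Dataset
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
    DataCollatorForLanguageModeling,
)

class ScriptDataset(Dataset):
    """Tokenizes raw code snippets for causal language modeling."""
    def __init__(self, path, tokenizer, max_length=512):
        with open(path, encoding="utf-8") as f:
            lines = [line.strip() for line in f if line.strip()]
        self.encodings = tokenizer(
            lines, truncation=True, max_length=max_length
        )["input_ids"]

    def __len__(self):
        return len(self.encodings)

    def __getitem__(self, idx):
        return {"input_ids": torch.tensor(self.encodings[idx])}

tokenizer = GPT2TokenizerFast.from_pretrained("openai-community/gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

model = GPT2LMHeadModel.from_pretrained("openai-community/gpt2")

train_dataset = ScriptDataset("my_language_corpus.txt", tokenizer)

# mlm=False -> causal LM objective; the collator pads batches and builds the labels.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-custom-lang",
    num_train_epochs=3,                 # illustrative values only
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=5e-5,
    warmup_steps=100,
    fp16=torch.cuda.is_available(),     # mixed precision to save VRAM
    logging_steps=50,
    save_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=collator,
)
trainer.train()
trainer.save_model("gpt2-custom-lang")
```

The 124M-parameter `gpt2` checkpoint is small enough that a setup like this should fit in 16 GB of VRAM, especially with fp16 and gradient accumulation; larger variants (gpt2-medium/large) would need smaller batch sizes or additional memory-saving tricks.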