Train with different language model in BLIP2
#7
by
Upyaya
In the original BLIP-2 paper, the authors used OPT-2.7B and Flan-T5-XXL as the large language model.
I would like to train with a different language model, such as BERT or GPT-2.
Is it possible to do this with the Blip2ForConditionalGeneration class in the transformers library?
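As a sketch of one possible approach (assuming a recent transformers release): `Blip2Config` can be assembled from separate vision, Q-Former, and text configs, so a decoder-only causal LM such as GPT-2 may slot in the same way OPT does. BERT, by contrast, is encoder-only with no generative head, so it would not fit `Blip2ForConditionalGeneration` without extra work. The tiny hidden sizes below are illustrative only, to keep the randomly initialized model small; they are not the configuration from the paper.

```python
# Sketch: pairing a BLIP-2 architecture with a GPT-2 language model.
# Tiny, randomly initialized configs for illustration -- not the paper's setup.
from transformers import (
    Blip2Config,
    Blip2ForConditionalGeneration,
    Blip2QFormerConfig,
    Blip2VisionConfig,
    GPT2Config,
)

# Small vision encoder and Q-Former so the example builds quickly.
vision_config = Blip2VisionConfig(
    hidden_size=32, intermediate_size=64,
    num_hidden_layers=2, num_attention_heads=4,
    image_size=32, patch_size=8,
)
qformer_config = Blip2QFormerConfig(
    hidden_size=32, intermediate_size=64,
    num_hidden_layers=2, num_attention_heads=4,
    encoder_hidden_size=32,  # must match the vision encoder's hidden size
)

# Swap in GPT-2 as the text backbone instead of OPT or Flan-T5.
text_config = GPT2Config(n_embd=32, n_layer=2, n_head=4)

config = Blip2Config.from_vision_qformer_text_configs(
    vision_config, qformer_config, text_config
)
model = Blip2ForConditionalGeneration(config)

# GPT-2 is a causal LM, so BLIP-2 routes it through the
# decoder-only path (the same one OPT uses).
print(config.use_decoder_only_language_model)
print(model.language_model.config.model_type)
```

Note that this only wires up the architecture with random weights; to reuse pretrained GPT-2 weights you would still need to load them into `model.language_model` and then train the Q-Former to bridge the two modalities, as the paper does.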