Text generation in batches
#12 opened by nerner94
Hello,
I wish to generate text in batches given a prompt. The prompt length changes because I provide different input each time. Currently I use padding to accommodate the difference in token length, but the padding distorts text generation heavily. Is there any way I can avoid padding and still feed prompts of different lengths in one batch? Thanks, greetings :)
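For context, batching variable-length prompts typically means padding the token ID sequences to a common length and supplying an attention mask so pad positions are ignored; for decoder-only generation, left-padding is the usual choice so the real tokens sit adjacent to the generated continuation. A minimal sketch in plain Python (the pad ID and the token IDs below are made-up illustrations, not from any specific tokenizer):

```python
def left_pad(batch, pad_id):
    """Left-pad token ID sequences to the batch's max length,
    returning padded IDs plus a matching attention mask
    (0 = padding, 1 = real token)."""
    max_len = max(len(seq) for seq in batch)
    padded, mask = [], []
    for seq in batch:
        n_pad = max_len - len(seq)
        padded.append([pad_id] * n_pad + list(seq))
        mask.append([0] * n_pad + [1] * len(seq))
    return padded, mask

# Hypothetical token IDs for two prompts of different lengths.
prompts = [[101, 7592], [101, 7592, 2088, 999]]
ids, mask = left_pad(prompts, pad_id=0)
# ids  -> [[0, 0, 101, 7592], [101, 7592, 2088, 999]]
# mask -> [[0, 0, 1, 1], [1, 1, 1, 1]]
```

With an attention mask like this, the padded positions contribute nothing to attention, which is what usually prevents padding from distorting the generated text.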