Do you have a demo notebook on how to use this?

#1
by 0xrushi - opened

Hi, I used the transformers template and it's not working.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("nev/dalle-mini-pytorch")
model = AutoModelForSeq2SeqLM.from_pretrained("nev/dalle-mini-pytorch")

Hi,

you need to create a BART model with a special config. It has a few extra changes added on top as well; I should be able to publish the notebook around the 25th of June.
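A minimal sketch of what "a BART model with a special config" can look like, using a toy config so it runs offline. The vocabulary sizes and the custom changes on top are assumptions until the author's notebook is published; with the real checkpoint you would load the published config and weights instead of building a random model.

```python
import torch
from transformers import BartConfig, BartForConditionalGeneration

# Toy configuration standing in for the dalle-mini one (assumption: the real
# model maps text tokens to VQGAN image tokens and is much larger than this).
config = BartConfig(
    vocab_size=50264,           # text vocabulary size (assumption)
    d_model=64,                 # toy hidden size for the sketch
    encoder_layers=2,
    decoder_layers=2,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_ffn_dim=128,
    decoder_ffn_dim=128,
)
model = BartForConditionalGeneration(config)

# One forward pass: encode a short token sequence, decode a single step.
input_ids = torch.tensor([[0, 5, 6, 2]])
out = model(input_ids=input_ids, decoder_input_ids=torch.tensor([[2]]))
print(out.logits.shape)  # (1, 1, vocab_size)
```

With the real repo you would replace the hand-built config with `BartConfig.from_pretrained("nev/dalle-mini-pytorch")` and load weights via `from_pretrained`, which is why the plain `AutoModelForSeq2SeqLM` call above fails without the extra changes.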

Hi,
Thanks a lot for posting the model!
I have a brief question: Is this a pytorch port of the whole trained dalle-mini model?
I would also be very grateful if you could share a notebook demonstrating the usage - I couldn't get it to run either.

It seems to work, but the output doesn't really match the prompt; maybe something changed in the past six months: https://colab.research.google.com/drive/1Blh-hTfhyry-YvitH8A95Duzwtm17Xz-?usp=sharing
I've fixed it now with some generation parameters.

Can this be closed now? Tell me if there are any issues.

nev changed discussion status to closed

Hi @nev, I tried running the notebook (https://colab.research.google.com/drive/1Blh-hTfhyry-YvitH8A95Duzwtm17Xz-?usp=sharing), but I cannot get it to match the prompt. Have you modified the notebook so that it eventually matches the prompt, or did you not fix that one?

The encoder does appear to be broken in some way; the decoder still works fine, as you can see in the fine-tuning example below. I'll try to figure out the root cause by converting the new DALL-E Mega.
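One quick way to see whether an encoder is broken after a weight conversion is to check that clearly different prompts produce different encoder outputs (a dead or mis-wired encoder often makes the decoder ignore the text). A hedged sketch on a toy model, not the author's actual debugging procedure:

```python
import torch
from transformers import BartConfig, BartModel

# Toy BART; with the real checkpoint you would load the converted weights.
config = BartConfig(
    d_model=32, encoder_layers=1, decoder_layers=1,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=64, decoder_ffn_dim=64,
)
model = BartModel(config).eval()

with torch.no_grad():
    a = model.encoder(input_ids=torch.tensor([[0, 5, 2]])).last_hidden_state
    b = model.encoder(input_ids=torch.tensor([[0, 9, 2]])).last_hidden_state

# A healthy encoder gives different outputs for different prompts.
diff = (a - b).abs().max().item()
print(diff > 0)
```

If the two prompts yield (near-)identical encoder states with the converted weights, the problem is in the encoder conversion rather than in the decoder or the generation parameters.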

A decoder-only version of the notebook (with image prompts):


https://colab.research.google.com/drive/15AshZ9_cjgkvbcN6KSkJhAKC63E7tSZi

Converting DALL-E Mega into the default Hugging Face BART is not possible; I might try with fairseq or fastseq.
