Inference API
Ah, I see yours isn't working either: the Inference API. I opened an issue in the book's Git repo, but I don't know how to fix it properly yet. I did manage to get inference running again by deleting "mask_token" from special_tokens_map.json.
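In case it helps, here is roughly what that workaround looks like as code. It's only a sketch: the local folder name below is a placeholder for wherever you saved the fine-tuned model.

import json
from pathlib import Path
from transformers import AutoTokenizer

# Placeholder path to the locally saved fine-tuned model
model_dir = Path("xlm-roberta-base-finetuned-panx-de")
stm_path = model_dir / "special_tokens_map.json"

# Drop the "mask_token" entry that seems to trip up tokenizers.AddedToken()
special_tokens = json.loads(stm_path.read_text())
special_tokens.pop("mask_token", None)
stm_path.write_text(json.dumps(special_tokens, indent=2))

# Reload the tokenizer; before the edit this raised the AddedToken error
tokenizer = AutoTokenizer.from_pretrained(model_dir)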
OK, I tried your suggestion of deleting "mask_token" from special_tokens_map.json directly on Hugging Face, but I am still getting the same error as before: Can't load tokenizer using from_pretrained, please update its configuration: tokenizers.AddedToken() got multiple values for keyword argument 'special'. Did you make any other changes as well?
I am trying another approach right now. I noticed that the working copies of "xlm-roberta-base-finetuned-panx-de" were built with different library versions, so I am re-installing the libraries and trying to track down the bug.
AHAHHHHHHH!!! I managed to get it to work. Basically, the install file in the Colab notebook (I don't know if you're using Colab) is faulty, if you can call it that: its installation requirements pin older library versions even though newer ones exist, and it is the newer libraries that work. You can try running this after running the default installation file:
#%%capture
!pip install transformers==4.41.2
!pip install datasets==2.20.0
!pip install pyarrow==16.0
!pip install requests==2.32.3
!pip install torch==2.3.0 torchvision==0.18.0 torchaudio==2.3.0
!pip install importlib-metadata
!pip install accelerate -U
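If you want to double-check that the newer versions actually took effect (after restarting the Colab runtime), a quick sanity check like this should do it; it's just a sketch that prints the loaded versions so you can compare them against the pins above:

import transformers, datasets, pyarrow, requests, torch
# Print the versions that actually got imported
for lib in (transformers, datasets, pyarrow, requests, torch):
    print(lib.__name__, lib.__version__)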
You can also refer to my notebook: https://colab.research.google.com/drive/1F5L_vL1o6WC3DxGWDF_g6ZPKTJ7dcmxR#scrollTo=r1SReYWcdRjZ
Have a nice day.
Do you have socials? Discord or anything? It would be nice to have someone to debug with.