Not loading word embedding weights after the addition of safetensors
I'm getting the warning below from code that loaded the model correctly before the PR that added safetensors 8 days ago:
Some weights of NorbertModel were not initialized from the model checkpoint at ltg/norbert3-base and are newly initialized: ['embedding.word_embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Can you look into it? This is a major issue.
It should be reproducible with just the lines below:
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('ltg/norbert3-base')
model = AutoModel.from_pretrained('ltg/norbert3-base', trust_remote_code=True)
This Stack Overflow question might have some more information: https://stackoverflow.com/questions/78019134/how-to-properly-save-the-finetuned-transformer-model-in-safetensors-without-losi
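For context, this is likely the class of failure the linked question describes: the safetensors format does not allow two names to point at the same underlying storage, so conversion tools deduplicate shared (tied) tensors and keep only one of the names. A minimal pure-Python sketch of that deduplication, with illustrative stand-in objects and parameter names (this is not NorBERT's actual conversion code):

```python
def dedup_shared(state_dict):
    """Keep only the first name for each underlying object,
    mimicking how shared (tied) tensors are collapsed when
    converting a checkpoint to safetensors."""
    seen = set()
    saved = {}
    for name, tensor in state_dict.items():
        if id(tensor) in seen:
            continue  # a tied duplicate: this name is dropped
        seen.add(id(tensor))
        saved[name] = tensor
    return saved

# the word embedding and the output head share one weight object
tied = [0.0] * 4  # stand-in for the shared weight tensor
state = {
    "lm_head.weight": tied,
    "embedding.word_embedding.weight": tied,  # same object, tied
}

saved = dedup_shared(state)
print(sorted(saved))  # ['lm_head.weight']
```

If the surviving name is not the one the custom model class expects, loading the converted checkpoint reports the other name ('embedding.word_embedding.weight' here) as newly initialized, which matches the warning above.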
Hi, thanks for letting us know! I have deleted the safetensors checkpoint for now. Note to self: don't blindly trust the official HuggingFace conversion code :)
Nps :)