How to solve the "Some weights were not used" problem

#27
by Zhuxing169

The code I used is listed below:

from transformers import pipeline

# Local checkpoint directory containing bert-base-chinese
checkpoint = r'E:\微调\模型\bert-base-chinese'
mask_model = pipeline("fill-mask", model=checkpoint)
text = '生活的真谛是[MASK]。'
output = mask_model(text)
print(output)

Then the following warning occurred:

Some weights of the model checkpoint at E:\微调\模型\bert-base-chinese were not used when initializing BertForMaskedLM: ['bert.pooler.dense.bias', 'bert.pooler.dense.weight', 'cls.seq_relationship.bias', 'cls.seq_relationship.weight']
- This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
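For what it's worth, the listed weights are the pooler and the next-sentence-prediction head (cls.seq_relationship), which exist in the pretraining checkpoint but are not part of BertForMaskedLM, so this looks like the expected case described in the message and fill-mask should still work. Below is a minimal sketch (using the same local checkpoint path as in the question) that loads BertForMaskedLM directly and checks which heads are actually present:

from transformers import BertForMaskedLM

checkpoint = r'E:\微调\模型\bert-base-chinese'  # same local directory as above

# Loading the MLM model directly prints the same message: the pooler and
# NSP head from the checkpoint are dropped because BertForMaskedLM does
# not define them.
model = BertForMaskedLM.from_pretrained(checkpoint)

param_names = [name for name, _ in model.named_parameters()]
print(any('pooler' in name for name in param_names))            # False: BertForMaskedLM has no pooler
print(any('cls.predictions' in name for name in param_names))   # True: the masked-LM head is loaded

Since the masked-LM head (cls.predictions.*) loads normally, the pipeline output should be unaffected by the warning.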
