Vocab size seems to be 151851, not 151936

by Carol0110

Error: Trying to set a tensor of shape torch.Size([151851, 4096]) in "weight" (which has shape torch.Size([151936, 4096])), this look incorrect.

Maybe the author called model.resize_token_embeddings() before training, so the checkpoint's embedding matrix has 151851 rows. Updating vocab_size from 151936 to 151851 in the config.json file may fix it.
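If you can't edit the repo's config.json directly, the same override can be applied at load time. A minimal sketch, assuming a transformers causal LM; the model id here is a placeholder:

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Placeholder repo id; substitute the actual model name.
model_id = "author/model"

# Load the declared config, then override vocab_size so it matches
# the checkpoint's embedding shape (151851 rows, not 151936).
config = AutoConfig.from_pretrained(model_id)
config.vocab_size = 151851

# With the corrected config, the 151851x4096 embedding tensor loads cleanly.
model = AutoModelForCausalLM.from_pretrained(model_id, config=config)
```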
