While doing SFT fine-tuning, I found a problem with tokenizer saving that needs a fix

#13
by shibing624 - opened

In tokenization_chatglm.py, line 137 references self.vocab_file, which does not exist; the line self.vocab_file = vocab_file needs to be added before line 73.
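A minimal sketch of the failure mode (not the actual ChatGLM code, class and method names here are illustrative): if `__init__` receives `vocab_file` but never assigns it to `self`, any later method that reads `self.vocab_file` (such as a vocabulary-saving method) raises `AttributeError`. The proposed one-line fix stores the path on the instance.

```python
class TokenizerBefore:
    """Illustrates the bug: vocab_file is used in __init__ but never stored."""

    def __init__(self, vocab_file):
        pass  # vocab_file consumed locally, not kept on self

    def save_vocabulary(self, save_directory):
        # Fails: self.vocab_file was never set
        return self.vocab_file


class TokenizerAfter:
    """Illustrates the fix: store the path before any other use."""

    def __init__(self, vocab_file):
        self.vocab_file = vocab_file  # the proposed one-line fix

    def save_vocabulary(self, save_directory):
        # Now succeeds, so the vocab file can be copied on save
        return self.vocab_file
```

With the fix applied, `save_vocabulary` can locate the original vocabulary file and copy it into the save directory, which is what breaks during SFT checkpoint saving without it.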

Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University org

Fixed.

zxdu20 changed discussion status to closed
