Adding `safetensors` variant of this model

#4
by ct-2 - opened

This is an automated PR created with https://huggingface.co/spaces/safetensors/convert

This new file is equivalent to pytorch_model.bin but safe in the sense that
no arbitrary code can be put into it.

These files also load much faster than their PyTorch counterparts:
https://colab.research.google.com/github/huggingface/notebooks/blob/main/safetensors_doc/en/speed.ipynb
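
For anyone evaluating the claim, here is a minimal sketch of the difference in practice, assuming a single-file checkpoint; the file names are placeholders. A `.safetensors` file holds only raw tensor data plus a JSON header, so loading it cannot execute code the way unpickling a `pytorch_model.bin` can, and it can be memory-mapped rather than copied.

```python
# Hedged sketch (not part of the automated PR): loading both formats.
# File names below are placeholders.
import torch
from safetensors.torch import load_file

# pytorch_model.bin is a pickle; torch.load can run code embedded in the
# file unless a recent PyTorch is used with weights_only=True.
bin_state = torch.load("pytorch_model.bin", map_location="cpu", weights_only=True)

# model.safetensors stores tensors plus a JSON header, so loading it
# cannot trigger code execution, and the file is memory-mapped.
safe_state = load_file("model.safetensors", device="cpu")

# The two checkpoints should expose the same parameter names.
assert bin_state.keys() == safe_state.keys()
```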

The widgets on your model page will run using this file even if the PR is not merged,
which verifies that the file actually works.

If you find any issues, please report them here: https://huggingface.co/spaces/safetensors/convert/discussions

Feel free to ignore this PR.

ct-2 changed pull request status to closed

Sorry, not accepting any PRs on Synthia models, to ensure that the model is actually what I trained. I'm also conscious of supply-chain attacks and don't want to leave room for any of that 💩.

Ah, apologies for the confusion. I was only trying to use these safetensors models for exllamav2, whose conversion script requires .safetensors versions of the model. I can grab the converted models from the PR branch, which is public in case others need them.

It looks like the 70B model won't convert though as it's quite large.
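
For anyone else hitting the exllamav2 requirement, the conversion can also be done locally instead of through the Space. Below is a minimal sketch assuming a single-file, unsharded `pytorch_model.bin`; the paths are placeholders, and a sharded checkpoint (such as a large 70B split across several files) would need each shard converted separately.

```python
# Hedged local conversion sketch: pickle checkpoint -> .safetensors.
# Assumes an unsharded checkpoint; paths are placeholders.
import torch
from safetensors.torch import save_file

state_dict = torch.load("pytorch_model.bin", map_location="cpu", weights_only=True)

# save_file rejects tensors that share storage, so give every tensor its
# own contiguous copy before writing.
state_dict = {name: t.contiguous().clone() for name, t in state_dict.items()}

save_file(state_dict, "model.safetensors")
```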
