Clarification on models and checkpoints linked in the GitHub repo

#1
by Filippo - opened

Dear Nomic,

What is the difference between:

  • the "quantized gpt4all model checkpoint: gpt4all-lora-quantized.bin"
  • the "Trained LoRA Weights: gpt4all-lora (four full epochs of training)" available here?

Aren't "trained weights" and "model checkpoints" the same thing?

Thank you.
