Link to the Python script or compiled C code for running the posted model checkpoint in inference mode?

#1
by MartialTerran

Hi. Could you please update the README to explicitly link to the model code (e.g., a Llama_model.py Python script or compiled C code) that runs inference with the parameters (binary weights, biases, and embeddings) you have posted here? Please take a few minutes to disambiguate which inference code can be used directly with the published parameter file(s). Could you also explicitly state which vocabulary (BPE) file was used to develop these parameters? Thank you.
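
For context, below is a minimal sketch of what I would try if the checkpoint follows the standard Hugging Face Llama layout (config.json, tokenizer files, and safetensors/bin weights loadable via the transformers library). The repo id is a placeholder, and whether this layout actually applies to the files posted here is exactly what I am asking you to confirm.

```python
# Minimal sketch, assuming the repo follows the standard Hugging Face Llama
# layout and is loadable with transformers. "OWNER/THIS-MODEL" is a
# hypothetical placeholder for the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OWNER/THIS-MODEL"  # placeholder; replace with the real repo id

# The tokenizer files (e.g. tokenizer.json / tokenizer.model) would also
# answer the BPE vocabulary question, if they are included in the repo.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)
model.eval()

prompt = "Hello, world"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If the checkpoint instead requires a custom script (such as a Llama_model.py) or a compiled C runner, a direct URL to that code in the README would remove the ambiguity.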
