---
license: other
language:
  - en
pipeline_tag: text2text-generation
tags:
  - code
---

## Introduction

This is a model repo that mainly provides sample models for each version of LLamaSharp. The models can also be used with llama.cpp or other engines.

Since llama.cpp frequently introduces breaking changes, it can take users (of LLamaSharp and other engines) a lot of time to find a model that runs with their version. This repo aims to make that easier.

## Models

We would appreciate it if you could provide information about the models whose entries are still incomplete (such as download links, model sources, etc.).

## Usage

First, choose the branch whose name matches your LLamaSharp backend version. For example, if you're using LLamaSharp.Backend.Cuda11 v0.3.0, use the v0.3.0 branch of this repo.

Then download a model you like and follow the LLamaSharp instructions to run it; a rough sketch is shown below.
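As a minimal sketch only: the snippet below assumes a recent LLamaSharp release (roughly 0.5 or later) with the `ModelParams` / `LLamaWeights` / `InteractiveExecutor` API; older branches such as v0.3.0 use a different `LLamaModel`-based API, so always follow the docs for the version you picked. The model filename and parameter values are placeholders, not files guaranteed to be in this repo.

```csharp
using LLama;
using LLama.Common;

// Placeholder path to a model file downloaded from this repo.
var modelPath = "models/sample-model.gguf";

// Basic loading parameters; adjust context size / GPU layers for your hardware.
var parameters = new ModelParams(modelPath)
{
    ContextSize = 1024,
    GpuLayerCount = 0   // set > 0 when using a CUDA backend
};

using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

// Stream the generated tokens for a simple prompt.
await foreach (var token in executor.InferAsync(
    "Question: What is LLamaSharp? Answer:",
    new InferenceParams { MaxTokens = 64 }))
{
    Console.Write(token);
}
```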

## Contributing

Any kind of contribution is welcome! You don't have to upload a model; providing some information also helps a lot. For example, if you know where to download the pth file of Vicuna, please tell us via the Community tab and we'll add it to the list!