TheBloke committed
Commit fe62f57
1 Parent(s): 95c64cd

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ I can't guarantee that the two 128g files will work in only 40GB of VRAM.
 
 I haven't specifically tested VRAM requirements yet but will aim to do so at some point. If you have any experiences to share, please do so in the comments.
 
-If you want to try CPU inference instead, you can try my GGML repo instead: [TheBloke/alpaca-lora-65B-GGML](https://huggingface.co/TheBloke/alpaca-lora-65B-GGML).
+If you want to try CPU inference instead, check out my GGML repo: [TheBloke/alpaca-lora-65B-GGML](https://huggingface.co/TheBloke/alpaca-lora-65B-GGML).
 
 ## GIBBERISH OUTPUT IN `text-generation-webui`?