Spaces: Paused

Daniel Marques committed
Commit d0445bc · Parent(s): 2322597

fix: memory error
Files changed:
- SOURCE_DOCUMENTS/dataset.txt +0 -5
- constants.py +2 -2
SOURCE_DOCUMENTS/dataset.txt CHANGED

@@ -26055,8 +26055,3 @@ Nigeria;177;5.00;5.00
 Niger;178;1.50;1.50
 Central African Republic;179;NA;NA
 Chad;179;NA;NA
-
-
-
-
-
constants.py CHANGED

@@ -37,8 +37,8 @@ MAX_NEW_TOKENS = 1024 # int(CONTEXT_WINDOW_SIZE/4)
 
 #### If you get a "not enough space in the buffer" error, you should reduce the values below, start with half of the original values and keep halving the value until the error stops appearing
 
-N_GPU_LAYERS =
-N_BATCH =
+N_GPU_LAYERS = 20 # Llama-2-70B has 83 layers
+N_BATCH = 512
 
 ### From experimenting with the Llama-2-7B-Chat-GGML model on 8GB VRAM, these values work:
 # N_GPU_LAYERS = 20
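The tuning procedure in the comment above (start with half of the original value, keep halving until the buffer error stops) can be sketched as a small helper. `tune_until_fits` and its `fits_in_memory` callback are hypothetical names for illustration, not part of this repo; in practice the callback would attempt a model load with the candidate value.

```python
def tune_until_fits(value, fits_in_memory, floor=1):
    """Halve `value` until `fits_in_memory(value)` succeeds.

    `fits_in_memory` is a placeholder: in practice it would try loading
    the model with the candidate N_GPU_LAYERS / N_BATCH value and report
    whether the "not enough space in the buffer" error appeared.
    """
    value //= 2  # start with half of the original value, per the comment
    while value >= floor:
        if fits_in_memory(value):
            return value
        value //= 2  # keep halving until the error stops appearing
    raise RuntimeError("No value fits; the model may be too large for this GPU.")

# Example: pretend any batch size of 512 or less loads cleanly.
n_batch = tune_until_fits(2048, lambda v: v <= 512)  # tries 1024, then 512
```

Starting from the committed `N_BATCH = 512`, the same loop would try 256, 128, and so on for GPUs with less memory than the 8GB setup mentioned in the comments.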