Merge branch 'main' of https://huggingface.co/cicdatopea/Llama-3.2-11B-Vision-Instruct-int4-sym-inc into main
README.md CHANGED
@@ -99,13 +99,13 @@ pip3 install git+https://github.com/open-compass/VLMEvalKit.git@7de2dcb. The eva
```bash
auto-round-mllm --eval --model OPEA/Llama-3.2-11B-Vision-Instruct-int4-sym-inc --tasks MMBench_DEV_EN_V11,ScienceQA_VAL,TextVQA_VAL,POPE --output_dir "./eval_result"
```
-|Metric |16bits|
-
-|avg |66.05 |67.81 |
-|MMBench_DEV_EN_V11 |52.86 |53.48 |
-|ScienceQA_VAL |68.86 |70.39 |
-|TextVQA_VAL |54.49 |59.62 |
-|POPE |88.00 |87.76 |
+|Metric |16bits|Llava Calib INT4|
+|:-------------------|:------|:------|
+|avg |66.05 |67.81 |
+|MMBench_DEV_EN_V11 |52.86 |53.48 |
+|ScienceQA_VAL |68.86 |70.39 |
+|TextVQA_VAL |54.49 |59.62 |
+|POPE |88.00 |87.76 |

### Generate the model
Here is the sample command to reproduce the model.
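The hunk ends before the reproduction command itself appears. For orientation only, a minimal quantization sketch with the `auto-round-mllm` CLI is given below; the flag values (bits, group size, format, output path) are assumptions and are not taken from this commit — the README's own "Generate the model" section has the exact command.

```bash
# Hypothetical sketch (not from this commit): quantize the base model to INT4
# with auto-round-mllm. All flag values here are assumptions for illustration.
auto-round-mllm \
    --model meta-llama/Llama-3.2-11B-Vision-Instruct \
    --bits 4 \
    --group_size 128 \
    --format auto_round \
    --output_dir "./tmp_autoround"
```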