weiweiz1 committed
Commit ef3c186
Parents: 7866ac7, c63c6ac

Merge branch 'main' of https://huggingface.co/cicdatopea/Llama-3.2-11B-Vision-Instruct-int4-sym-inc into main

Files changed (1)
  1. README.md +7 -7
README.md CHANGED
@@ -99,13 +99,13 @@ pip3 install git+https://github.com/open-compass/VLMEvalKit.git@7de2dcb. The eva
 ```bash
 auto-round-mllm --eval --model OPEA/Llama-3.2-11B-Vision-Instruct-int4-sym-inc --tasks MMBench_DEV_EN_V11,ScienceQA_VAL,TextVQA_VAL,POPE --output_dir "./eval_result"
 ```
-|Metric             |16bits|Pile Calib INT4|Llava Calib INT4|
-|:------------------|:-----|:--------------|:---------------|
-|avg                |66.05 |67.81          |66.02           |
-|MMBench_DEV_EN_V11 |52.86 |53.48          |52.17           |
-|ScienceQA_VAL      |68.86 |70.39          |69.15           |
-|TextVQA_VAL        |54.49 |59.62          |55.07           |
-|POPE               |88.00 |87.76          |87.71           |
+|Metric             |16bits|Llava Calib INT4|
+|:------------------|:-----|:---------------|
+|avg                |66.05 |67.81           |
+|MMBench_DEV_EN_V11 |52.86 |53.48           |
+|ScienceQA_VAL      |68.86 |70.39           |
+|TextVQA_VAL        |54.49 |59.62           |
+|POPE               |88.00 |87.76           |
 
 ### Generate the model
 Here is the sample command to reproduce the model.
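
Note on the table: the avg row is the arithmetic mean of the four benchmark scores, e.g. (53.48 + 70.39 + 59.62 + 87.76) / 4 ≈ 67.81 for the INT4 column.

For context, a minimal sketch of reproducing the evaluation end to end, using the VLMEvalKit commit pinned in the README and assuming the `auto-round` PyPI package is what ships the `auto-round-mllm` entry point:

```bash
# Evaluation harness, pinned to the commit named in the README.
pip3 install git+https://github.com/open-compass/VLMEvalKit.git@7de2dcb
# Assumption: installing auto-round provides the auto-round-mllm CLI.
pip3 install auto-round

# Run the four benchmarks from the table against the INT4 checkpoint.
auto-round-mllm --eval \
  --model OPEA/Llama-3.2-11B-Vision-Instruct-int4-sym-inc \
  --tasks MMBench_DEV_EN_V11,ScienceQA_VAL,TextVQA_VAL,POPE \
  --output_dir "./eval_result"
```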