jiajunlong committed
Commit
9a338dc
1 Parent(s): 9b164ba

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -29,7 +29,7 @@ print('runing time:', genertaion_time)
 | model_name | gqa | textvqa | sqa | vqav2 | MME | MMB | MM-VET |
 | :----------------------------------------------------------: | ----- | ------- | ----- | ----- | ------- | ----- | ------ |
 | [TinyLLaVA-1.5B](https://huggingface.co/bczhou/TinyLLaVA-1.5B) | 60.3 | 51.7 | 60.3 | 76.9 | 1276.5 | 55.2 | 25.8 |
-| [TinyLLaVA-0.55B](https://huggingface.co/jiajunlong/TinyLLaVA-0.89B) | 53.87 | 44.02 | 54.09 | 71.74 | 1118.75 | 37.8 | 20 |
+| [TinyLLaVA-0.89B](https://huggingface.co/jiajunlong/TinyLLaVA-OpenELM-450M-SigLIP-0.89B) | 53.87 | 44.02 | 54.09 | 71.74 | 1118.75 | 37.8 | 20 |
 
 P.S. [TinyLLaVA Factory](https://github.com/TinyLLaVA/TinyLLaVA_Factory) is an open-source modular codebase for small-scale LMMs with a focus on simplicity of code implementations, extensibility of new features, and reproducibility of training results. This code repository provides standard training&evaluating pipelines, flexible data preprocessing&model configurations, and easily extensible architectures. Users can customize their own LMMs with minimal coding effort and less coding mistake.
 TinyLLaVA Factory integrates a suite of cutting-edge models and methods.
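
For context on the hunk header above, which quotes `print('runing time:', genertaion_time)` from the README's usage snippet: below is a minimal, hypothetical sketch of how a TinyLLaVA model card like this one is typically loaded and queried through `transformers` with `trust_remote_code`. The `chat` helper, its keyword arguments, and its `(output_text, generation_time)` return value are assumptions inferred from that quoted line; this diff only confirms the single printed statement.

```python
# Hypothetical usage sketch; only the printed line quoted in the hunk header
# comes from the README itself, the rest is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

hf_path = "jiajunlong/TinyLLaVA-OpenELM-450M-SigLIP-0.89B"

# trust_remote_code pulls in the model's custom multimodal code from the Hub.
model = AutoModelForCausalLM.from_pretrained(hf_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(hf_path, use_fast=False)

prompt = "What are these?"  # example question about the image
image_url = "http://images.cocodataset.org/test-stuff2017/000000000001.jpg"

# Assumed helper exposed by the remote code: returns the generated answer and
# the time spent generating it (the README spells the latter `genertaion_time`).
output_text, generation_time = model.chat(prompt=prompt, image=image_url, tokenizer=tokenizer)

print("model output:", output_text)
print("running time:", generation_time)
```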