Link model to paper

#10
by nielsr - opened
Files changed (1)
  1. README.md +1 -1
@@ -19,7 +19,7 @@ library_name: transformers
 # Molmo 72B
 
 Molmo is a family of open vision-language models developed by the Allen Institute for AI. Molmo models are trained on PixMo, a dataset of 1 million, highly-curated image-text pairs. It has state-of-the-art performance among multimodal models with a similar size while being fully open-source. You can find all models in the Molmo family [here](https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19).
-**Learn more** about the Molmo family [in our announcement blog post](https://molmo.allenai.org/blog).
+**Learn more** about the Molmo family [in our announcement blog post](https://molmo.allenai.org/blog). See also the [paper](https://huggingface.co/papers/2409.17146).
 
 Molmo 72B is based on [Qwen2-72B](https://huggingface.co/Qwen/Qwen2-72B) and uses [OpenAI CLIP](https://huggingface.co/openai/clip-vit-large-patch14-336) as vision backbone.
 Molmo-72B achieves the highest academic benchmark score and ranks second on human evaluation, just slightly behind GPT-4o.