numbmelon committed
Commit cd2abba
1 Parent(s): 36e4da5

Update README.md

Files changed (1)
  1. README.md +8 -5
README.md CHANGED
@@ -9,7 +9,7 @@ pipeline_tag: image-text-to-text
 
 <div align="center">
 
-[\[🏠Homepage\]](https://osatlas.github.io) [\[💻Code\]](https://github.com/OS-Copilot/OS-Atlas) [\[🚀Quick Start\]](#quick-start) [\[📝Paper\]](https://arxiv.org/abs/2410.23218) [\[🤗Models\]](https://huggingface.co/collections/OS-Copilot/os-atlas-67246e44003a1dfcc5d0d045) [\[🤗ScreenSpot-v2\]](https://huggingface.co/datasets/OS-Copilot/ScreenSpot-v2)
+[\[🏠Homepage\]](https://osatlas.github.io) [\[💻Code\]](https://github.com/OS-Copilot/OS-Atlas) [\[🚀Quick Start\]](#quick-start) [\[📝Paper\]](https://arxiv.org/abs/2410.23218) [\[🤗Models\]](https://huggingface.co/collections/OS-Copilot/os-atlas-67246e44003a1dfcc5d0d045) [\[🤗Data\]](https://huggingface.co/datasets/OS-Copilot/OS-Atlas-data) [\[🤗ScreenSpot-v2\]](https://huggingface.co/datasets/OS-Copilot/ScreenSpot-v2)
 
 </div>
 
@@ -23,14 +23,16 @@ For GUI grounding tasks, you can use:
 - [OS-Atlas-Base-4B](https://huggingface.co/OS-Copilot/OS-Atlas-Base-4B)
 
 For generating single-step actions in GUI agent tasks, you can use:
-- [OS-Atlas-Action-7B](https://huggingface.co/OS-Copilot/OS-Atlas-Action-7B)
-- [OS-Atlas-Action-4B](https://huggingface.co/OS-Copilot/OS-Atlas-Action-4B)
+- [OS-Atlas-Pro-7B](https://huggingface.co/OS-Copilot/OS-Atlas-Pro-7B)
+- [OS-Atlas-Pro-4B](https://huggingface.co/OS-Copilot/OS-Atlas-Pro-4B)
 
 
 ## OS-Atlas-Action-7B
 
 `OS-Atlas-Action-7B` is a GUI action model finetuned from OS-Atlas-Base-7B. Given a system prompt, the available basic and custom actions, and a task instruction, the model generates its reasoning (`thought`) and the appropriate next step to execute (`action`).
 
+Note that the released `OS-Atlas-Pro-7B` model is the one described in Section 5.4 of the paper. Compared with the OS-Atlas models in Tables 4 and 5, the Pro model offers better generalizability and performance because it is not constrained to specific tasks or training datasets merely to satisfy particular experimental conditions such as OOD and SFT. Releasing a single Pro checkpoint also keeps us from flooding Hugging Face with more than 20 distinct model checkpoints.
+
 ### Installation
 To use `OS-Atlas-Action-7B`, first install the necessary dependencies:
 ```bash
@@ -39,8 +41,9 @@ pip install qwen-vl-utils
 ```
 
 ### Example Inference Code
-Below is an example of how to perform inference using the model:
+First download the [example image](https://github.com/OS-Copilot/OS-Atlas/blob/main/examples/images/action_example_1.jpg) and save it to the current directory.
 
+Below is an example of how to perform inference using the model:
 ```python
 from transformers import Qwen2VLForConditionalGeneration, AutoProcessor
 from qwen_vl_utils import process_vision_info
@@ -134,7 +137,7 @@ messages = [
             },
             {
                 "type": "image",
-                "image": "https://github.com/OS-Copilot/OS-Atlas/blob/main/exmaples/images/action_example_1.jpg",
+                "image": "./action_example_1.jpg",
             },
             {"type": "text", "text": "Task instruction: to allow the user to enter their first name\nHistory: null" },
         ],
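
For reference, the diff above shows only the imports and the changed image entry of the inference example; the rest of the script lies outside the changed hunks. The sketch below fills in the surrounding steps with the standard Qwen2-VL generation flow so the updated example can be run end to end. It is an illustration rather than the README's exact code: the checkpoint ID `OS-Copilot/OS-Atlas-Pro-7B`, the placeholder system prompt, and `max_new_tokens=256` are assumptions, and the real README defines a full system prompt with the available actions in the portion not shown here.

```python
# Minimal sketch of the standard Qwen2-VL inference flow around the snippet in the diff.
# Assumptions: the checkpoint ID, the placeholder system prompt, and max_new_tokens are
# illustrative; the actual README supplies a full system prompt defining the action space.
from transformers import Qwen2VLForConditionalGeneration, AutoProcessor
from qwen_vl_utils import process_vision_info

model_id = "OS-Copilot/OS-Atlas-Pro-7B"  # assumed checkpoint ID
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

messages = [
    {
        "role": "user",
        "content": [
            # The README's full system prompt (basic/custom actions, output format)
            # goes here; this placeholder only marks its position.
            {"type": "text", "text": "<system prompt with action definitions>"},
            # Image saved locally, as the updated README now expects.
            {"type": "image", "image": "./action_example_1.jpg"},
            {"type": "text", "text": "Task instruction: to allow the user to enter their first name\nHistory: null"},
        ],
    }
]

# Standard Qwen2-VL preprocessing: chat template plus vision inputs.
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
    text=[text],
    images=image_inputs,
    videos=video_inputs,
    padding=True,
    return_tensors="pt",
).to(model.device)

# Generate the model's response and decode only the newly generated tokens.
generated_ids = model.generate(**inputs, max_new_tokens=256)
trimmed = [out[len(inp):] for inp, out in zip(inputs.input_ids, generated_ids)]
print(processor.batch_decode(trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0])
```

With the image saved as `./action_example_1.jpg`, the decoded output should contain the model's `thought` followed by its chosen `action` for the next step, as described in the README text above.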