MarcusLoren committed
Commit c0525bf
1 Parent(s): eb1d0bc

Update README.md

Files changed (1): README.md +18 -9
README.md CHANGED
@@ -12,7 +12,12 @@ For all purposes and definitions the autoencoder is the **world's first** published
 ## Model Details
 The autoencoder (tokenizer) is a relatively small model with 50M parameters; the transformer model uses 184M parameters, and its core is based on GPT2-small.
 Due to hardware constraints it was trained with a codebook/vocabulary size of 2048.<br/>
-Developed by: Me (with credits for the MeshGPT codebase to [Phil Wang](https://github.com/lucidrains))

 ### Warning:
 This model was created without any sponsors or rented GPU hardware, so its generation capability is very limited.
@@ -36,15 +41,20 @@ device = "cuda" if torch.cuda.is_available() else "cpu"
 transformer = MeshTransformer.from_pretrained("MarcusLoren/MeshGPT_tiny_alpha").to(device)

 output = []
-for text in ['bed', 'chair']:
-    face_coords, face_mask = transformer.generate(texts = [text], temperature = 0.0)
-    # (batch, num faces, vertices (3), coordinates (3)), (batch, num faces)
-    output.append(face_coords)

-mesh_render.combind_mesh(f'./render.obj', output)

 ```

 Random samples generated by text only:
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/657e233acec775bfe0d5cbc6/UH1r5s9Lfj4sUSgClqhrf.png)
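The comment in the removed loop documents the shapes `generate` returns: `face_coords` as (batch, num faces, 3 vertices, 3 coordinates) and `face_mask` as (batch, num faces). As an illustration of how such a mask would filter out padded faces, here is a plain-Python sketch; the `mask_faces` helper and the nested-list data are hypothetical, not part of meshgpt-pytorch:

```python
# Sketch: filtering generated faces with a (batch, num_faces) mask.
# Shapes follow the comment in the snippet above; the data is made up.

def mask_faces(face_coords, face_mask):
    """Keep only the faces whose mask entry is True, per batch item."""
    kept = []
    for faces, mask in zip(face_coords, face_mask):
        kept.append([f for f, keep in zip(faces, mask) if keep])
    return kept

# One batch item with two faces; the second face is padding.
coords = [[[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
           [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]]]
mask = [[True, False]]
print(len(mask_faces(coords, mask)[0]))  # prints 1: one valid face remains
```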
@@ -69,5 +79,4 @@ The tokens generated by the transformer can then be converted into a 3D mesh using
 The idea for MeshGPT came from the paper (https://arxiv.org/abs/2311.15475), but its creators didn't release any code or model.
 Phil Wang (https://github.com/lucidrains) drew inspiration from the paper, made many improvements over the paper's implementation, and created the repo: https://github.com/lucidrains/meshgpt-pytorch
 My goal has been to figure out how to train MeshGPT and bring it into reality. <br/>
-See my github repo for a notebook on how to get started training your own MeshGPT! [MarcusLoppe/meshgpt-pytorch](https://github.com/MarcusLoppe/meshgpt-pytorch/)
-
 
 ## Model Details
 The autoencoder (tokenizer) is a relatively small model with 50M parameters; the transformer model uses 184M parameters, and its core is based on GPT2-small.
 Due to hardware constraints it was trained with a codebook/vocabulary size of 2048.<br/>
+Developed & trained by: Me, with credits for the MeshGPT codebase to [Phil Wang](https://github.com/lucidrains)
+
+## Performance:
+CPU: 10 triangles/s<br/>
+3060 GPU: 40 triangles/s<br/>
+4090 GPU: 110 triangles/s<br/>

 ### Warning:
 This model was created without any sponsors or rented GPU hardware, so its generation capability is very limited.
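Given the throughput figures added above, generation time can be estimated with simple arithmetic. A back-of-the-envelope sketch follows; the `eta_seconds` helper and the assumption that speed stays constant per triangle are mine, not from the repo:

```python
# Rough generation-time estimate from the stated throughput figures
# (triangles per second). Assumes a constant per-triangle rate.

RATES = {"cpu": 10, "3060": 40, "4090": 110}  # triangles/s, from the README

def eta_seconds(num_triangles, device="cpu"):
    """Estimated seconds to generate a mesh with the given triangle count."""
    return num_triangles / RATES[device]

# A ~500-triangle mesh on a 4090:
print(round(eta_seconds(500, "4090"), 1))  # prints 4.5 (seconds)
```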
 
 transformer = MeshTransformer.from_pretrained("MarcusLoren/MeshGPT_tiny_alpha").to(device)

 output = []
+output.append(transformer.generate(texts = ['sofa', 'bed', 'computer screen', 'bench', 'chair', 'table'], temperature = 0.0))
+output.append(transformer.generate(texts = ['milk carton', 'door', 'shovel', 'heart', 'trash can', 'ladder'], temperature = 0.0))
+output.append(transformer.generate(texts = ['hammer', 'pedestal', 'pickaxe', 'wooden cross', 'coffee bean', 'crowbar'], temperature = 0.0))
+output.append(transformer.generate(texts = ['key', 'minecraft character', 'dragon head', 'open book', 'minecraft turtle', 'wooden table'], temperature = 0.0))
+output.append(transformer.generate(texts = ['gun', 'ice cream cone', 'axe', 'helicopter', 'shotgun', 'plastic bottle'], temperature = 0.0))

+mesh_render.save_rendering(f'./render.obj', output)

 ```
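The snippet saves everything through `mesh_render.save_rendering`, which writes a Wavefront OBJ file. For readers curious what that format looks like, here is a minimal, hypothetical writer for triangles shaped (num faces, 3 vertices, 3 coordinates); this is an illustration of the OBJ format only, not the repo's implementation:

```python
# Minimal Wavefront OBJ writer for triangle faces shaped
# (num_faces, 3 vertices, 3 coordinates). Illustration only; use the
# repo's mesh_render.save_rendering for real output.

def faces_to_obj(faces):
    """Return OBJ text: one 'v' line per vertex, one 'f' line per triangle."""
    lines = []
    for tri in faces:
        for x, y, z in tri:
            lines.append(f"v {x} {y} {z}")
    for i in range(len(faces)):
        base = 3 * i  # OBJ vertex indices are 1-based
        lines.append(f"f {base + 1} {base + 2} {base + 3}")
    return "\n".join(lines)

tri = [[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]]
print(faces_to_obj(tri).splitlines()[-1])  # prints "f 1 2 3"
```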
+## Expected output:
+
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/657e233acec775bfe0d5cbc6/K04Qj_xgwmNT_MldTA1l8.png)
+

 Random samples generated by text only:
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/657e233acec775bfe0d5cbc6/UH1r5s9Lfj4sUSgClqhrf.png)
 The idea for MeshGPT came from the paper (https://arxiv.org/abs/2311.15475), but its creators didn't release any code or model.
 Phil Wang (https://github.com/lucidrains) drew inspiration from the paper, made many improvements over the paper's implementation, and created the repo: https://github.com/lucidrains/meshgpt-pytorch
 My goal has been to figure out how to train MeshGPT and bring it into reality. <br/>
+See my github repo for a notebook on how to get started training your own MeshGPT! [MarcusLoppe/meshgpt-pytorch](https://github.com/MarcusLoppe/meshgpt-pytorch/)