codestella committed
Commit 2809e52
1 Parent(s): ac55c81

README change

.idea/.gitignore ADDED
@@ -0,0 +1,8 @@
+ # Default ignored files
+ /shelf/
+ /workspace.xml
+ # Datasource local storage ignored files
+ /dataSources/
+ /dataSources.local.xml
+ # Editor-based HTTP Client requests
+ /httpRequests/
.idea/inspectionProfiles/Project_Default.xml ADDED
@@ -0,0 +1,18 @@
+ <component name="InspectionProjectProfileManager">
+ <profile version="1.0">
+ <option name="myName" value="Project Default" />
+ <inspection_tool class="Eslint" enabled="true" level="WARNING" enabled_by_default="true" />
+ <inspection_tool class="PyPackageRequirementsInspection" enabled="true" level="WARNING" enabled_by_default="true">
+ <option name="ignoredPackages">
+ <value>
+ <list size="4">
+ <item index="0" class="java.lang.String" itemvalue="tensorboard" />
+ <item index="1" class="java.lang.String" itemvalue="shapely" />
+ <item index="2" class="java.lang.String" itemvalue="geopandas" />
+ <item index="3" class="java.lang.String" itemvalue="sklearn" />
+ </list>
+ </value>
+ </option>
+ </inspection_tool>
+ </profile>
+ </component>
.idea/inspectionProfiles/profiles_settings.xml ADDED
@@ -0,0 +1,6 @@
+ <component name="InspectionProjectProfileManager">
+ <settings>
+ <option name="USE_PROJECT_PROFILE" value="false" />
+ <version value="1.0" />
+ </settings>
+ </component>
.idea/misc.xml ADDED
@@ -0,0 +1,4 @@
+ <?xml version="1.0" encoding="UTF-8"?>
+ <project version="4">
+ <component name="ProjectRootManager" version="2" project-jdk-name="Remote Python 3.8.8 (sftp://stella@147.46.112.43:8022/home/stella/anaconda3/envs/python3.8-pytorch17.1/bin/python)" project-jdk-type="Python SDK" />
+ </project>
.idea/modules.xml ADDED
@@ -0,0 +1,8 @@
+ <?xml version="1.0" encoding="UTF-8"?>
+ <project version="4">
+ <component name="ProjectModuleManager">
+ <modules>
+ <module fileurl="file://$PROJECT_DIR$/.idea/putting-nerf-on-a-diet.iml" filepath="$PROJECT_DIR$/.idea/putting-nerf-on-a-diet.iml" />
+ </modules>
+ </component>
+ </project>
.idea/putting-nerf-on-a-diet.iml ADDED
@@ -0,0 +1,8 @@
+ <?xml version="1.0" encoding="UTF-8"?>
+ <module type="PYTHON_MODULE" version="4">
+ <component name="NewModuleRootManager">
+ <content url="file://$MODULE_DIR$" />
+ <orderEntry type="inheritedJdk" />
+ <orderEntry type="sourceFolder" forTests="false" />
+ </component>
+ </module>
.idea/vcs.xml ADDED
@@ -0,0 +1,6 @@
+ <?xml version="1.0" encoding="UTF-8"?>
+ <project version="4">
+ <component name="VcsDirectoryMappings">
+ <mapping directory="$PROJECT_DIR$" vcs="Git" />
+ </component>
+ </project>
README.md CHANGED
@@ -1,16 +1,13 @@
# Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis Implementation

- <img width="586" alt="스크린샷 2021-07-04 오후 4 11 51" src="https://user-images.githubusercontent.com/77657524/124376591-b312b780-dce2-11eb-80ad-9129d6f5eedb.png">

- the Pytorch, JAX/Flax based code implementation of this paper : https://arxiv.org/abs/2104.00677
- The model gives the 3D neural scene representation (NeRF: Neural Radiances Field) estimated from a few images.
- Which is based on extracting the semantic information using a pre-trained visual encoder such as CLIP, a Vision Transformer
-
- Our Project is started in the HuggingFace X GoogleAI (JAX) Community Week Event.
- https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104
-
- ## Hugging Face Hub Repo URL:

We will also upload our project on the Hugging Face Hub Repository.
[https://huggingface.co/flax-community/putting-nerf-on-a-diet/](https://huggingface.co/flax-community/putting-nerf-on-a-diet/)
@@ -49,7 +46,7 @@ Our JAX/Flax implementation currently supports:
</tbody>
</table>

- ## Installation

```bash
# Clone the repo
@@ -68,12 +65,27 @@ pip install --upgrade jax jaxlib==0.1.57+cuda101 -f https://storage.googleapis.c
pip install flax transformer[flax]
```

- ## Dataset
Download the datasets from the [NeRF official Google Drive](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1).
Please download the `nerf_synthetic.zip` and unzip them
in the place you like. Let's assume they are placed under `/tmp/jaxnerf/data/`.

- ## How to use
```
python -m train \
--data_dir=/PATH/TO/YOUR/SCENE/DATA \ % e.g., nerf_synthetic/lego
@@ -82,33 +94,78 @@ python -m train \
```
You can toggle the semantic loss by “use_semantic_loss” in configuration files.

- ## Rendered examples by 8-shot learned Diet-NeRF
- - Lego
- - Chair

- ## Rendered examples by occluded 14-shot learned NeRF and Diet-NeRF
This result is on the quite initial state and expected to be improved.

- ### Training poses
<img width="1400" src="https://user-images.githubusercontent.com/26036843/126111980-4f332c87-a7f0-42e0-a355-8e77621bbca4.png">

- ### Rendered novel poses
<img width="800" src="https://user-images.githubusercontent.com/26036843/126113080-a6a48f3d-2629-4efc-a740-fe908ca6b5c3.png">


- ## Demo
[https://huggingface.co/spaces/flax-community/DietNerf-Demo](https://huggingface.co/spaces/flax-community/DietNerf-Demo)

- ## Our Teams

| Teams | Members |
|------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------|
- | NeRF Team | Leader : [JaeYoung Chung](https://github.com/robot0321), Members : [Stella Yang](https://github.com/codestella), [Alex Lau](https://github.com/riven314), [Haswanth Aekula](https://github.com/hassiahk), [Hyunkyu Kim](https://github.com/minus31) |
- | CLIP Team | Leader : [Sunghyun Kim](https://github.com/MrBananaHuman), Members : [Seunghyun Lee](https://github.com/sseung0703), [Sasikanth Kotti](https://github.com/ksasi), [Khali Sifullah](https://github.com/khalidsaifullaah) |
- | Cloud TPU Team | Leader : [Alex Lau](https://github.com/riven314), Members : [JaeYoung Chung](https://github.com/robot0321), [Sunghyun Kim](https://github.com/MrBananaHuman), [Aswin Pyakurel](https://github.com/masapasa) |
| Project Managing | [Stella Yang](https://github.com/codestella) To Watch Our Project Progress, Please Check [Our Project Notion](https://www.notion.so/Putting-NeRF-on-a-Diet-e0caecea0c2b40c3996c83205baf870d) |

- ## References
This project is based on “JAX-NeRF”.
```
@software{jaxnerf2020github,
@@ -132,5 +189,17 @@ This project is based on “JAX-NeRF”.
}
```

- ## License
- [Apache License 2.0](https://github.com/codestella/putting-nerf-on-a-diet/blob/main/LICENSE)
# Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis Implementation
+ ### **WARNING: this README is not complete yet (expected by Thursday)**
+ <p align="center"><img width="450" alt="Screenshot 2021-07-04 4.11.51 PM" src="https://user-images.githubusercontent.com/77657524/126361638-4aad58e8-4efb-4fc5-bf78-f53d03799e1e.png"></p>

+ This is the PyTorch and JAX/Flax based implementation of the paper [Putting NeRF on a Diet: Ajay Jain, Matthew Tancik, Pieter Abbeel, arXiv: https://arxiv.org/abs/2104.00677].
+ The model generates novel view renderings from a neural radiance field (NeRF) estimated with few-shot learning.
+ A semantic loss based on pre-trained CLIP Vision Transformer embeddings provides 2D supervision for the 3D representation, and it outperforms the original NeRF at few-shot 3D reconstruction.

+ ## 🤗 Hugging Face Hub Repo URL:
We will also upload our project on the Hugging Face Hub Repository.
[https://huggingface.co/flax-community/putting-nerf-on-a-diet/](https://huggingface.co/flax-community/putting-nerf-on-a-diet/)
</tbody>
</table>

+ ## 💻 Installation

```bash
# Clone the repo

pip install flax transformer[flax]
```
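After installing, it can help to confirm that JAX actually sees your accelerator before launching training. This is a generic sanity check, not a script from this repository:

```python
# Generic JAX sanity check (not part of this repository):
# verify that the GPU/TPU backend is visible before training.
import jax

print(jax.devices())              # e.g. [GpuDevice(id=0), ...]
print(jax.local_device_count())   # number of local accelerators usable by pmap
```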

+ ## Dataset & Methods
Download the datasets from the [NeRF official Google Drive](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1).
Please download the `nerf_synthetic.zip` and unzip them
in the place you like. Let's assume they are placed under `/tmp/jaxnerf/data/`.

+ <p align="center"><img width="400" alt="Screenshot 2021-07-04 4.11.51 PM" src="https://user-images.githubusercontent.com/77657524/124376591-b312b780-dce2-11eb-80ad-9129d6f5eedb.png"></p>
+
+ Based on the principle that “a bulldozer is a bulldozer from any perspective”, our proposed DietNeRF supervises the radiance field from arbitrary poses
+ (DietNeRF cameras). This is possible because we compute a semantic consistency loss in a feature space capturing high-level
+ scene attributes, not in pixel space. We extract semantic representations of renderings using the CLIP Vision Transformer, then
+ maximize similarity with representations of ground-truth views. In
+ effect, we use prior knowledge about scene semantics learned by
+ single-view 2D image encoders to constrain a 3D representation.
+
+ You can find more details in the authors' paper. The CLIP-based semantic loss structure is illustrated in the following image.
+ <p align="center"><img width="600" alt="CLIP-based semantic consistency loss structure" src="https://user-images.githubusercontent.com/77657524/126386709-a4ce7ff8-2a68-442f-b4ed-26971fb90e51.png"></p>
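To make the loss concrete, here is a minimal JAX sketch of a CLIP-style semantic consistency term. The function and its `encode_image` argument are illustrative placeholders, assuming a CLIP image encoder that returns (batch, dim) embeddings; it is not the exact code used in this repository.

```python
# Minimal sketch of a semantic consistency loss (illustrative, not this repo's exact code).
# `encode_image` is assumed to be any CLIP-style image encoder mapping a batch of
# images (B, H, W, 3) to embeddings (B, D), e.g. a Flax CLIP vision model.
import jax.numpy as jnp

def semantic_consistency_loss(encode_image, rendered_images, target_images):
    """Cosine distance between CLIP embeddings of rendered and ground-truth views."""
    z_r = encode_image(rendered_images)   # (B, D)
    z_t = encode_image(target_images)     # (B, D)
    # L2-normalize so the dot product becomes a cosine similarity.
    z_r = z_r / jnp.linalg.norm(z_r, axis=-1, keepdims=True)
    z_t = z_t / jnp.linalg.norm(z_t, axis=-1, keepdims=True)
    return jnp.mean(1.0 - jnp.sum(z_r * z_t, axis=-1))

# In training, a term like this would be added to the photometric MSE with a
# weighting hyper-parameter, e.g. loss = mse + sc_weight * semantic_consistency_loss(...).
```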
+
+ Our code is implemented in the JAX/Flax framework, which makes it considerably faster than other NeRF codebases. We also distribute ray rendering across multiple GPUs, which shortens training time further. Finally, we build on the Hugging Face Transformers CLIP model library.
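As a rough illustration of distributing ray batches over multiple devices in JAX, the sketch below uses `jax.pmap`. The function, shapes, and batch size are placeholders and do not reproduce the project's actual rendering code.

```python
# Illustrative only: shard a ray batch across local devices with jax.pmap.
# `render_rays_shard` stands in for the real per-device NeRF rendering step.
import jax
import jax.numpy as jnp

@jax.pmap
def render_rays_shard(rays_o, rays_d):
    # Placeholder computation; a real renderer would sample points along each ray,
    # query the NeRF MLP, and composite the colors.
    return jnp.concatenate([rays_o, rays_d], axis=-1)

n_dev = jax.local_device_count()
# Rays are reshaped to (num_devices, per_device_batch, 3) before the pmap call.
rays_o = jnp.zeros((n_dev, 1024, 3))
rays_d = jnp.ones((n_dev, 1024, 3))
out = render_rays_shard(rays_o, rays_d)  # runs in parallel on every local device
```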
+
+ ## 🤟 How to use
```
python -m train \
--data_dir=/PATH/TO/YOUR/SCENE/DATA \ % e.g., nerf_synthetic/lego

```
You can toggle the semantic loss by “use_semantic_loss” in configuration files.
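For illustration, a flag like this could simply gate the extra loss term inside the training objective; the names and weight below are hypothetical and not taken from this repository's configuration system:

```python
# Hypothetical illustration of gating the semantic term with a boolean flag
# (the actual flag handling and names in this repository may differ).
def total_loss(mse_loss, semantic_loss, use_semantic_loss: bool, sc_weight: float = 0.1):
    return mse_loss + (sc_weight * semantic_loss if use_semantic_loss else 0.0)
```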

+ ## 💎 Performance
+
+ ### Performance Tables
+ #### 4-Shot Blender Dataset PSNR Results
+
+ | Scene | Chair | Drums | Ficus | Hotdog | Lego | Materials | Mic | Ship | Mean |
+ |---------|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|
+ | NeRF | 33.00 | 25.01 | 30.13 | 36.18 | 32.54 | 29.62 | 32.91 | 28.65 | 31.01 |
+ | DietNeRF | **34.08** | **25.03** | **30.43** | **36.92** | **33.28** | **29.91** | **34.53** | **29.36** | **31.69** |
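For reference, the PSNR values in the table above are derived from the mean squared error of the rendered images; a generic computation (not this repository's evaluation code) looks like this:

```python
# Generic PSNR for images scaled to [0, 1] (not this repo's exact evaluation code).
import jax.numpy as jnp

def psnr(rendered, target):
    """Peak signal-to-noise ratio in dB; higher is better."""
    mse = jnp.mean((rendered - target) ** 2)
    return -10.0 * jnp.log10(mse)
```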
+
+ #### Loss Graph Comparison between NeRF and DietNeRF on the Drums Scene
+
+ <p align="center"><img width="400" alt="Loss comparison between NeRF and DietNeRF" src="https://user-images.githubusercontent.com/77657524/126384510-423b9070-a3e5-4e18-8b4e-30c15c5b39c6.png">
+ </p>
+
+
+ ### Rendered GIFs by 8-shot learned Diet-NeRF
+
+ DietNeRF has a strong capacity to generalise to novel and challenging views with EXTREMELY FEW TRAINING SAMPLES!
+ The animations below show the performance difference between DietNeRF (left) and NeRF (right) with only 4 training images:
+
+ #### SHIP
+ ![DietNeRF](./assets/ship-dietnerf.gif) ![NeRF](./assets/ship-nerf.gif)
+
+ #### LEGO
+ ![DietNeRF](./assets/ship-dietnerf.gif) ![NeRF](./assets/ship-nerf.gif)
+
+ #### HOTDOG
+ ![DietNeRF](./assets/ship-dietnerf.gif) ![NeRF](./assets/ship-nerf.gif)

+
+ ### Rendered images by 4-shot learned Diet-NeRF vs. Vanilla NeRF
+
+ #### SHIP
+ @ will be filled
+
+ #### LEGO
+ @ will be filled
+
+ #### HOTDOG
+ @ will be filled
+
+ ### Rendered examples by occluded 14-shot learned NeRF and Diet-NeRF
This result is at quite an early stage and is expected to improve.

+ #### Training poses
<img width="1400" src="https://user-images.githubusercontent.com/26036843/126111980-4f332c87-a7f0-42e0-a355-8e77621bbca4.png">

+ #### Rendered novel poses
<img width="800" src="https://user-images.githubusercontent.com/26036843/126113080-a6a48f3d-2629-4efc-a740-fe908ca6b5c3.png">


+ ## 🤩 Demo
+
+ You can try our Streamlit demo Space at the following link!
[https://huggingface.co/spaces/flax-community/DietNerf-Demo](https://huggingface.co/spaces/flax-community/DietNerf-Demo)

+ ## 👨‍👧‍👦 Our Teams
+

| Teams | Members |
|------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Project Managing | [Stella Yang](https://github.com/codestella) To Watch Our Project Progress, Please Check [Our Project Notion](https://www.notion.so/Putting-NeRF-on-a-Diet-e0caecea0c2b40c3996c83205baf870d) |
+ | NeRF Team | [Stella Yang](https://github.com/codestella), [Alex Lau](https://github.com/riven314), [Seunghyun Lee](https://github.com/sseung0703), [Hyunkyu Kim](https://github.com/minus31), [Haswanth Aekula](https://github.com/hassiahk), [JaeYoung Chung](https://github.com/robot0321) |
+ | CLIP Team | [Seunghyun Lee](https://github.com/sseung0703), [Sasikanth Kotti](https://github.com/ksasi), [Khali Sifullah](https://github.com/khalidsaifullaah), [Sunghyun Kim](https://github.com/MrBananaHuman) |
+ | Cloud TPU Team | [Alex Lau](https://github.com/riven314), [Aswin Pyakurel](https://github.com/masapasa), [JaeYoung Chung](https://github.com/robot0321), [Sunghyun Kim](https://github.com/MrBananaHuman) |
+
+ * Sleepless contributors 🤣: [Seunghyun Lee](https://github.com/sseung0703), [Alex Lau](https://github.com/riven314), [Stella Yang](https://github.com/codestella)
+
+
+ ## 🌱 References
This project is based on “JAX-NeRF”.
```
@software{jaxnerf2020github,

}
```

+ ## 🔑 License
+ [Apache License 2.0](https://github.com/codestella/putting-nerf-on-a-diet/blob/main/LICENSE)
+
+ ## ❤️ Special Thanks
+
+ Our project started at the HuggingFace X GoogleAI (JAX) Community Week event:
+ https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104
+
+ Thank you to our mentor Suraj and the organizers of the JAX/Flax Community Week!
+ Our team grew through this community learning experience. It was a wonderful time!
+
+ <p align="center"><img width="250" alt="Screenshot 2021-07-04 4.11.51 PM" src="https://user-images.githubusercontent.com/77657524/126369170-5664076c-ac99-4157-bc53-b91dfb7ed7e1.jpeg"></p>
+
assets/lego-nerf.gif ADDED
assets/ship-dietnerf.gif ADDED
assets/ship-nerf.gif ADDED