Could you please improve the export?
Hello,
A glb export, with the default camera (the one used in the preview), would be great!
The default view camera is the one used for reconstruction and matches the uploaded image, but you don't always know its focal length and angles, so exporting it would give you a real comparison between the image and the object.
Thanks
Have a great day :)
Hi,
I've just added .glb export to this demo. Please give it a try.
Not sure I understood you correctly about the camera distance. You mean the gradio mesh viewer should use a camera distance such that the object's scale matches between the input image and the visualised mesh?
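For reference, a .glb export like this typically boils down to something small. Here is a minimal sketch using trimesh (not necessarily the demo's exact code; the file names are placeholders):

```python
# Minimal sketch: convert a textured mesh (e.g. OBJ + MTL + PNG) to .glb with trimesh.
# "mesh.obj" / "mesh.glb" are placeholder file names, not the demo's actual paths.
import trimesh

# load() also picks up the material/texture referenced by the OBJ
# when the .mtl and image files sit next to it
scene = trimesh.load("mesh.obj", force="scene")

# GLB is a single binary container, so the texture is embedded in the output file
scene.export("mesh.glb")
```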
Is it possible to export the texture? Thanks.
Also would like to have the texture in the export, thanks!
@tttr65656 and @Pepn, it does export the texture. If you don't see it, it's your importer that doesn't import it (or doesn't show it). You can see that the textures are exported using this online viewer:
Find software that succeeds at importing the texture and export it again in another format. Which software are you using? I'm not seeing textures in Blender.
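If you want to double-check a downloaded .glb without an online viewer, here is a minimal sketch, assuming trimesh is installed ("mesh.glb" is a placeholder name), that reports whether each geometry carries a texture:

```python
import trimesh

scene = trimesh.load("mesh.glb", force="scene")

for name, geometry in scene.geometry.items():
    visual = geometry.visual
    # TextureVisuals means UV coordinates plus a material/image are attached
    textured = isinstance(visual, trimesh.visual.TextureVisuals)
    print(name, "textured:", textured)
    if textured and visual.material is not None:
        # GLB materials usually load as a PBR material carrying a base color image
        print("  base color texture:", getattr(visual.material, "baseColorTexture", None))
```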
I get the texture in Blender, many thanks!
Thank you for the glb export!
Sorry I didn't see your reply, @dmitriitochilkin.
Currently, building a mesh from an image implies a camera distance, three angles of a virtual camera (azimuth, elevation and roll, also called the Dutch angle; let's forget about the last one for now) and a focal length (more or less perspective).
For example, the iso house image is a 2.5D isometric image, 45° azimuth and 45° elevation with an infinite focal length (orthographic camera, no perspective), but real-life shots have trickier angles and a finite focal length, like the burger: roughly 0°/15°, 35 mm, from a 50 cm distance.
The 3D viewer "sees" the mesh using this virtual camera information. For example:
Like the image, the iso-house in the viewer is seen from 45°/45°.
In the burger image, TripoSR picks up the plane of the table and the plate, since the mesh is clearly aligned with it (the burger's isolines are parallel to the plate), and we see the resulting mesh in the viewer through a default "camera" (view) that matches the picture (angles, distance and perspective). So the camera information exists somewhere.
The glb should export this "default" camera, used for reconstruction and by the viewer, so that you can open the mesh and match it with the image again; that way you can really compare the 3D textured object with the 2D image you used.
Hope it makes more sense :)
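To make that concrete, here is a rough sketch (not TripoSR's actual code) of how those view numbers (azimuth, elevation, distance, focal length) turn into what a glTF camera node needs: a camera-to-world matrix plus a vertical field of view. The conventions here (+Z up, azimuth measured around Z, camera looking at the origin, 35mm-equivalent focal length) are assumptions for illustration; glTF itself uses +Y up, so a real exporter would also convert axes.

```python
import numpy as np

def view_to_gltf_camera(azimuth_deg, elevation_deg, distance, focal_mm=35.0):
    az, el = np.radians([azimuth_deg, elevation_deg])

    # camera position on a sphere of radius `distance` around the object (origin)
    eye = distance * np.array([np.cos(el) * np.cos(az),
                               np.cos(el) * np.sin(az),
                               np.sin(el)])

    # glTF cameras look down their local -Z axis with +Y up
    forward = -eye / np.linalg.norm(eye)          # points from the camera to the object
    right = np.cross(forward, [0.0, 0.0, 1.0])    # assumes +Z is world up (degenerate at ±90° elevation)
    right /= np.linalg.norm(right)
    up = np.cross(right, forward)

    cam_to_world = np.eye(4)
    cam_to_world[:3, 0] = right
    cam_to_world[:3, 1] = up
    cam_to_world[:3, 2] = -forward                # camera +Z points away from the object
    cam_to_world[:3, 3] = eye

    # vertical field of view for a full-frame sensor (24 mm high) at this focal length
    yfov = 2.0 * np.arctan(12.0 / focal_mm)
    return cam_to_world, yfov

# e.g. the burger shot described above: roughly 0°/15°, 35 mm, 0.5 m away
matrix, yfov = view_to_gltf_camera(0, 15, 0.5, 35)
print(np.round(matrix, 3))
print("vertical fov (deg):", round(np.degrees(yfov), 1))
```

With a matrix and field of view like these, whatever glTF library writes the export could attach a camera node next to the mesh, so opening the .glb reproduces the framing of the input image.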
If anyone has issues running TripoSR, you can also run it locally by installing Pinokio and then installing TripoSR from within Pinokio.
You can also try Stable Fast 3D or Unique3D instead; they generate better results.