I used the same video in Luma as for the PG model (PG stands for photogrammetry).
I simply downloaded the result of this Luma beta NeRF service and cleaned up the stuff floating around. I am impressed by its ability to capture a dark object with strong sunlight in the background, even if it involves a lot of approximation. In comparison, it took 854 pictures extracted from the same video to get my PG software to produce a decent 3D model, with a big rework in some areas due to the difficulty of reconstructing the dark trousers against the sun coming through all the windows in the background.
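For anyone wondering how the hundreds of input pictures come out of a single video, here is a minimal sketch of the kind of frame extraction involved. It assumes OpenCV and hypothetical file names; it is an illustration, not the exact workflow or settings I used.

```python
# Frame-extraction sketch (hypothetical paths; assumes OpenCV is installed).
import os
import cv2

video_path = "capture.mp4"   # hypothetical input video
output_dir = "frames"        # hypothetical output folder for the PG software
step = 5                     # keep every 5th frame to control the image count

os.makedirs(output_dir, exist_ok=True)

cap = cv2.VideoCapture(video_path)
index = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video
    if index % step == 0:
        cv2.imwrite(os.path.join(output_dir, f"frame_{saved:05d}.jpg"), frame)
        saved += 1
    index += 1
cap.release()
print(f"Saved {saved} frames to {output_dir}")
```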
Of course, for a real job I would put myself in ideal conditions to make a good model with PG, but this exercise was to see what more NeRF can bring to 3D capture. When trying to merge both models, I found that the approximation in the current 3D model produced by Luma is still very far from the PG model's quality (where the PG works), and I was not able to simply use the Luma model to patch the quality holes in my PG model.
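To give an idea of what "merging" the two models involves, here is a rough alignment sketch using Open3D and ICP, with hypothetical file names for the Luma and PG exports. It assumes the two meshes already share roughly the same scale and orientation; in practice the exports have different origins and scales, so a manual rough alignment comes first, and the patching itself still has to happen in a 3D editor.

```python
# Mesh-alignment sketch (hypothetical file names; assumes Open3D and
# roughly pre-aligned models with matching scale).
import open3d as o3d

luma_mesh = o3d.io.read_triangle_mesh("luma_export.ply")  # hypothetical Luma export
pg_mesh = o3d.io.read_triangle_mesh("pg_model.ply")       # hypothetical PG export

# ICP works on point clouds, so sample points from both meshes.
luma_pts = luma_mesh.sample_points_uniformly(number_of_points=50_000)
pg_pts = pg_mesh.sample_points_uniformly(number_of_points=50_000)

# Refine the alignment of the Luma model onto the PG model.
result = o3d.pipelines.registration.registration_icp(
    luma_pts, pg_pts,
    max_correspondence_distance=0.05,  # tune to the scene scale
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print("ICP fitness:", result.fitness)

# Apply the resulting transform and save the aligned Luma mesh so it can be
# compared against the PG model, or used to patch its holes in a 3D editor.
luma_mesh.transform(result.transformation)
o3d.io.write_triangle_mesh("luma_aligned.ply", luma_mesh)
```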