Greetings. My name is Lassi Kaukonen and I come from Turku, the former capital city of Finland. I am an old-school gamer from the early '80s and my training in 3D has been self-taught. I have always been a very tech-savvy person and I have a pretty good knowledge of photography. I work in the Hospital District of Southwest Finland as a system specialist, and currently I'm working on a project that aims to ease people's navigation within the hospital and its surroundings via photogrammetry, among other things. My passion for photogrammetry started about three years ago when I managed to scan things from my backyard and place them in a virtual environment that I had created in Unity. My desire to make better models grew stronger last year when I had the chance to create a virtual environment for psychiatric healthcare. The goal was to make a VR app to help people with performance anxiety.
I stumbled upon this specific live orchid flower at IKEA. I was waiting for my wife and noticed a bunch of orchids on a stand. I remembered the countless hours I had spent in my basement last summer and autumn trying to make models from leaves, and decided to scan one of these beautiful, well-patterned flowers.
My inspiration for this model came from the thought that I could learn something by scanning it. I could compare the results between those leaves I scanned back in summer and maybe find new pointers or methods that I could use when scanning something challenging.
I do most of my scans indoors with a tripod and the subject on a turntable, because that way you can easily control things like light and background. This orchid flower was shot on a turntable, but not in my usual place, because my usual shooting space was under renovation.
I took the photos in front of a cabinet with a Nikon D5300 + 40mm Micro Nikkor at ISO 100 and f/22, with a Kenko circular polarizing filter and a remote shutter release. The remote is a very handy tool to have; I noticed that without it the camera tends to shake or move even on a tripod.
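Stopping down to f/22 is mostly about depth of field: at close focus distances the zone of acceptable sharpness is only a few centimeters deep. As a rough sanity check (not my actual calculation at the time), here is a small thin-lens depth-of-field sketch; the 0.3 m focus distance and the 0.019 mm circle of confusion for an APS-C sensor are assumed values, and at true macro magnifications these formulas are only approximate.

```python
def depth_of_field(f_mm, n, s_mm, coc_mm=0.019):
    """Near/far limits of acceptable sharpness (thin-lens approximation).

    f_mm   -- focal length in mm
    n      -- f-number (aperture)
    s_mm   -- focus distance in mm
    coc_mm -- circle of confusion; 0.019 mm is a common APS-C value
    """
    h = f_mm ** 2 / (n * coc_mm) + f_mm            # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm)           # valid while s < h
    return near, far

near, far = depth_of_field(f_mm=40, n=22, s_mm=300)
print(f"{near:.0f}-{far:.0f} mm -> about {far - near:.0f} mm of sharp depth")
```

Even at f/22 the sharp zone at 30 cm is only around 4 cm deep, which is why a thin petal needs to be shot as head-on as possible.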
For light, I used an F&V R-300 ring light that I polarized with a linear polarizing film placed over the light source. This method of using two polarizing elements (one for the light and one for the lens) is called cross-polarization, and with it you can eliminate specular highlights and leave only the diffuse component. I learned about this technique from this YouTube channel. This gentleman has absolutely amazing photogrammetry tutorials with great entertainment value.
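The physics behind cross-polarization is simple: specular reflections preserve the polarization of the light, while diffuse reflections scramble it, so an analyzer on the lens turned 90° to the light's polarizer blocks the specular component and passes part of the diffuse one. The idealized behavior is Malus's law, sketched below (real polarizing films are not perfect, so some specular leakage always remains):

```python
import math

def transmitted_intensity(i0, theta_deg):
    """Malus's law: fraction of polarized light passed by an analyzer
    at angle theta relative to the light's polarization axis."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(transmitted_intensity(1.0, 0))   # parallel polarizers: everything passes
print(transmitted_intensity(1.0, 90))  # crossed polarizers: specular component blocked
```

In practice you rotate the circular polarizer on the lens while watching the live view until the highlights on the subject disappear.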
I had learned from scanning leaves that I would need to take more pictures than usual (with an easy subject I take about 40 pictures per round; with thin subjects like leaves, coins, and petals I take about 80–120). The next thing was to find a shooting angle from which I could shoot the largest part of the thinnest area head-on and capture the whole subject in one round. I have never managed to align more than one set of pictures when dealing with leaves, but with coins, for example, you can align two or three different sets pretty easily. Lastly, I would need to remove the specular highlights for the alignment to be successful, and this I could achieve with cross-polarization.
The total number of pictures I took of the orchid was 101, plus one of the background alone. That works out to roughly 3.5-degree intervals, but this is a rough estimate because the turntable was rotated by hand with no markers.
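The interval is just the full circle divided by the number of shots in the round, assuming roughly even hand-rotation:

```python
photos = 101
step = 360 / photos                     # nominal rotation between shots
print(f"about {step:.2f} degrees per shot")
```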
In photogrammetry it’s useful to have a powerful CPU and GPU, and as I have a huge passion for games, this orchid was created on a self-built rig with a Ryzen 9 3900X, an RTX 2080 Ti, and 32 GB of DDR4-3000 memory.
The most important thing when processing a scan done with the turntable method is masking. I had to mask this orchid twice, with different techniques, because I took the photos in an unusual space. The cabinet behind the subject wasn’t an ideal background, so my mask (which I created from the background) was either taking part of the background with it or leaving part of the model out. You can also mask in Metashape with the tie-point method (where you mask the background and then align the photos), but I did not want to use it because I knew I would probably have to align the photos on the highest accuracy setting, and the tie-point method would take ages with that.
Now I had to choose between taking the pictures again with a green screen or creating proper masks some other way. So here is what I did: I created masks from the background, aligned the photos with a little background still attached to them, cleaned up the background tie points a bit, and built a medium-quality dense cloud from it. Then I quickly cleaned the background out of the dense cloud and created a mesh from it. Once I had this simplified mesh ready, I created masks again, this time based on the model. Now I had a pretty good mask for the model, and I ran the alignment, dense cloud, and mesh creation again.
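The idea behind "masks from background" is difference keying: each photo is compared against a clean shot of the empty background, and only pixels that differ enough are kept. Metashape's implementation is more sophisticated than this, but the concept can be illustrated with a toy grayscale example (plain 2D lists standing in for images; the threshold value is arbitrary):

```python
def mask_from_background(photo, background, threshold=30):
    """Toy difference-key mask: 1 where the photo differs from the clean
    background shot by more than the threshold, 0 elsewhere."""
    return [
        [1 if abs(p - b) > threshold else 0
         for p, b in zip(photo_row, bg_row)]
        for photo_row, bg_row in zip(photo, background)
    ]

background = [[200, 200, 200],
              [200, 200, 200]]
photo      = [[200,  90, 200],   # dark subject pixels in the middle column
              [200,  85, 198]]

print(mask_from_background(photo, background))
# -> [[0, 1, 0], [0, 1, 0]]
```

This also shows why my cabinet background failed: where the background's brightness is close to the subject's, the difference falls under any workable threshold, and the mask either eats the model or keeps the background.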
The mesh came out pretty OK, but there were a couple of spots where Metashape was unable to build the mesh, so something had to be done about that. There was also the stick at the back that kept the flower standing in the vase. Metashape’s own “fill holes” tool wasn’t working too well on this one, so I decided to take it into Blender.
Quick Trip to Blender 2.8
I took my mesh into Blender to remove the stick from the back and to fill a couple of holes where Metashape had failed to build geometry. After this, I exported the model and took it back into Metashape for texturing.
Texturing in Metashape
There are a few different ways I typically texture in Metashape, but with this one, it was very straightforward. I had pretty controlled conditions apart from a bad background, and all the specular highlights had been removed, so I textured the orchid with the mosaic setting and with all images in the photoset.
I wanted to post a downloadable high-poly model of an orchid so I didn’t need to bake any additional maps for this one.
New Thing I Learned
I learned a couple of new things from this project, but the most important is that a distinctive pattern on a thin, bendy surface can probably help with reconstruction in situations where the alignment seems fine and the tie-point cloud looks very good, but the dense cloud contains noise that prevents the mesh from being built properly. (I had a lot of these problems with the maple leaves I tried to scan back in those darkening autumn nights.)
(And if anyone reading this knows a solution or how to prevent this, feel free to comment or send me a message. I would really appreciate it.)
This red one is the best one I got. I photographed this specific leaf roughly ten times with different angles and setups, and this was the best result after giving it a good ride in Blender.
I used the 45° FOV, and as the environment I used “Road in Dordogne”. I think it has a pretty good look for this scene, with blue sky above and a brown/greenish tint at the bottom. The lights are the standard three-point light preset, and I used a shadow catcher instead of baked AO ground shadows because I think the shadow catcher brought more balance to the scene. The post-processing filters I used were Depth of Field, Sharpness (0.33), and Vignette (amount 0.39, hardness 0.68).
Overall I think Sketchfab is a superb tool to present or demonstrate anything in 3D, and I hope that some parts of my work with the hospital navigation project will be on Sketchfab at some point.
Big thanks to Abby for contacting me and offering me this chance for a write-up.
And if you have questions about this model, or an interesting project or job proposal that involves photogrammetry, please feel free to contact me via Sketchfab!
Thanks for reading!