Hey there, I’m Pavel, co-founder here at Rigsters. We are a group of passionate people, crazy enough about photogrammetry to try making it into something commercially viable. You might have seen us around on Sketchfab. Apart from 3D scanning things all day every day, we are building smart machines to do it for us. As my co-founder Iulian likes to put it – we work hard to be lazy. You can learn more about us at rigsters.com.
We have been through many stages of photogrammetric fever – scanning dozens of shoes, furniture, trees, rocks, and so on. But 3D scanning for cultural heritage has always remained one of our main interests. Collaborations with museums and individual researchers on digitisation projects have led to exciting discoveries and innovative ways of disseminating the results. We have worked on projects involving ancient Mayan drawings, Viking rune stones, arms and armour artefacts, and more. We even got to scan an entire whale!
Today I want to talk about our endeavours into the world of multispectral imaging. Some time ago, we had the chance to get first-hand insights from a world-leading multispectral camera manufacturer. From revealing forgotten texts in ancient manuscripts to detecting traces of ancient pigments that have faded away over the centuries, this technique lets us discover what is invisible to the human eye. As always, the first thing that came to our mind was – can we scan it?
The story began at the Digital Humanities Lab (aka HUMlab) at Copenhagen University, where information specialist Lars Kjaer is creating a space for innovative collaborations and finding new ways to fuel student curiosity. Shortly after we got together to exchange ideas and knowledge, Lars organised several field trips to local museums to get hands-on experience with photogrammetry. Among other places, the Rigsters crew was invited to Ny Carlsberg Glyptotek – a neighbouring museum with a vast collection of antique art. How could we say no to running around an empty museum and scanning priceless artefacts?
That’s how we got to meet Cecilie Brøns and Signe Buccarella Hedegaard, researchers of ancient polychromy at Ny Carlsberg Glyptotek. During our field trip, we captured several ancient Greek sculptures that were part of their research. It didn’t take long for Cecilie and Signe to recognise the power of 3D and its applications in scientific research. Why spend countless hours making a physical replica or tracing dozens of images when three-dimensional reconstructions offer all the flexibility you need?
Not long after, Iulian and Signe got together for a ‘geeking-out’ session and attempted to 3D scan one of the statuettes using visible-induced infrared luminescence (VIL), a technique used to detect traces of ancient pigments – in this case, Egyptian blue. This pigment was used in ancient Egypt and Mesopotamia for thousands of years and is considered the first synthetic pigment.
The setup for VIL capture included an infrared-sensitive DSLR (a normal DSLR with the internal IR-blocking filter removed and an IR bandpass filter on the lens). The camera was mounted on a tripod because of the very long exposure times – roughly 30 seconds to 1 minute per shot. Due to the nature of the technique, we also had to use specialised light sources that emit only in the visible spectrum, and make sure that no other light sources, natural light included, were present in the room. The resulting images revealed pigment traces, which appeared as luminescing areas on the statuette.
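As a rough illustration of that last step, here is a minimal sketch in Python with NumPy of how luminescing areas could be isolated from a VIL capture by simple thresholding. The function name and threshold value are our own illustrative assumptions, not part of any published workflow:

```python
import numpy as np

def luminescence_mask(vil_image: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Return a boolean mask of pixels that luminesce above `threshold`.

    `vil_image` is assumed to be a single-channel VIL capture; in such an
    image, Egyptian blue shows up as bright areas against a dark background.
    """
    # Normalise in case the input is 8-bit or 16-bit integer data
    img = vil_image.astype(np.float64)
    if img.max() > 1.0:
        img /= img.max()
    return img > threshold

# Tiny synthetic example: a 4x4 "capture" with one bright luminescing patch
capture = np.zeros((4, 4))
capture[1:3, 1:3] = 0.8
mask = luminescence_mask(capture)
print(int(mask.sum()))  # 4 luminescing pixels
```

In practice the threshold would have to be tuned per capture, since exposure time and sensor noise vary from shot to shot.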
For the first time, Ny Carlsberg Glyptotek inventory number IN 895 was validated as an authentic replica of a Greek terracotta figurine from the 3rd century BCE. Without further ado, Iulian snapped an extra hundred or so images for a photogrammetric 3D reconstruction. To our pleasant surprise, the visible-spectrum and VIL image sets aligned with each other perfectly. (Thank you, RealityCapture!) This made it possible to use parts of the data to generate a 3D model that later served as a common base mesh for texturing the individual spectra. By blending the VIL and visible-spectrum texture maps, we were able to approximate the distribution of the pigment and visualise it on the model.
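The blending step can be sketched roughly like this – a hypothetical NumPy snippet (the function name, tint colour, and blend strength are illustrative assumptions, not the exact procedure we used) that overlays a single-channel VIL map onto a visible-spectrum texture:

```python
import numpy as np

def blend_vil_overlay(visible: np.ndarray, vil: np.ndarray,
                      tint=(0.0, 0.4, 1.0), strength: float = 0.8) -> np.ndarray:
    """Blend a single-channel VIL map onto an RGB visible-spectrum texture.

    `visible` has shape (H, W, 3) with values in [0, 1]; `vil` has shape (H, W).
    Luminescing areas are tinted (blue by default, echoing Egyptian blue) with
    per-pixel opacity proportional to the luminescence intensity.
    """
    alpha = np.clip(vil, 0.0, 1.0)[..., None] * strength  # per-pixel opacity
    tint_layer = np.ones_like(visible) * np.asarray(tint)
    return visible * (1.0 - alpha) + tint_layer * alpha

# A grey 2x2 texture with one fully luminescing pixel in the corner
visible = np.full((2, 2, 3), 0.5)
vil = np.zeros((2, 2))
vil[0, 0] = 1.0
textured = blend_vil_overlay(visible, vil)
# textured[0, 0] is strongly tinted; textured[1, 1] keeps the original grey
```

Because both image sets were aligned to the same base mesh, the same blend works in texture space: pixels with no luminescence pass the visible texture through untouched.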
To our knowledge, the 3D model reconstructed from VIL images was the first of its kind. This discovery led to a collaborative research paper that explores the potential of multispectral photogrammetry and illustrates the methods we developed. Shortly after, we were invited to present our research at the Digital Humanities conference here in Copenhagen. For the occasion, we decided to take it a step further and investigate what other spectra can reveal.
Ultraviolet-induced luminescence/UV fluorescence (UVL) revealed traces of another pigment, possibly madder lake, which is known to have been used in ancient polychromy, including on Tanagra figurines similar to our statuette. UVL also highlighted areas around the neck and the hand, which appear to be old repair work. (This might explain the unrealistically long neck of the figurine.)
The infrared-reflected (IRR) technique may have revealed details drawn in carbon black. This technique is primarily used to expose preparatory drawings (underdrawings) beneath paint layers. In this particular instance, an IR-absorbing smear becomes very obvious on the right cheek and indicates repair work, whereas increased IR reflection in the hair revealed a possible decoration.
Visible-induced visible luminescence (VIVL) reveals materials that luminesce in the red part of the visible spectrum. VIVL was the newest of the imaging techniques used, and we have only just begun exploring its potential. Areas painted with (possibly) madder lake pigment lit up especially strongly.
A research paper detailing this project is in the works.
Following the same post-processing procedure, the images of the individual spectra were aligned in a single composition and later re-projected onto the base 3D model. Our endeavour with multispectral imaging resulted in 7 models, each illustrating different spectra and their combinations. In no time, as most of our models do, they found their way to Sketchfab.
At first, we spent some time pondering the best way to visualise the models – should we animate the transition between different textures, present them as individual models, or something else entirely? Shortly after we published the annotated, carousel-like scene seen above, Sketchfab Cultural Heritage lead Thomas Flynn reached out with a brilliant solution – why not use the Sketchfab configurator with a single model and simply swap textures on demand? That’s how we got to play with a preview of the Sketchfab Configurator Studio, which turned out to be quite straightforward and powerful. (You go, Sketchfab!) You can play around with what we’ve made of it here.
There is no doubt about the great potential of the marriage between photogrammetry and multispectral imaging. Although it requires an immense amount of data, extremely long exposures, heavy computation and processing times – not to mention funding – this technology can show us what would otherwise remain invisible to the human eye.