Tom Ryley, Communications and Digital Officer at the Old Royal Naval College in London, tells us how his team is using 3D technology to show new discoveries in new ways.
A Serendipitous Discovery
Today a tourist landmark, the Old Royal Naval College has a complex history: the remains of a Tudor palace lie beneath the elegant baroque buildings, designed by Sir Christopher Wren and Nicholas Hawksmoor, that are visible today (history buffs can read the full story of our site here).
The site is currently midway through the Painted Hall Project, an ambitious £8.5 million scheme to conserve the vast murals of the Painted Hall. Whilst working on a new visitor centre beneath the Painted Hall, workmen discovered two rooms from Greenwich Palace, the birthplace and principal residence of Henry VIII, hidden beneath a section of floor.
The find is extraordinary. Glazed Flemish floor tiles still shine, and archaeologists believe the niches in the subterranean room to be ‘bee boles’, for storing wicker bee ‘skeps’ (the precursors of wooden hives) during the hibernation of the bee colonies in the winter months. At the time, beekeeping was an important industry for its production of beeswax for candles, and honey for brewing mead.
Beekeeping was an industry largely practised by friaries and monasteries, and the newly-discovered rooms may form part of the palace complex’s Friary, where Elizabeth I was baptised. The remains could also be part of the service range of the palace. In any case, this is the first discovery in this area of the Tudor site; previous archaeological excavations have largely been confined to the royal wing along the riverfront, excavated during the 1970s.
Whilst we were excited to engage the public with the find, doing so was quite difficult. The space remains closed to the public until 2019 whilst work continues, and even then, though we hope to acquire funding to display the remains within the new visitor centre, it is possible that the discovery will have to be re-covered. So we needed a way to relate what we’d found to the public, as well as something that could serve as a record should we be unable to raise the funds to preserve the discovery.
Sketchfab was therefore a great solution for achieving this. The photogrammetric scan allows the public to interact with and explore the discovery digitally. It also suited the Communications team’s tiny budget and is easy to do. Try it for yourself with this version of the scanned scene optimised for virtual and augmented reality:
My favourite part of the find is the interaction between the remains and the later buildings above. The walls of the remains lie directly beneath Hawksmoor’s columns, serving as ready-made foundations. Details such as this made it important to scan the surroundings of the Tudor rooms, illustrating this interplay between old and new. Capturing the surroundings also acts as a record of the space at the moment of discovery.
Beyond this find, we hope that the model will be the first of many scans for the Old Royal Naval College, providing a new way of encountering the rich architecture, carving and art work that can be found across our site.
Digitisation & Display
Thomas Flynn from Sketchfab provides a quick look at how the site was digitised.
With no budget and only limited opportunities available to capture the site, we decided to use the simplest setup possible – a single camera and whatever lighting was to hand onsite.
In the end, we simply used a handheld Canon G7X set to full auto. Not only is the camera surprisingly good in low light, its quick autofocus allows for fast shooting – essential when capturing large volumes of images.
After an hour and a half on-site, we had 996 digital images ready to process.
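As a back-of-the-envelope check of that shooting pace, the figures above work out to roughly one photo every five seconds – a quick sketch (the numbers are taken directly from the text):

```python
# Shoot statistics from the session described above:
# 996 photos captured in roughly 90 minutes on site.
images = 996
minutes_on_site = 90

seconds_per_photo = minutes_on_site * 60 / images
print(round(seconds_per_photo, 1))  # ≈ 5.4 seconds per photo
```

That sustained pace is why the quick autofocus mentioned earlier matters so much for this kind of capture.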
We ran the images through RealityCapture to align the image set, reconstruct 3D geometry and texture the scene. In the GIF below you can see the camera positions in the 3D scene as heavier white dots in and around the tie point cloud.
The software also lets you define a reconstruction region (the large wireframe cube) which effectively allows you to crop out the parts of the scene that you do not wish to compute into 3D.
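Conceptually, a reconstruction region behaves like an axis-aligned bounding box: only points that fall inside it are carried forward into the 3D computation. The sketch below is purely illustrative of that idea – the function and point values are invented for the example and are not RealityCapture’s actual API:

```python
# Illustrative sketch: cropping a tie point cloud with a bounding box,
# the same idea as RealityCapture's reconstruction region (hypothetical code,
# not the software's real interface).

def inside_region(point, box_min, box_max):
    """Return True if a 3D point lies within the reconstruction box."""
    return all(lo <= c <= hi for c, lo, hi in zip(point, box_min, box_max))

box_min = (0.0, 0.0, 0.0)   # one corner of the wireframe cube
box_max = (10.0, 10.0, 3.0)  # the opposite corner

tie_points = [(5.0, 5.0, 1.0), (12.0, 1.0, 1.0)]
kept = [p for p in tie_points if inside_region(p, box_min, box_max)]
print(kept)  # only the first point survives the crop
```

Everything outside the box – scaffolding, neighbouring rooms, stray background geometry – is simply never reconstructed, which saves both processing time and cleanup work later.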
The model we exported from RealityCapture was a record of the entire scene we photographed on-site – complete with safety barriers and exposed carpentry. Now that we had the space digitised, however, we were able to edit the scene and use this base model to create some derivative versions for different uses.
First, we wanted to remove all of the extra objects in the scene to focus on the Tudor floor and bee boles – a straightforward task over an hour or so in Blender, deleting geometry and setting some boolean modifiers:
Secondly, we wanted to create a version of the scene that was optimised for viewing in different ways – namely virtual and augmented reality using the Sketchfab web viewer or app. We can help ensure a smooth WebVR or AR experience by reducing face counts on our meshes and pushing some of the detail information to texture maps.
While the base model – even with all the extra objects removed – is some 1 million faces, we were aiming at somewhere around 20 – 30 thousand faces (or a reduction of 97 – 98%) for the model. Luckily there is a great free tool to help with this kind of thing: Simplygon.
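The target face counts quoted above can be sanity-checked with a couple of lines of arithmetic (using the figures from the text; the 25k midpoint is just an example):

```python
source_faces = 1_000_000           # base model after cleanup, per the text
for target in (20_000, 30_000):    # the stated 20-30 thousand face target range
    reduction_pct = (1 - target / source_faces) * 100
    print(f"{target} faces -> {reduction_pct:.0f}% reduction")
```

A 97-98% reduction sounds drastic, but because most of the detail is pushed into texture maps (see below), the visual loss is far smaller than the numbers suggest.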
There are lots of options to tweak in the software, but the gist is: you input a high-resolution mesh, hit ‘process’, and some clever algorithms reduce the poly count of your model to your specification, maintaining detail where it counts:
We then imported the model back into Blender to UV unwrap it, re-saved it, and then used another free tool, xNormal, to bake details (normal and ambient occlusion maps) from the high-res mesh onto the new, optimised version.
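To illustrate what a baked normal map actually stores: each pixel encodes the hi-res surface normal at that point, with the components remapped from the range [-1, 1] into 8-bit RGB values. This is the standard tangent-space encoding, not anything specific to xNormal’s internals:

```python
def encode_normal(n):
    """Map a unit normal's (x, y, z) components from [-1, 1] into 8-bit RGB."""
    return tuple(round((c + 1) / 2 * 255) for c in n)

# A normal pointing straight out of the surface (tangent-space +Z):
print(encode_normal((0.0, 0.0, 1.0)))  # (128, 128, 255) - the familiar normal-map blue
```

This is why flat areas of a tangent-space normal map look uniformly blue: the low-poly mesh recovers the hi-res shading detail at render time by reading these encoded normals instead of carrying the geometry itself.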
Finally, if you can’t get your hands on a compatible device, here’s how it looks in action in VR…
We hope you enjoy exploring this amazing discovery yourself on the Old Royal Naval College Sketchfab profile.