It’s easy to share 3D and VR using Sketchfab, and more and more museums and cultural organisations are doing just that, including the UK Government Art Collection (GAC).
Let’s explore how we captured the temporary exhibition Reframing the Past and shared it online – complete with an audio tour from curator Dr Laura Popoviciu and ready for VR. Check it out for yourself below:
While very much an experiment for the GAC and myself, this project has been very useful in figuring out some of the technical challenges of capturing large spaces and re-creating them ready to share online. I’ll detail some of the process below, but I also recommend that you read Laura’s post on the GAC blog, which details why she is interested in the digital documentation of exhibitions and subsequent sharing in 3D and VR on Sketchfab.
Our main aims were to:
- capture a specific place and time in 3D
- create a new and engaging reflection of that space for an online audience
Capturing the Gallery Space
We knew that we wanted to document the exhibition as it was in the gallery space, so our first challenge was to capture the space itself. We were working with no budget, so the simplest way to do this was good ol’ photogrammetry. We used a couple of different cameras, running the images through PhotoScan and Reality Capture to see what results we could get. After a couple of false starts, we ended up with an image set of 800 photos from a Canon G7x compact, shooting on full auto.
We ran the set through Reality Capture with good results:
Optimising the Space for Sketchfab & VR
After successfully capturing the space, it was time to make the 3D model ready for the web. Two things were notable about the processed scan:
- the poly count was very high (especially considering the relatively simple shape of the room)
- even with the texture for the whole space exported at 8k, the resolution of individual artworks was insufficient.
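To see why a single 8k texture falls short, consider texel density: those pixels are spread across every surface in the room, so each artwork only gets a small slice of them. A rough back-of-envelope calculation (the wall length and artwork size here are illustrative assumptions, not measurements of the actual gallery):

```python
# Rough texel-density estimate for one 8k texture covering a whole room.
# All dimensions below are illustrative assumptions, not gallery measurements.

TEXTURE_PX = 8192  # edge length of an 8k texture, in pixels

# Suppose the scan's UV layout spreads the texture over roughly 30 m of
# unwrapped wall length (four walls of a modest gallery space).
unwrapped_wall_m = 30.0
px_per_metre = TEXTURE_PX / unwrapped_wall_m  # ~273 px per metre of wall

# A 0.6 m wide framed print would then be represented by only:
artwork_width_m = 0.6
artwork_px = px_per_metre * artwork_width_m  # ~164 px across

print(f"{px_per_metre:.0f} px/m -> {artwork_px:.0f} px across the artwork")
```

A couple of hundred pixels across an artwork is far below what even a modest photograph of that artwork provides, which is why the individual works needed their own image sources.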
The first point was easily handled by loading the scan mesh into Blender and simply rebuilding the space from cubes and extruded planes – this included the room partitions and furniture that were not part of the exhibition but were still part of the space. To keep the file size especially low, I omitted modelling fine details like lamps, plug sockets, window frames etc., so the final mesh was a couple of hundred faces instead of a few thousand.
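The rebuilt geometry really is that simple. As a sketch (with hypothetical room dimensions), here is how compact a basic room shell is when written out as OBJ, the kind of low-poly stand-in you can hand back to a photogrammetry package; partitions and furniture only add a handful of quads each on top of this:

```python
# Minimal sketch: a rebuilt gallery "shell" as a single box, written to OBJ.
# The 10 x 6 x 3.5 m dimensions are hypothetical, purely for illustration.

def box(w, d, h):
    """Return (vertices, quad faces) for an axis-aligned box."""
    verts = [(x, y, z) for z in (0, h) for y in (0, d) for x in (0, w)]
    # 0-based indices into verts; each tuple is one quad face.
    faces = [
        (0, 1, 3, 2), (4, 6, 7, 5),  # floor, ceiling
        (0, 4, 5, 1), (2, 3, 7, 6),  # front, back walls
        (0, 2, 6, 4), (1, 5, 7, 3),  # side walls
    ]
    return verts, faces

def to_obj(verts, faces):
    lines = [f"v {x} {y} {z}" for x, y, z in verts]
    # OBJ face indices are 1-based.
    lines += ["f " + " ".join(str(i + 1) for i in f) for f in faces]
    return "\n".join(lines)

verts, faces = box(10.0, 6.0, 3.5)
obj = to_obj(verts, faces)
print(f"{len(verts)} vertices, {len(faces)} faces")  # 8 vertices, 6 faces
```

Eight vertices and six quads for the whole shell – compare that with the tens of thousands of triangles a raw photogrammetry mesh produces for the same flat walls.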
Assuming that you don’t deviate from the position of the imported mesh as you work, this simplified mesh can be exported back to Reality Capture and textured:
When it came to individual artworks, it was necessary to source higher resolution image files and import them as individual planes. In the case of Roger Ackling’s sculptural Isle of Wight, we had to process another image set into 3D then optimize and import it separately. Frames for all the flat artworks were modelled from scratch too.
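For the flat artworks, importing a high-resolution image as a plane boils down to making a quad whose proportions match the source image. A minimal sketch of that sizing step (the pixel dimensions and real-world width below are made up for illustration, not the actual GAC image data):

```python
# Sketch: size a textured plane for a flat artwork from its image dimensions.
# The pixel sizes and real-world width are illustrative values only.

def artwork_plane(img_w_px, img_h_px, real_width_m):
    """Return the four corner vertices of a wall-hung plane that
    preserves the source image's aspect ratio."""
    real_height_m = real_width_m * (img_h_px / img_w_px)
    w, h = real_width_m, real_height_m
    # Plane in the XZ plane (flat against a wall), centred on the origin.
    return [(-w / 2, 0.0, -h / 2), (w / 2, 0.0, -h / 2),
            (w / 2, 0.0, h / 2), (-w / 2, 0.0, h / 2)]

# A hypothetical 4000 x 3000 px scan of a work known to be 0.8 m wide:
corners = artwork_plane(4000, 3000, 0.8)  # height works out to 0.6 m
```

Each plane then carries its own full-resolution texture, so zooming in on one artwork no longer depends on the single shared room texture.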
The total face count for the finished scene is less than 1,000, and it weighs in at less than 5 MB when uploaded as a .blend file.
The final touches were to add annotations to each artwork, add an audio description, and set the scene up for VR.
The entire process, from initial photography to uploading to Sketchfab, took just under two weeks and I’m sure it would be possible to create something like this in under a day given some initial preparation. Imagine being able to experience other works from the Government Art Collection in the context of grand rooms at British embassies around the world – it’s definitely possible!
Thanks so much to Dr Laura Popoviciu for the opportunity to work on this experiment and for providing access and content to make it possible.
If you like what you see or have any feedback on this work please leave a note in the comments!