
API Spotlight: King's Chamber Prototype



How can 3D publishing programs work hand-in-hand to augment traditional 2D print publications?

Can one publish a descriptive volume in a modular format that is visually cohesive and functional, and that doesn’t require all of the component parts to be there first?

We are a small working group of three colleagues – Owen Murray, Alexis Pantos & Ariel Singer – endeavoring to answer these questions through a series of ongoing conversations and collaboration. Owen & Ariel are members of the Epigraphic Survey of the Institute for the Study of Ancient Cultures at the University of Chicago, also known as Chicago House. Alexis works for the Digital Documentation team at the Museum of Cultural History at the University of Oslo, Norway.

Our most recent efforts have resulted in a digital publication prototype called The King’s Chamber: an open-source, web-based experimental viewer that provides a platform for presenting contextual geospatial relationships in 3D and examining their surface details in 2D. The King’s Chamber is one of six rooms in the Inner Sanctuaries of the 18th Dynasty (Small Amun) Temple at Medinet Habu on the west bank of Luxor, Egypt.

The King’s Chamber Prototype integrates current best-practice documentation techniques derived from hybrid photogrammetry and laser-scan data sets with archival imagery from the Epigraphic Survey’s photo archive and large-format collection, and highlights the epigraphic illustrations and text from OIP 136, Medinet Habu IX. The Eighteenth Dynasty Temple, Part I: The Inner Sanctuaries With Translations of Texts, Commentary, and Glossary.


Chicago House is a long-standing project under the umbrella of the Institute for the Study of Ancient Cultures (formerly the Oriental Institute) at the University of Chicago, working in conjunction with the Egyptian Ministry of Tourism & Antiquities (MoTA). Its primary objective is to produce photographs and precise epigraphic line drawings of the inscriptions and relief scenes on major temples and tombs in Luxor for publication. Epigraphy is derived from the Classical Greek epigraphein (“to write upon, incise”) and epigraphē (“inscription”) and is the study of written material inscribed upon hard and/or durable materials such as stone or metal. In the case of Luxor (ancient Thebes), 4,000+ years of Egyptian history are etched into its temples, tombs and standing cultural heritage sites.

The Chicago House method is rooted in an epigraphic tradition in which attention to detail and the expertise of artists, photographers and Egyptologists work in concert to form the foundation of the documentation process. The result is precise, accurate, information-rich facsimile drawings that would be unobtainable if any of the constituent parties were removed. This methodology places greater emphasis on the process, skills and knowledge of those involved than on tools and techniques, which can be, and have been, adapted as needed. When the project was first conceived in 1924, the goal was to provide a published volume that could stand in place of the original (or a section thereof), making these ancient records widely available for scholarly research and preserving them for the future. From the outset, the publication of such results was envisioned as a series of large folio print publications presenting the facsimile drawings alongside other relevant information.

Chicago House - large folio sample

Developments in computing, especially the internet and 3D technologies, have not only revolutionized documentation practices but continue to open up new publication mediums that would have been unimaginable a few decades ago. Drawings once done on paper with ink can now be done on a tablet with a stylus, to the same effect.

Chicago House now regularly employs 3D data from photogrammetry in its digital epigraphy and documentation process. Exploring the fusion of 2D and 3D data to meet the needs and standards of traditional publications while working in tandem with them is an active area of interest, and one of the core questions driving the King’s Chamber Prototype. We’re not interested in replacing print publications; we view them as a critical component of past, present and future documentation programs. The question for us is: what can 3D publications provide that 2D publications cannot?


The King’s Chamber Prototype marries 2D imagery and 3D models with available linked data. The interface is built with Bootstrap 5.0, with 3D interactions derived from Sketchfab’s API and 2D IIIF interactions based on a number of excellent open source projects.

The prototype consists of three core components: a scene selector tab, a IIIF scene viewer with inscription and archive tabs, and a text and commentary section.

The scene selector tab contains a 3D model (Sketchfab embed) allowing the user to navigate content by exploring the model at will, or sequentially via annotated nodes. A menu bar with a more traditional table of contents is located directly below the embedded model. Each annotated node in the model is linked to a section in the menu bar, and loads high resolution photographs, facsimile illustrations and archival imagery in the relevant tab in the scene viewer. It also loads applicable text and commentary, including inline footnotes, in a section below the scene viewer.
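The node-to-section wiring described above can be sketched with the Sketchfab Viewer API. This is a minimal illustration, not the prototype’s actual code: the model UID, the section ids, and the `loadScene()` callback are hypothetical placeholders.

```javascript
// Map an annotated node's index to the menu-bar section it links to.
function sectionForAnnotation(index, sections) {
  return index >= 0 && index < sections.length ? sections[index] : null;
}

// Wire a Sketchfab embed so that clicking an annotated node loads the
// matching scene content (photos, facsimiles, text and commentary).
function initSceneSelector(iframeEl, sections, loadScene) {
  const client = new Sketchfab(iframeEl); // global from the Viewer API script
  client.init('MODEL_UID', {
    success: function (api) {
      api.start();
      // Fired when the user selects an annotation in the 3D model.
      api.addEventListener('annotationSelect', function (index) {
        const section = sectionForAnnotation(index, sections);
        if (section) loadScene(section);
      });
    },
    error: function () {
      console.error('Sketchfab viewer failed to initialize');
    },
  });
}
```

The same `sectionForAnnotation` mapping can also drive the table-of-contents menu bar, so that clicking a menu entry and clicking a node converge on one loading path.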

One major feature of the prototype is that it allows layers of high resolution visual information to be seen side-by-side, as well as overlaid, one on top of the other. Recent photogrammetric documentation of a scene can be seen side-by-side with archival documentation, on top of which epigraphic illustrations are overlaid and can be viewed at any desired opacity, all seamlessly.
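The overlay behaviour can be sketched with OpenSeadragon, which the prototype builds on. The element id, the two tile sources, and the 0–100 slider convention below are assumptions for illustration:

```javascript
// Stack two image layers in one OpenSeadragon viewer: a recent photograph
// underneath and a facsimile drawing on top.
function makeLayeredViewer(OpenSeadragon, photoSource, drawingSource) {
  return OpenSeadragon({
    id: 'scene-viewer', // hypothetical container element id
    tileSources: [photoSource, drawingSource],
  });
}

// Map a 0-100 slider value to an opacity clamped to [0, 1].
function sliderToOpacity(value) {
  return Math.min(Math.max(value / 100, 0), 1);
}

// Apply the slider value to the overlaid facsimile layer (index 1).
function onOpacitySlider(viewer, value) {
  const overlay = viewer.world.getItemAt(1);
  if (overlay) overlay.setOpacity(sliderToOpacity(value));
}
```

Setting the overlay’s opacity to 0 yields the photograph alone, 1 the drawing alone, and anything between a blended comparison of the two.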

Another significant feature of the prototype is the ability to see the epigraphic illustrations in contextual relationship with one another in three dimensions. Herein lies the true power of the medium, and one of the most crucial ways that 3D publications can augment traditional 2D publications: definitive, crystal-clear facsimile recordings of the inscribed material, seen ‘in situ.’ No other medium provides this ability – what only imagination afforded scholars in the past, 3D models can now make explicit for all. Sketchfab’s Physically Based Rendering (PBR) engine made implementing this feature relatively simple.

Sketchfab integration

We chose Sketchfab primarily because Chicago House already had 3D data hosted on the platform, which reduced development time as well as external hosting requirements. Its broad user base, fantastic cultural heritage curation via Thomas Flynn, and status as the premier destination for 3D, VR & AR on the web didn’t hurt either.

Within the King’s Chamber Prototype, the King’s Chamber Nav Model is embedded within the scene selector as a wayfinding element, which can be made full screen. One of the great upsides of using a 3D model in this manner is the implicit knowledge of where any given scene is in relation to another, as well as where the user is in relation to all of this information at any given moment.

The Sketchfab API allows for customization within a self-hosted HTML page, with many pre-made viewers and configurators available. Unfortunately, not only did we have no budget, but none of the pre-made options offered exactly what was desired: the ability to fade between two textures – from the facsimile illustration to the photographic data and back again – to more directly facilitate comparison between the two. To achieve this, we trialled two different approaches. The first swapped textures live: a hidden HTML canvas element was used to mix the two source files and replace the model texture. This meant we could replace the original texture with any, and as many, different (UV-mapped) textures as we wanted, using a simple slide control. This approach came with a significant performance cost, so instead we took advantage of Sketchfab’s own texture and display optimizations and placed the final inked illustration inside the emission channel, located in the PBR texture settings. Then we simply used a custom HTML slider to mimic the controls inside Sketchfab’s own settings. This reduced flexibility, as the epigraphic line drawing texture needed to be loaded into the original model, but came with huge efficiency gains.
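A sketch of the emission-channel approach, using the Viewer API’s material calls: the inked drawing sits in the material’s EmitColor channel, and a plain HTML range input scales its intensity. The material index and the slider element here are assumptions for illustration, not the prototype’s actual wiring.

```javascript
// Map a 0-100 slider value to an emission factor in [0, 1]:
// 0 shows only the photographic texture, 1 the full-strength drawing.
function sliderToEmitFactor(value) {
  return Math.min(Math.max(value / 100, 0), 1);
}

// Fetch the scene's materials, then update the emission factor live
// whenever the slider moves.
function wireFadeSlider(api, sliderEl) {
  api.getMaterialList(function (err, materials) {
    if (err) return;
    const wall = materials[0]; // assumed: the inscribed-wall material
    sliderEl.addEventListener('input', function () {
      wall.channels.EmitColor.factor = sliderToEmitFactor(sliderEl.value);
      api.setMaterial(wall); // push the new emission strength to the viewer
    });
  });
}
```

Because only a scalar factor changes per slider event, no textures are re-uploaded, which is where the efficiency gain over the canvas-mixing approach comes from.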


The results have been fantastic and work effortlessly within the prototype. In addition to this, one unexpected but very welcome upside has been access to the AR and VR features available with Sketchfab. We never developed the 3D model with this intention, but being able to take advantage of these functions means we won’t need to double up on our work down the road. If you have access to any type of VR headset we strongly encourage you to try it out! (Hint: turn on the emission layer in the model inspector.)

Improvements & future direction

It’s important to note that the King’s Chamber Prototype is part of a process that started in 2020, originating here. It’s not meant as a be-all, end-all solution, but rather as a tool for exploring answers to a variety of questions, and as a way of thinking critically about what is needed for a meaningful digital publication.

As such, there are a number of improvements that need to be made, and a never-ending change request log. First and foremost, we need to get the prototype to work fluidly on mobile and tablet devices – easier said than done.

After that, our next steps will be in the direction of adhering to the WCAG 2.1 Web Content Accessibility Guidelines, and looking into sustainability and linked open data standards. Findability, Accessibility, Interoperability and Reuse of digital data (the FAIR principles) are top of mind at the moment, as are search and content indexing, so we may very well redevelop the backend of this prototype into a module for existing publication platforms that can do the heavy lifting for us. Omeka, Juncture, Scalar and the Getty’s Quire are all on our radar.

Code, feedback & collaboration

The King’s Chamber Prototype has been presented at the American Research Center in Egypt’s (ARCE) 2022 Annual Meeting as well as the Rijksmuseum 2+3D Photography 2022 Conference.

The beta version code is very much a work in progress and is freely available through GitHub. It is built around the open IIIF standard, using OpenSeadragon 2.4.2 with a modified version of the most excellent openseadragon-curtain-sync, as well as snippets of code adapted from a number of other open source projects.

Hashing out this digital publication prototype has been invaluable, providing our small working group a deeper familiarity with the technological, but also theoretical and ethical landscape of digital publication.

Interested in joining the conversation? Help us answer questions and improve the King’s Chamber Prototype by providing your feedback. We’d love to hear your thoughts, input and insights.


Owen Murray (B.Design Visual Communications) is the Senior Digital Photographer with the Epigraphic Survey (Chicago House) of the Institute for the Study of Ancient Cultures at the University of Chicago. Owen is a member of the American Research Center in Egypt (ARCE) as well as ICOMOS Canada, with expert membership status in CIPA Heritage Documentation.

Alexis Pantos (MA, MSc International Heritage Visualisation) is an archaeologist by training with many years of experience in archaeological photography, as well as digital heritage technologies and presentation. He is currently a member of the DigDok (Digital Documentation) team at the Museum of Cultural History at the University of Oslo, Norway.

Ariel Singer (PhD Candidate, Near Eastern Languages and Civilizations Department at the University of Chicago) is an Egyptologist and Epigrapher with the Epigraphic Survey (Chicago House) of the Institute for the Study of Ancient Cultures at the University of Chicago. Ariel’s dissertation is a lexicographic analysis of anatomical terminology for the head in ancient Egypt.

About the authors

Owen Murray, Alexis Pantos, and Ariel Singer

Chicago House

