Creating an Interactive Museum Exhibit using an IoT Joystick


About Us

Hello! We are IT developers in the field of culture and museum activities from Russia. Here is our team that worked on the IoT Sketchfab controller project:

  • Amir Akhtamzyan – Senior Research Fellow in the Multimedia Department, the State Darwin Museum.
  • Konstantin Ryabinin, PhD – Assistant Professor in the Computer Science Department, Perm State University.
  • Dmitry Olshansky – Head of the Multimedia Department, the State Darwin Museum.
  • Elena Sudarikova – Anthropologist and Senior Research Fellow in the Department of Scientific Research, the State Darwin Museum.
  • We would also like to thank Aurore Mathys from the Royal Museum of Central Africa, who kindly provided a 3D model of the bonobo skull for our exhibition.

A few words about our activities on Sketchfab:

Since 2016, using an Artec Eva 3D scanner owned by the Center for Youth Innovative Artworks “BionicLab” (established in 2013 at the Darwin Museum), we have been digitizing objects belonging to the State Darwin Museum and publishing them on Sketchfab. Our channel now has more than 200 models of various types: osteological collections, scans of bird and animal taxidermy mounts, three-dimensional reconstructions of the Burgess Shale fauna, as well as scans of sculptures and scientific reconstructions, many of which are available for download.

The Internet of Things paradigm: smart museum objects? FineVector / shutterstock.com

IoT Joystick Idea for Sketchfab API

While working on our Bonobo photo exhibition about this endangered species of great ape, we decided to use a proxy object to navigate a 3D model displayed on Sketchfab.

The Internet of Things (IoT) is a promising concept. Basically, it is a system of “smart” objects that carry out a certain function and transmit data to the Internet in real time. The Internet joystick for Sketchfab fits into this concept, as well as into its communication and interaction model.

Interactive museum exhibit IoT controller

Watch the video below to see how it works and the results we have achieved:

Before starting the actual development we outlined a number of requirements that the future IoT joystick had to meet:

  • To be a shockproof copy of a museum object that museum visitors can touch and actually use;
  • To be fully autonomous and wireless, working over the museum’s local WiFi network;
  • To actually serve as a means of steering the 3D model of the skull on the touch screen terminal (syncing the 3D model’s rotation with the skull);
  • To be an exact copy of a museum object, equipped with buttons that highlight the key pins with annotations on the 3D model;
  • To play audio guides for selected pins of the 3D model.

Implementation and Technical Details

Based on the above mentioned, we began our work, which included several stages:

1. Creating the body.

According to our original concept, the device body had to be an exact copy of a museum object. For this we used a 3D-printed copy of a skull, based on the model that the Royal Museum of Central Africa in Belgium had sent to the Darwin Museum.

2. Writing the code and programming the ESP8266 microcontroller.

ESP8266 microcontroller

The ESP8266 controller is the “brain” of our IoT device.

IoT technologies were the key to hitting all these points at once. We chose the ESP8266 as the brain of our controller. This chip is really handy; its advantages are:

  • built-in WiFi module;
  • low price ($4 when ordered in China);
  • versatility (hardware support of communication with other electronic components via different interfaces: I2C, SPI, UART, etc.);
  • good performance (80 MHz clock frequency, 1 MB of flash memory to store the code, 3 MB – sometimes even more – to store data, and 80 KB of RAM: enough to handle many data processing tasks, such as effective denoising of sensor readings);
  • low power consumption (the required voltage is 3.3 V and the average current is 100 mA);
  • high-level programming tools (the firmware can be written in C/C++, there are also bindings for Arduino IDE);
  • wide international developers community (and correspondingly a lot of libraries for different use cases).
Interactive museum exhibit Sketchfab API code

Writing the code and adjusting the prototype device.

We actually needed just 3 push-buttons, and we were able to connect them directly to the ESP8266 chip: there are enough free GPIO (General-Purpose Input-Output) pins on it.
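Purely as an illustration (the real firmware was generated by SciVi, and all names here are our own assumptions), a counter-based debounce filter for polled GPIO buttons can be sketched in plain C++:

```cpp
#include <cassert>

// Hypothetical counter-based debouncer for a polled GPIO button:
// a new state is accepted only after `threshold` consecutive
// identical raw samples, and a press is reported exactly once.
struct Debouncer {
    int threshold;
    int count = 0;
    bool stable = false;   // last accepted (debounced) state
    bool lastRaw = false;  // last raw sample seen

    // Feed one raw GPIO sample; returns true once per accepted press.
    bool update(bool raw) {
        if (raw == lastRaw) {
            if (count < threshold) ++count;
        } else {
            count = 0;
            lastRaw = raw;
        }
        if (count >= threshold && raw != stable) {
            stable = raw;
            return stable;  // rising edge only: report the press
        }
        return false;
    }
};
```

On the ESP8266, the raw sample would come from reading the button’s GPIO pin inside the main polling loop.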

For rotation tracking we used the GY-85 board. It is also very handy, as it contains a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer. Actually, we needed just an inertial measurement unit (IMU), not an attitude and heading reference system (AHRS), so we decided not to use the magnetometer. The GY-85 can also be connected directly to the ESP8266 via the I2C interface.

For the power supply we used a battery with a capacity of 1200 mAh.

When the hardware was assembled, it was the software’s turn. The overall architecture of our solution is shown in the figure below.

IoT architecture

It consists of three parts:

  1. Sketchfab Server that stores the content and can display it on demand;
  2. JavaScript-based Viewer that communicates with the Sketchfab Server via the public Sketchfab API;
  3. ESP8266 Firmware that polls IMU over I2C and buttons over GPIO, processes the data and sends steering commands to the Viewer over WebSocket.

Each button has its own index number, which is transmitted to the Viewer when the corresponding button is pushed. The Viewer calls the gotoAnnotation Sketchfab API function with that number as the argument to show the appropriate information block on the model. The button also triggers playback of audio with the same information.
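The wire format between the firmware and the Viewer is defined by the SciVi-generated code; purely as an illustration of the idea, a button press could be serialized into a small JSON command like this (the message shape and function name are our own assumptions, not the actual protocol):

```cpp
#include <cassert>
#include <string>

// Hypothetical WebSocket payload for a button press; on the Viewer
// side the index would be passed on to gotoAnnotation.
std::string buttonMessage(int index) {
    return "{\"type\":\"button\",\"index\":" + std::to_string(index) + "}";
}
```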

The IMU provides data from the gyroscope and accelerometer, which are processed by the ESP8266 using a Mahony filter. The result is a quaternion that mathematically represents the current orientation of the controller. This quaternion is then multiplied by a calibration quaternion to compensate for the difference between the initial orientation of the controller and that of the virtual 3D model. The resulting quaternion is transmitted to the Viewer, where it is converted into a rotation matrix. This matrix is applied to the 3D model using the Sketchfab API function setMatrix.
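The orientation pipeline above is plain quaternion algebra. A self-contained sketch of the calibration multiply and the quaternion-to-matrix conversion (helper names are ours, not the project’s code):

```cpp
#include <cassert>
#include <cmath>

// Quaternion in (w, x, y, z) order; assumed to be unit length.
struct Quat { double w, x, y, z; };

// Hamilton product a*b: applies rotation b, then rotation a.
Quat qmul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Compensate the controller pose with the calibration quaternion.
Quat calibrated(const Quat& calibration, const Quat& current) {
    return qmul(calibration, current);
}

// Convert a unit quaternion to a row-major 3x3 rotation matrix,
// as the Viewer must do before applying the pose to the model.
void qToMatrix(const Quat& q, double m[9]) {
    m[0] = 1 - 2*(q.y*q.y + q.z*q.z); m[1] = 2*(q.x*q.y - q.z*q.w); m[2] = 2*(q.x*q.z + q.y*q.w);
    m[3] = 2*(q.x*q.y + q.z*q.w); m[4] = 1 - 2*(q.x*q.x + q.z*q.z); m[5] = 2*(q.y*q.z - q.x*q.w);
    m[6] = 2*(q.x*q.z - q.y*q.w); m[7] = 2*(q.y*q.z + q.x*q.w); m[8] = 1 - 2*(q.x*q.x + q.y*q.y);
}
```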

To compose the firmware for the ESP8266 and the Viewer code we used the SciVi scientific visualization system. It lets a user generate the low-level visualization and steering code for IoT systems by means of a high-level visual programming language.

3. Assembling the device

building joystick for interactive museum exhibit

Assembling an IoT device: installing the body, soldering the controller and electronic components.

In general, the process of creating and equipping the skull with electronics can be seen in the photo. The first step was to drill holes for the buttons and install them in the body. A miniature 1200 mAh power bank was installed to serve as the battery. The body was also fitted with the gyroscope and, of course, the “heart” of our device – the ESP8266 controller. The gyroscope and the buttons were connected to the controller, which in turn was connected to the battery. The connection architecture can be seen in the diagram above.

A separate part of our work was recording the audio guide. Together with museum anthropologist Elena Sudarikova we placed key pins on the skull and added annotations to them. The pins with annotations on the 3D model correspond to the buttons on the plastic model. The same annotations were also recorded as audio, so both the written and the voiced annotations can be activated by pressing a button on the plastic model.

We should also mention that we installed a button for calibrating the IMU, as well as a reset button, at the bottom of the skull. This made things easier: each time the skull is placed on a surface, pressing the button resets the rotation of the IMU – a simple recalibration solution.
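This reset amounts to storing the inverse of the pose sampled when the button is pressed; for unit quaternions the inverse is just the conjugate. A minimal sketch under that assumption (names are illustrative, not the actual firmware):

```cpp
#include <cassert>
#include <cmath>

struct Quat { double w, x, y, z; };

// For a unit quaternion the conjugate equals the inverse.
Quat qconj(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

// Hamilton product, used to compose the stored calibration with new poses.
Quat qmul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Pressing the calibration button captures the inverse of the current pose,
// so qmul(calibration, pose) becomes the identity at that instant.
Quat onCalibrationPress(const Quat& currentPose) { return qconj(currentPose); }
```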

In the exhibition hall we placed a touch screen terminal running intermediary software for the Sketchfab display, which updated the data received from the skull’s sensors in real time and controlled the display of the 3D model on the screen.

A rather interesting effect, which we also tested at the exhibition, was simultaneous model steering on several touch screen terminals. Since our IoT device actually worked through the local Wi-Fi network of the museum, we could freely move it around the museum space allowing us to synchronously transmit the data to all devices where the 3D model could be viewed.

Conclusion

In general, we were pleased with the steady operation of the joystick and the effect it had on visitors. The results of the work were presented at the Integrated Systems Russia conference, and the demonstration itself aroused the interest of the audience. As for the problems we continue to work on, we are looking for ways to extend the battery life, either in active mode or through a standby mode. We are also concerned about visitors taking the plastic joysticks home, since they are wireless. To prevent this, we might add an alarm that is set off when the Wi-Fi signal is lost.
Next time, we would also like to try to implement Sketchfab’s VR technology, and even try to create a set of IoT sensors that would enable spatial interaction with the 3D model via the VR!

We will keep you informed about our further developments!

Instagram / Facebook / VK / State Darwin Museum on TripAdvisor

 

About the author

Amir Akhtamzyan

Amir Akhtamzyan is a graduate of the State Academic Humanitarian University with a master’s degree in history. He now works as a research fellow at the State Darwin Museum, and also teaches at the Moscow State Institute of Culture. He also leads the itmus.ru project, whose mission is to introduce modern technologies in museum and exhibition activities.

