Creative solution & implementation
The VAE called for the creation of a “hands free” audio tour that allowed visitors to listen to audio-driven narrative storylines on a topic of their choosing while moving through the redesigned gallery space. Visitors first select a tour on a large multi-panel touch screen placed at the entrance to the gallery; with their selection made, they can then walk freely through the space while a mobile device provides location-specific audio content.
Three digital technology companies worked in close collaboration with Memorial staff to deliver the VAE. Art Processors developed the mobile technology component of the project, commissioned by the project lead, Sydney-based exhibition production specialists Mental Media. Sydney-based technology company Snepo Research was responsible for the interactive orientation wall.
The Memorial curatorial team wanted the audio experience to have a serious tone and to be both informative and emotional, inspiring occasional chills as well as sustained periods of contemplation. A key objective of the project was to have the visitors’ attention on the exhibit itself rather than their audio tour devices.
With these requirements in mind, the mobile application was designed with a target of “hands free” tour technology. This approach to minimizing physical engagement with the tour device was a logical next step from our work in user experience for previous clients, such as the Museum of Old and New Art and the Melbourne Zoo.
The tour experience begins with a visitor’s arrival at the Memorial: attention is immediately drawn towards a vast 20-screen HD multi-touch display system that orients each person to the space. The wall lets visitors choose from a number of different tours (currently five tours are offered, with plans to expand to eleven over the coming months).
Once visitors select a tour on the touch screen, the selection is transferred to their mobile device. They are instructed to put on headphones and hang the device around their neck as they make their way into the gallery space. As they turn away from the touch screen with their tour selected, the spatial audio engine sparks to life and the audio narrative begins.
As the visitor explores the gallery, her movements trigger carefully layered audio samples. Some samples are interpretive and others are curatorial artifacts themselves, including archival sound, music, spoken word, and performance recordings. In just enough spaces, the audio subtly references the visitor’s orientation in the exhibit. This personalized synchronicity often surprises and delights visitors who have, by this time, largely forgotten that they’re even carrying tour technology with them.
From the beginning of the project it was understood there were going to be a number of technical challenges that would need to be overcome to deliver the project to the user experience standards we strive for. Key to these were:
- Transferring the visitor tour selections made on the video wall to the device with 100 percent accuracy and minimal latency
- Providing a spatial audio engine that allowed the creative team freedom to produce exciting content and gave the audience member an enjoyable experience
- Device management: with the decision made to run on Android devices, a management solution would be required to make device maintenance by staff simple and effective
The media wall is where the work of all three agencies converges: the place where a visitor actively engages with the tour technology. Our goal was to work closely with Mental Media and Snepo Research to make this engagement as responsive, user friendly, and intuitive as possible.
To deliver this experience, communication needed to take place between the device and the video wall. To achieve it reliably, we settled on a novel use of computer vision algorithms on both the media wall and the mobile device. The wall detects devices placed on its surface by recognising a unique marker on the back of each device, while the device recognises the tour selection using its camera and a colour-matching profile embedded in the application data. This solution lets visitors quickly comprehend the process, without any fear of failure or need to understand the underlying technology.
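The device-side half of this handoff can be sketched in a few lines: the application compares a colour sampled by the camera against the embedded colour profiles and picks the nearest match. This is a minimal illustration only; the tour names, profile values, and distance threshold below are hypothetical, not the production implementation.

```python
from math import dist

# Hypothetical colour profile: each tour ID maps to the RGB value the wall
# displays beneath the device during handoff (names are illustrative only).
TOUR_COLOUR_PROFILES = {
    "anzac_voices": (220, 40, 40),
    "art_of_war": (40, 200, 60),
    "aircraft_hall": (40, 80, 220),
}

def match_tour(sampled_rgb, profiles=TOUR_COLOUR_PROFILES, max_distance=80.0):
    """Return the tour whose profile colour is nearest the camera sample,
    or None if nothing is close enough (e.g. the device is off the wall)."""
    tour, colour = min(profiles.items(), key=lambda kv: dist(kv[1], sampled_rgb))
    return tour if dist(colour, sampled_rgb) <= max_distance else None

# A slightly noisy camera reading still resolves to the intended tour,
# while an ambient-light grey matches nothing.
print(match_tour((210, 55, 48)))    # -> anzac_voices
print(match_tour((128, 128, 128)))  # -> None
```

The distance threshold is what gives the scheme its robustness: camera noise and screen colour drift shift the sample, but as long as it stays closer to the intended profile than to any other, the match is unambiguous.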
Indoor positioning for the project was provided by Fraunhofer, but to make that location data relevant to the engaging visitor experience being built, a spatial audio engine was required. This engine would drive the visitor experience and make the tour as frictionless as possible. The solution provided web-based control over audio cues triggered by the visitor’s location (micro-geofencing), driving distinct types of audio (narrative, background, special effects). Error correction had to be built in to account for jitter in the location predictions provided by Fraunhofer. The result was a simple front end for the visitor and an elegant, powerful back end that gave the content creators the freedom to explore the boundaries of what was possible.
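The combination of micro-geofencing and jitter correction can be sketched as follows. This is an assumption-laden illustration, not the engine itself: position fixes are smoothed with a short moving average before fence tests, so a single noisy fix from the positioning system cannot flip the active cue. Zone names and coordinates are invented for the example.

```python
from collections import deque
from math import dist

class GeofencedAudio:
    """Minimal sketch of micro-geofenced audio cueing with jitter smoothing.
    Each fence is (name, centre_xy, radius_in_metres)."""

    def __init__(self, fences, window=5):
        self.fences = fences
        self.history = deque(maxlen=window)  # recent position fixes
        self.active = None                   # currently playing cue

    def update(self, position):
        """Feed in a raw position fix; return the cue that should play."""
        self.history.append(position)
        n = len(self.history)
        avg = (sum(p[0] for p in self.history) / n,
               sum(p[1] for p in self.history) / n)
        for name, centre, radius in self.fences:
            if dist(avg, centre) <= radius:
                self.active = name  # entered a zone: switch to its audio
                return self.active
        return self.active  # between zones: keep the last cue playing

# Hypothetical zones; the middle fix below is positioning jitter.
fences = [("gallipoli", (0.0, 0.0), 2.0), ("western_front", (10.0, 0.0), 2.0)]
engine = GeofencedAudio(fences, window=3)
for fix in [(0.1, 0.2), (9.8, 0.1), (0.0, -0.1)]:
    engine.update(fix)
print(engine.active)  # still "gallipoli": the outlier was averaged away
```

Holding the last cue between zones is one plausible way to keep the experience frictionless: the narrative never cuts out just because the visitor wanders through an untagged corridor.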
Making the management of devices as simple as possible for Memorial staff was key, as they were being asked to manage a fleet of 200 devices. Since the application was being written for Android devices (Google Nexus 6), we had the ability to write a custom version of the operating system on which the devices would run (a custom ROM). This custom ROM allowed us to focus each device purely on the application it was designed to run, shutting down other processes to greatly extend battery life. It could also respond to very specific events, such as refreshing all content when charging. As well as creating the custom ROM, we designed and built charging and handout stations, including custom hard cases, lanyards, and a wireless power supply solution.
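The charging-triggered refresh behaviour described above amounts to simple event-driven maintenance logic, sketched below in Python for clarity (the actual implementation lives in the custom Android ROM; event names and state fields here are hypothetical).

```python
# Illustrative sketch only: sync content while docked, never on battery.
def on_power_event(event, device_state):
    """Dispatch maintenance actions on charging events."""
    if event == "POWER_CONNECTED":
        device_state["charging"] = True
        device_state["pending_refresh"] = True   # queue a full content sync
    elif event == "POWER_DISCONNECTED":
        device_state["charging"] = False
        device_state["pending_refresh"] = False  # cancel sync off the dock
    return device_state

state = {"charging": False, "pending_refresh": False}
on_power_event("POWER_CONNECTED", state)
print(state["pending_refresh"])  # True: a refresh runs while docked
```

Tying maintenance to the charging dock means staff never schedule updates manually: returning a device to its station is itself the maintenance action.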