Simon Denny: Mine at the Museum of Old and New Art is a technologically ambitious exhibition, blending installation and sculpture with AR, visitor interaction, and real-time data extraction.
Over two-and-a-half years, we worked closely with artist Simon Denny and exhibition curators Jarrod Rawlins and Emma Pike on exhibition design, creative conceptualisation and technology integration for the Mona-commissioned project.
Bringing our expertise in extended reality and data analytics, we developed AR and facial recognition for The O, the mobile guide at the heart of every Mona visit. We also mined Mona’s real-time visitor data, finding new ways to show visitors what museums can collect about them, adding another layer of discovery and information density to the exhibition.
The O: integral to Mona and Mine
The O is core to the Mona experience. It replaces wall labels and texts, using indoor positioning technology. It finds visitors wherever they are in the 12,000 square metre museum and tells them about the artwork on display closest to them.
When it launched with Mona’s opening in 2011, it put the museum at the cutting edge of mobile technology—and it continues to push the boundaries of museum mobile experiences. Over 90 per cent of visitors use The O to learn about and interact with the museum’s many permanent and temporary exhibits.
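At its simplest, the location-aware behaviour described above can be sketched as a nearest-neighbour lookup: estimate the visitor’s position indoors, then surface the closest artwork. The sketch below is illustrative only—the artwork names and floor-plan coordinates are invented, and The O’s actual positioning system is far more sophisticated.

```python
import math

# Hypothetical artwork positions (in metres) on a museum floor plan.
ARTWORKS = {
    "Artwork A": (12.0, 4.5),
    "Artwork B": (30.2, 18.0),
    "Artwork C": (7.8, 22.3),
}

def nearest_artwork(visitor_xy, artworks=ARTWORKS):
    """Return the name of the artwork closest to the visitor's estimated position."""
    return min(artworks, key=lambda name: math.dist(visitor_xy, artworks[name]))
```

For example, a visitor positioned at `(11.0, 5.0)` would be shown content for “Artwork A”. A real deployment would add floor awareness, signal smoothing, and content rules, but the core idea—position in, nearest content out—is the same.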
The O is also core to the Simon Denny: Mine experience—without it, visitors can’t fully experience the exhibition. Most of the exhibition’s sculptural components carry an AR layer, including videos, interactive facial recognition, and data visualisations.
The O collects real-time data about Mona’s visitors from their devices, tracking their movements and engagement and feeding this data back to the museum and the exhibition.
Simon Denny: Mine marked the first time The O was part of an exhibition rather than just the supporting act, bringing together artist, curator and tech company. It’s also an example of how technology, used innovatively and collaboratively, not only enhances the visitor’s experience but can support an artist’s vision.
“As I became more and more familiar with The O, it became clear that using AR would be a really great option for my exhibition [...] Mine was really the perfect first outing for me in this medium, augmented reality. All the infrastructure was there for me to just focus on content and not be solving technical problems myself that I would have otherwise had to do in another situation.”
– Simon Denny, artist
Adding augmented reality capabilities to The O
One of our initial challenges involved getting AR to work with The O.
When visitors arrive at Mona, they can choose to use The O on a museum-owned iPod or download the application from the App Store to their iPhone. While excellent for their usual purpose of delivering content and media to visitors, Mona’s fleet of 1,300 devices runs an older version of iOS that can’t support Apple’s AR framework, ARKit. So we implemented a third-party framework, creating a completely custom AR solution for our needs.
Once inside the exhibition, visitors use their device to scan markers—pyramids with QR-like codes—to generate AR experiences.
In the first room, they watch the King Island brown thornbill come to life through their device as a virtual representation of the rare bird flits around a metal cage. As more viewers gather around the cage, each summons their own virtual thornbill, filling the gallery with the growing sound of birdsong.
In the second room, visitors snap selfies and watch as their face is ‘mined’ and superimposed with a rare earth mineral—a mineral used in the making of their physical device.
In the final room, it’s revealed that the museum has been tracking their behaviour, gathering data about their journey through the exhibition via their use of their device. Data visualisations of their time spent in the galleries come to life in AR, driving home how museum and visitors are all players in the data game.
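The marker-scanning flow across the three rooms amounts to a dispatch step: decode the QR-like code on a pyramid, then launch the AR experience registered for it. The marker IDs and experience names below are invented for illustration; they sketch the pattern, not the exhibition’s actual identifiers.

```python
# Hypothetical registry of decoded marker codes to AR experiences.
AR_EXPERIENCES = {
    "marker-thornbill": "thornbill_cage",   # room 1: virtual bird
    "marker-selfie": "face_mining",         # room 2: facial recognition
    "marker-dataviz": "visitor_data",       # room 3: data visualisations
}

def launch_experience(decoded_marker_id):
    """Map a decoded marker code to the AR experience it triggers."""
    # Unknown markers fall back to the normal camera view (None).
    return AR_EXPERIENCES.get(decoded_marker_id)
```

Keeping the registry as data rather than hard-coded branches means new markers can be added without touching the scanning code—useful when an exhibition’s AR content is still evolving during install.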
The King Island brown thornbill: the canary in the coal mine
The King Island brown thornbill has the unfortunate distinction of being the Australian bird most likely to become extinct within the next 20 years. It’s native to Tasmania and its habitat has been damaged, perhaps irrevocably, by industry and climate change.
Hope for the bird’s regeneration lies in gathering data about its habits and movements in the environment. But such data gathering requires technology that itself relies on the continued extraction of minerals from the earth... Naturally, the bird is the exhibition’s “canary in the coal mine”.
It ‘lives’ on The O throughout the exhibition, where visitors watch it flit and fly. But recreating it virtually wasn’t easy: sightings are rare, and the bird had never been captured on video or in sound.
Fortunately, in 2019, researchers from the Australian National University were able to spend three weeks on King Island searching for the thornbill. A Mona-sponsored photographer/videographer joined the field trip in the hope they could capture the bird on video for use in research and for the exhibition.
About 50 of the species were found on the island, giving fresh hope it can be brought back from the brink. Sadly, the tiny birds were too quick to be caught on video, but the bird call was recorded for the very first time. This is the same birdcall visitors hear throughout the exhibition.
Through studying photographs of the thornbill together with images and videos of other Australian thornbill species, we were able to create 3D models of the bird. These models are overlaid onto pages of a patent for Amazon’s worker cage, which visitors see on the walls of the exhibition, and the AR bird darts around a life-size model of the cage via The O.
“Mona supported this trip because it’s a story, one that looks at the positive way data collection can impact the world, rather than always looking at mass data collection as a problem. The thornbill was at such high risk not only because of loss of habitat due to farming and climate change but also because we didn’t have enough data on them to understand how to save them. This research trip significantly increased the amount of data available on the bird and that’s why there’s a better chance we can now save it.”
– Emma Pike, co-curator, Mona
Developing facial recognition technology to ‘mine’ people
The exhibition deals with extraction—the mining of data and minerals. But Simon wanted to take things a step further and explore how we could mine people.
He’d read about a study into job candidates and how the emotions they displayed during interviews might result in hiring disparity and workplace bias. He became interested in how we could mine a person’s emotional state, so we developed facial recognition technology to achieve this with The O.
After initially investigating and testing ways we could use facial landmarks to create an AR solution, we opted to keep things simple. We trained a neural network in Facial Expression Recognition (FER) to identify eight key emotions: happiness, fear, disgust, sadness, surprise, shame, anger, and contempt.
We then assigned a rare earth mineral to each emotional state, in keeping with Simon’s commentary on the mining of minerals and the mining of people.
When visitors take a selfie, the facial recognition feature in The O scans and displays their emotional state. They can then scan a nearby marker to ‘mine’ their face and collect a corresponding mineral.
We were able to develop and ship the custom neural network in a week, achieving 60–65 per cent accuracy across all eight emotional categories and a more ‘human’ result.
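The post-classification step can be sketched simply: take the FER network’s scores over the eight emotions named above, pick the strongest, and look up its assigned mineral. The emotion labels come from the exhibition; the emotion-to-mineral pairing below is invented for illustration, not the actual assignment used in Mine.

```python
# The eight emotions the FER network was trained to identify.
EMOTIONS = ["happiness", "fear", "disgust", "sadness",
            "surprise", "shame", "anger", "contempt"]

# Hypothetical pairing of emotions to rare earth elements (illustrative only).
EMOTION_TO_MINERAL = {
    "happiness": "neodymium", "fear": "europium",
    "disgust": "cerium", "sadness": "lanthanum",
    "surprise": "yttrium", "shame": "dysprosium",
    "anger": "terbium", "contempt": "praseodymium",
}

def mine_face(scores):
    """Given one score per emotion, return the dominant emotion and its mineral."""
    label = EMOTIONS[max(range(len(EMOTIONS)), key=lambda i: scores[i])]
    return label, EMOTION_TO_MINERAL[label]
```

A selfie scoring highest on the fourth class, for instance, would ‘mine’ as sadness and yield lanthanum under this illustrative mapping.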
Mining and sharing Mona’s real-time visitor data
To deliver location-aware experiences across the museum’s 12,000 square metre site, The O tracks every visitor’s movements and interactions and feeds this data back to Mona. Mine taps into this real-time data, collecting information about how visitors interact with the exhibition and the museum as a whole.
The final room in the exhibition features an assemblage of sculptures by a variety of artists. They depict humans at work, navigating the thorny relationship between technology, development, and human labour.
It also reveals how the museum has been accessing the visitors’ data, using AR data visualisations to share this information. Visitors can scan markers that sit between the sculptures to view the data, including:
- Mona’s visitors – Mona visitors arriving right now, this hour, today, this month, and this year.
- Mona’s database – Amount of Mona data being stored right now, this hour, today, this month, and this year.
- Visitor engagement – Minerals they’ve ‘collected’, AR they’ve viewed, artworks per hour.
- Mona love/hate ratings – The past 30 days for 7 loved and 7 hated works.
- Visitor time spent in Mona – Overall visitor average, the individual visitor’s total time in Mona, and time spent in augmented reality (you vs the visitor average).
- Visitor device data – Platform, OS version, app version, hardware, build, location, service name, timezone.
- Mona daily O sessions.
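Several of the visualisations above count events across the same rolling windows: right now, this hour, today, this month, this year. A minimal sketch of that bucketing, assuming events arrive as plain timestamps (the window names are taken from the list above; everything else is illustrative):

```python
from datetime import datetime

def bucket_counts(event_times, now):
    """Count events in each of the dashboard's nested time windows."""
    buckets = {"this_hour": 0, "today": 0, "this_month": 0, "this_year": 0}
    for t in event_times:
        # The windows nest: an event in this hour is also in today,
        # this month, and this year, so each check narrows the last.
        if t.year == now.year:
            buckets["this_year"] += 1
            if t.month == now.month:
                buckets["this_month"] += 1
                if t.date() == now.date():
                    buckets["today"] += 1
                    if t.hour == now.hour:
                        buckets["this_hour"] += 1
    return buckets
```

In production this aggregation would run server-side over a live event stream rather than an in-memory list, but the nesting of the windows is the same.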
“One of the things I proposed for the exhibition was showing viewers what data was being collected about them during their experience in the museum. I could imagine some companies not being into that level of transparency about their data practices. Art Processors was curious to explore what I was interested in and suggested things that I would not have come to. That was a nice thing, that it was not a problem. It was even interesting and engaging for them, too, on some levels, to separate out data and find a way to show in ways where it made sense to people who might not be thinking about data capture in a museum environment.”
– Simon Denny, artist
Testing our technology: recreating the gallery space
When visitors scanned the AR markers, we needed to ensure detection and scanning worked with 100 per cent accuracy. Since Mona’s moody, low-lit galleries are not at all like our sunny, well-lit Melbourne headquarters, we needed to recreate the exhibition space.
We had one of the museum’s gallery lights shipped over from Hobart and converted one of our offices into a dark room. This allowed us to extensively test marker scanning and explore optimal lighting scenarios quickly and at low cost.
The best lighting was a diffused light that wasn’t too bright—it had to work within the dynamic range of the older iPod cameras. We also changed the colour of the pyramid markers slightly, from pure white to off-white, to further reduce light intensity.
We then spent two weeks testing in-situ at Mona ahead of the exhibition’s opening, working closely with the curators and the museum’s lighting technician.
Supporting Simon Denny’s vision as an artist
“Working with an artist can be different from clients because often the goals are different than, for example, what I imagine commercial clients or science museums might want. An artist might make a work that is intentionally difficult, for example, one that repels viewers rather than attracts them. They might want to encourage ambiguity or craft experiences that are, in some ways, leading to a space of being unresolved or confused.
“This is not always the case, of course, but it was great for me that I didn’t feel a certain pressure from Art Processors to produce a certain kind of work or outcome. They let me define the goals and metrics, which I feel some companies might have more difficulty with.
“Art Processors brought a lot to the table beyond the technical—they brought research, ideas and feedback. I have a certain knowledge base but they have a whole other level of understanding of the way some digital systems work like facial recognition and data capture on mobile devices. It was great to have a team of not only technical experts to craft the experience with, but also people from the industry that the artwork addresses to get their feedback.”
– Simon Denny, artist