
As far as Australian landmarks go, few are more iconic than the Sydney Opera House. Its status as an architectural marvel and cultural epicentre is globally recognised, so the task of celebrating its 50th anniversary is a deservedly year-long affair. To launch this extensive program, our Group Director of Creative Services, Sam Doust, was commissioned to produce a film capturing the history of the building and the land on which it stands.
From the Sails: Light Years, a dazzling 17-minute audio-visual celebration of the Opera House designed to run as a continuous loop, was projected onto its western sails every night from 19 to 30 October 2022.
While the visual aspect of the project is undoubtedly striking, a compelling soundtrack was equally crucial to the impact of its narrative. But how could this be delivered to the ears of the viewer in a broad urban space?
The Sydney Opera House sits on Bennelong Point, a headland in Sydney Harbour, granting it remarkable visibility from almost any direction. The two main structures that make up the "sails" both run in a roughly north-south orientation, with the western building the larger of the two. As such, the western facade of this building provides an extraordinary canvas with unimpeded views from the vibrant Circular Quay precinct across the water, and is also clearly visible from the Sydney Harbour Bridge.
Given the number of locations from which the Opera House's western facade can be seen, it's hardly feasible, logistically or acoustically, to install a vast array of speakers for a 12-night run. Quite simply, how could the public listen to the film's soundtrack, in precise sync with the visuals, and with a minimum of fuss?

Conventionally, we would assume that the complexity of a project increases in proportion to the physical scale on which it operates. However, the audio-visual synchronisation technology we developed for From the Sails: Light Years, designed for personal mobile devices, defied that expectation, so much so that Rob Keniger, our Lead Mobile Engineer, offered to provide the Sydney Opera House with a proof of concept within 10 days.
The seeds of the solution lay in our previous projects, most notably the immersive audio experience at the Bob Dylan Center in Tulsa, Oklahoma. Upon entry to the Center, visitors are handed an iPod and a pair of headphones. They then simply tap the device on a series of beacons located near key displays to trigger the corresponding audio, which plays with no discernible delay relative to the visual component. Interaction with the device itself is kept to a bare minimum, allowing the visitor to stay immersed in the moment and the environment.
However, the system we developed for the Bob Dylan Center operates on a closed network within a comparatively small physical space. Attempting to achieve such a degree of audio-visual synchronisation for viewers watching From the Sails: Light Years from any given spot in Circular Quay (or even from a boat in Sydney Harbour, for that matter) required a different approach.
"The system we built [for the Bob Dylan Center] was to deal with local latency—latency over a local network inside of a building where you're controlling the infrastructure," says Nic Whyte, our co-founder and Chief Technology Officer. "But whether that system could be extended and made robust enough to work over a wide area network—the internet, essentially—was the thing we needed to prove."
Rob explains the first steps: "At the Bob Dylan Center, we have a sync server that sits on-site. What we did first was to get that server running in the cloud. We then did some prototyping with essentially a stripped-down version of the Bob Dylan Center player on iOS to see if that would work."
It turns out this approach was immediately successful. "We got that working very quickly," says Rob, "and the stretch goal was essentially to see if we could then develop a web-based client, so that we could play and sync the audio in a player just on a website."
Of course, technological development is never a smooth ride from start to finish, but we were once again able to draw upon previous innovations, as Anders Rasmussen, Senior Engineer – Systems, explains. "The Bob Dylan Center relied purely on NTP (Network Time Protocol), but it turns out web browsers can't speak NTP—or UDP (User Datagram Protocol), specifically, which is the underlying protocol," he says. "So we had to take what we built for the Australian Stockman's Hall of Fame, where the server is responsible for doing a time sync with the clients. We implemented that, which took only about two hours, and added that to the proof of concept."
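The approach Anders describes maps onto a classic NTP-style offset estimate, carried over a WebSocket instead of UDP. The sketch below illustrates how a browser client might measure its offset from the server clock under that constraint; the message shape, function names, and sampling strategy are illustrative assumptions, not the production protocol.

```typescript
// A minimal sketch of browser-side time sync over a WebSocket, standing in
// for NTP since browsers can't send UDP. Names and message shapes are
// illustrative assumptions.

interface SyncSample {
  offsetMs: number;    // estimated (server clock - client clock)
  roundTripMs: number; // request/response round trip
}

function requestSyncSample(socket: WebSocket): Promise<SyncSample> {
  return new Promise((resolve) => {
    const t0 = Date.now(); // client send time
    socket.addEventListener(
      "message",
      (event) => {
        const t2 = Date.now(); // client receive time
        const { serverTimeMs } = JSON.parse(event.data); // t1, stamped by the server
        resolve({
          // Assume the server stamped its clock halfway through the round
          // trip, the same symmetric-path approximation NTP makes.
          offsetMs: serverTimeMs - (t0 + t2) / 2,
          roundTripMs: t2 - t0,
        });
      },
      { once: true } // assumes no other traffic arrives mid-handshake
    );
    socket.send(JSON.stringify({ type: "time-sync" }));
  });
}

// Taking several samples and keeping the one with the shortest round trip
// filters out readings distorted by network jitter.
async function estimateOffset(socket: WebSocket, samples = 8): Promise<number> {
  const results: SyncSample[] = [];
  for (let i = 0; i < samples; i++) {
    results.push(await requestSyncSample(socket));
  }
  results.sort((a, b) => a.roundTripMs - b.roundTripMs);
  return results[0].offsetMs;
}
```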

Audio latency was an early concern, given the uncharted territory of operating at such a broad scale. "Our goal for latency was originally set at 250 milliseconds," says Adam Phin, Senior Engineer – Front End. "Which is way too high," he is quick to add. The human brain starts to notice latency at around 45 milliseconds, which is why that figure is regarded as the standard for broadcast. At a delay of 100 milliseconds, it becomes distracting. "But from monitoring [the AV sync], unless something happened in the background of the phone, such as an incoming call, we were usually at 20 milliseconds out, which is very low."
In what was arguably a bold move, albeit for what Nic describes as a "dream client", the proof of concept, just 10 days in development, was presented to the Sydney Opera House as a live test. With a demonstration video playing on an iPad alongside a QR code, Rob scanned the code with both iOS and Android devices, connecting them to a website handling the synchronised audio playback.
To emphasise the simplicity and effectiveness of the new platform, Rob took it a step further. "I then allowed the Opera House team to connect to the sync server using their own devices and let them experience it," he says. "The demo went really well because we had five or six different devices simultaneously playing the audio and clearly playing it in sync. So that was actually a really effective demonstration."
A handful of devices in a demonstration is one thing, but how well could the system handle potentially thousands of viewers every night during the 12-night run of From the Sails: Light Years? "We can effectively scale up to as many users as we need," Anders says. "If we could pack a stadium with 100,000 people, we could probably handle that. We tested it to tens of thousands of connections in our simulated tests before the project went live."
Again, the effectiveness and scalability of the system lie in its simplicity. As Anders elaborates, "The solution of scaling these servers relies on the fact that we only send messages in one direction, and every client that's connected to the server will receive the same message. The servers don't need to change these messages, so it's very easy for us to send these messages out to everyone."
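In practice, that one-way fan-out can be as simple as a broadcast loop. The following sketch, written against Node.js and the ws package, is a hedged illustration of the idea rather than the production server; the payload shape, port, and tick rate are assumptions.

```typescript
// Illustrative one-directional fan-out: every connected client receives the
// same unmodified message, which is what keeps scaling cheap.

import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

const LOOP_MS = 17 * 60 * 1000;     // the film runs as a 17-minute loop
const loopStartedAtMs = Date.now(); // in practice, re-anchored by the projection cue

// Current position within the loop, on the server's clock.
const currentShowTimeMs = () => (Date.now() - loopStartedAtMs) % LOOP_MS;

// Broadcast the loop position to every open connection. Because the message
// is identical for all recipients and never modified per client, capacity
// grows by simply adding more fan-out servers.
function broadcastPosition(): void {
  const message = JSON.stringify({
    type: "position",
    showTimeMs: currentShowTimeMs(),
    sentAtMs: Date.now(), // server clock, so clients can account for transit time
  });
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(message);
    }
  }
}

setInterval(broadcastPosition, 250); // a few announcements per second is plenty
```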
As low-friction user experiences go, this is about as frictionless as it gets. No app download is required and no fields need to be filled in: simply scan a QR code, and the audio loads in the device's web browser. A web-based client for AV synchronisation also brings better support across a range of devices.
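On the client side, the page that QR code leads to needs little more than an audio element and a drift check. This sketch builds on the earlier ones; the URL, drift tolerance, and tap-to-start handling are assumptions rather than the actual implementation.

```typescript
// Hedged sketch of a no-install web client: play the soundtrack in an
// <audio> element and keep it aligned with the broadcast show position.

const socket = new WebSocket("wss://example.com/sync"); // illustrative URL
const audio = document.querySelector("audio") as HTMLAudioElement;

let clockOffsetMs = 0; // filled in by estimateOffset() after connecting

const DRIFT_TOLERANCE_S = 0.05; // roughly where listeners begin to notice

function onPositionMessage(showTimeMs: number, sentAtMs: number): void {
  // Translate "now" onto the server's clock, then work out how far the
  // show has advanced since the server stamped this message.
  const serverNowMs = Date.now() + clockOffsetMs;
  const targetS = (showTimeMs + (serverNowMs - sentAtMs)) / 1000;

  // Only hard-seek when drift would be audible; constant small seeks
  // sound worse than a tiny, steady offset.
  if (Math.abs(audio.currentTime - targetS) > DRIFT_TOLERANCE_S) {
    audio.currentTime = targetS;
  }
}

// Position messages arrive on the same socket used for time sync.
socket.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "position") onPositionMessage(msg.showTimeMs, msg.sentAtMs);
});

// Mobile browsers only start audio after a user gesture, so playback is
// kicked off by the visitor's first tap rather than on page load.
document.addEventListener("pointerdown", () => void audio.play(), { once: true });
```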
It also has the advantage of easier integration with projection systems. We were delighted to receive positive feedback from our projection partners on From the Sails: Light Years, as the work required to establish this integration is greatly reduced compared with native apps that drive the timecode across the entire system. "Our solution is quite passive," explains Rob. "We just require them to send us a signal at certain points and we just listen for that."
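That passive integration might amount to no more than a single endpoint the projection system pings at known points, such as the top of each loop. The route, port, and payload in this sketch are assumptions; the actual signal format isn't described in the source.

```typescript
// Sketch of a passive cue listener: the projection system sends a signal at
// known points (here, the start of each loop) and the sync server simply
// re-anchors the show clock that the broadcast loop reads from.

import { createServer } from "node:http";

let loopStartedAtMs = Date.now(); // read by the fan-out sketch above

createServer((req, res) => {
  if (req.method === "POST" && req.url === "/cue/loop-start") {
    loopStartedAtMs = Date.now(); // everything downstream keys off this anchor
    res.writeHead(204);
    res.end();
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(9090);
```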
"It worked the first time," adds Anders, describing the test. "We were meant to be there for two hours, from 11pm to 1am, and we were done by 11:15pm. It was amazing to see, and the sync was perfect."

The new platform opens up a range of ideas and possibilities for future uses, believes Adam. "Eventually this can be provided either through the web or through native apps. We've got options to choose what audio is being streamed, so the next step is to set up something where multiple presentations are being shown"—similar to the audio-visual synchronisation we developed for Outback Cinema at the Australian Stockman's Hall of Fame—"and you can select between different things in a space that has a number of synced pieces running. You could even do something with augmented reality. The sky's the limit."
Much in the same way that From the Sails: Light Years runs as a loop connecting the past with the present, the development of our audio-visual synchronisation technology gave us a window to revisit past projects and find solutions to present-day challenges.
"The opportunity here was, from our perspective, to really extend what we started with the Outback Cinema, which is the idea of people tuning in and listening in a cinema setting," observes Nic. "But using the Sydney Opera House as our cinema and the harbour as our theatre was a pretty amazing idea."
If the success of this synchronisation technology can be measured by a sense of wonderment imparted to the viewer, then the reaction of our own designers and engineers working behind the curtain is an indication of just how immersive it can be. "It is just magical to see it light up and to hear the audio," says Anders. "I know how it works, but it's still magic."
From the Sails: Light Years was projected nightly onto the Sydney Opera House sails from 19 to 30 October 2022. The film and an interview with principal artist Sam Doust are available to watch via the Opera House's streaming service.