“The Wizard of Oz at Sphere will transport audiences, making them feel like they have stepped inside the film and are traveling down the yellow brick road with Dorothy and her friends. The experience will fill Sphere’s 160,000 sq. ft. interior display plane, which wraps up, over and around the audience to create a fully immersive visual environment.
To take advantage of Sphere Immersive Sound’s 167,000 programmable speakers, and ability to direct sound anywhere in the venue, the original film’s score was re-recorded to take on new clarity. Multi-sensory 4D elements will be combined and leveraged for maximum impact to make audiences feel like they are in the experience alongside the characters, such as high-velocity wind arrays, atmospheric fog, towering fire bursts, bubbles, and infrasound haptic seats.”
Our team, Real Time Software (RTS), led the real-time pre-visualization for the iconic tornado scene brought to life at Sphere. I was the lead programmer for this part of the project, working with fellow programmers Nathan Rosquist and Jesus Aguirre and technical artist Taylor Shechet, under RTS Supervisor Lindsey Sprague. All pre-vis for the scene was done in Unreal Engine to nail down the speed and timing of the tornado's rotation, the spline paths for the house and debris, the placement of hero actors, and the surprise ending. Each pre-vis review was held at the Sphere test site, where we could adjust the scene live in the engine's Sequencer under the direction of the film executives in the room. Iterating in real time streamlined production, eliminating the long render cycles that offline, non-editable scenes would have required.
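To give a flavor of what "spline paths for the house and debris" means in practice: in Unreal, a spline component is a set of control points that the engine evaluates at a time-driven parameter, so scrubbing Sequencer amounts to sampling the curve at different times. The sketch below is purely illustrative and is not the production code; it evaluates a uniform Catmull-Rom spline (one common spline formulation) at a normalized time, and all point values are hypothetical.

```python
# Illustrative sketch (not production code): sampling a debris path as a
# uniform Catmull-Rom spline, parameterized by a normalized time t in [0, 1]
# the way a Sequencer playhead would drive it. All values are hypothetical.

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one uniform Catmull-Rom segment at t in [0, 1].

    The segment interpolates p1 (t=0) to p2 (t=1); p0 and p3 shape the tangents.
    """
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t * t
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def sample_path(points, t):
    """Map normalized time t onto the chained interior segments of the spline."""
    n_seg = len(points) - 3            # interior segments span p1 .. p[n-2]
    s = min(t * n_seg, n_seg - 1e-9)   # clamp so t=1 lands inside the last segment
    i = int(s)
    return catmull_rom(points[i], points[i + 1], points[i + 2], points[i + 3], s - i)

# Hypothetical debris path rising and drifting as the tornado approaches.
path = [(0, 0, 0), (1, 0, 0), (2, 1, 1), (3, 3, 2), (4, 6, 4), (5, 10, 7)]
print(sample_path(path, 0.0))  # position at the start of the animated range
print(sample_path(path, 0.5))  # position midway through the shot
```

Editing timing in a setup like this means changing how `t` advances against the shot clock, not re-authoring the curve, which is why such paths can be tuned live in a review session.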