Performers Set the Pace and Graphics Follow Suit
Look at the stage through a producer’s eyes and you’ll notice it has only two elements: the musicians and the backdrop design. The musicians are, by nature, their own form of entertainment, but the allure of the stage’s design is equally important. Inventive stage architects are in constant competition to see who can configure the most awe-inducing exhibition. Their results have been nothing short of astonishing, but as they build grander displays, they inadvertently drive a wedge between the musicians and their stage. Interactive content like surface reality acts as a bridge, elevating the audience’s experience while reconnecting those elements of the stage.
There’s a vast industry focused on improving the experience of guests at live performances. Most developments have centered on venue design, and many producers overlook integrating technology directly into the performance itself. There’s been no shortage of technological innovation over the last few decades, so why hasn’t the status quo of stage performances substantially changed? The simple answer is that the focus of innovation has been too narrow. Elaborate light shows and intricate prop designs don’t help a performance stand out; they merely one-up the competition. An exceptional concert or performance doesn’t require reinvented props; it requires the use of alternative technology.
So how does interactive content like surface reality tie into concerts? Consider the LED display, the screen that typically extends the length of the stage. Concerts tend to use it to zoom in on the musicians or to loop specialized graphics. When the graphics are meant to parallel the music, lag creeps in through imperfect timing or miscoordination by the VJ. Producers who want graphics must either choose generic content or have the VJ do their best to match the graphics’ timing to the music. That mimicry creates a disconnect between the musicians and the graphics on the monitor. Surface reality solves this problem: it generates visual graphics that autonomously respond to the performers’ movement on stage. Musicians can play at their own pace, knowing the monitor will faithfully mirror their music and movement on the big screen.
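Zuzor’s actual pipeline isn’t public, but the core idea described above, graphics that react to movement rather than to a pre-timed cue sheet, can be illustrated with a minimal frame-differencing sketch. Everything here is an illustrative assumption, not Zuzor’s implementation: the function names (`motion_mask`, `graphic_anchors`), the threshold, and the tiny synthetic frames standing in for camera input are all hypothetical.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Boolean mask of pixels that changed between two grayscale
    frames -- a crude stand-in for tracking a performer's motion."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def graphic_anchors(mask, max_points=50):
    """Pick coordinates inside the motion mask where reactive
    graphics (particles, ripples, trails) would be spawned."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return []
    idx = np.linspace(0, len(xs) - 1, min(max_points, len(xs))).astype(int)
    return list(zip(xs[idx].tolist(), ys[idx].tolist()))

# Two synthetic 8x8 frames: a bright "performer" block moves right.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = np.zeros((8, 8), dtype=np.uint8)
prev[2:5, 1:3] = 200
curr[2:5, 3:5] = 200

mask = motion_mask(prev, curr)      # changed pixels between frames
anchors = graphic_anchors(mask)     # where graphics would appear
```

Because the graphics are spawned wherever motion is detected, no VJ has to guess the tempo: if the performer speeds up or pauses, the visuals follow automatically.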
When interactive graphics appear on the screen, the audience immediately stands at attention. A few years ago, a couple performed a dance piece on America’s Got Talent. At first glance, they appeared to use surface reality technology. In truth, they had memorized a dance and mimicked the graphics that played on the screen, but their performance was convincing enough to make you believe the screen was responding to their movement. Their official video drew nearly 2 million views on YouTube, over ten times the number America’s Got Talent receives on an average video. Both the judges and the video’s commenters lauded the interaction between the performers and the screen. Surface reality lets audiences experience this new genre of performance. When these elements of the stage blend into one, audiences are simply mesmerized.
Why not aim for a better result than the America’s Got Talent couple, and save the time and energy spent on synchronization rehearsals along the way? Zuzor is doing just that. Our proprietary use of surface reality enables performers to set the pace and graphics to follow suit.