Experiential · 9 min read

The Future of Events: How The Sphere, VR & AR Change the Experiential Talent Landscape

In late September 2023, the $2.3 billion Las Vegas Sphere opened for business, ushering in a new type of stadium experience and a new era of demand for virtual reality and augmented reality experiences, as well as for the talented individuals who create them.
Author: Lana Steiner


An architectural feat, the circular theater is 366 feet tall and 516 feet wide, with a capacity of 18,600 people seated or 20,000 standing. The dome is made from 32 trusses, each weighing 100 tons, and the exoskeleton on which the exterior LED screens are mounted is 30 percent taller than the theater itself. The property includes 304 parking spaces and a 300-meter-long pedestrian bridge; there were also plans to build a new monorail connecting the Sphere and the Venetian. At 16K x 16K, the interior display is the highest-resolution LED screen on Earth.

Sphere Immersive Sound comprises a variety of technologies powered by HOLOPLOT, including beamforming, which enables audio to be directed to specific locations at a volume that remains consistent from point of origin to destination. The Sphere's audio system also allows sound designers to create a virtual point of origin and place sound in a precise spatial location, meaning audio can be directed to the listener so that it sounds close even though the source is remote. So it wasn't surprising, especially given U2's role in opening the Sphere, that the Grammys featured the new venue on last night's award show, with U2's lead singer Bono presenting the award for best pop vocal directly from the stage during a showing of "U2:UV Achtung Baby Live At Sphere." It was the first time television cameras had been allowed into the space.
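
For readers curious about the underlying idea, here is a minimal sketch of textbook delay-and-sum focusing, the principle behind steering sound toward a specific spot. It is illustrative only and says nothing about HOLOPLOT's proprietary system; the array geometry, target point, and speed of sound are assumptions.

    # Toy delay-and-sum focusing sketch (illustrative; not HOLOPLOT's implementation).
    # Compute the per-speaker delay that makes every speaker's wavefront arrive at a
    # chosen "virtual point of origin" at the same instant.

    import math

    SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature (assumption)

    def focusing_delays(speaker_positions, target):
        """Return per-speaker delays (seconds) that align all arrivals at `target`."""
        distances = [math.dist(p, target) for p in speaker_positions]
        farthest = max(distances)
        # Speakers closer to the target wait longer, so all wavefronts coincide there.
        return [(farthest - d) / SPEED_OF_SOUND for d in distances]

    # Hypothetical 8-speaker horizontal array with 0.5 m spacing, focused on a point
    # 10 m out and 2 m off-axis.
    speakers = [(0.5 * i, 0.0) for i in range(8)]
    target = (2.0, 10.0)
    for i, delay in enumerate(focusing_delays(speakers, target)):
        print(f"speaker {i}: delay {delay * 1000:.3f} ms")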

Given the demand for the Sphere, we can expect to see more of these immersive digital experiences popping up around the country. For example, artist Roy Nachum is opening an immersive museum in New York, and The Leonardo opened a new immersive exhibit in Salt Lake City called "Into the mind of AI." Just down the Strip in Las Vegas, U2 is featured in yet another digital experience at The Venetian, a more intimate experience designed for U2 superfans.

Experiential VR/AR: Ushering in a new era of 21st century entertainment

The idea behind the Sphere was to create a uniquely immersive augmented reality (AR) experience built into the architecture of an arena — the first of its kind. The multi-purpose concert venue officially opened with U2's 40-show residency, "U2:UV Achtung Baby Live," running from late September to mid-December 2023, ushering in an entirely new kind of concert experience for an audience that was largely unfamiliar with the technological potential of AR.

Concurrently, the Sphere continues to run a two-part, nearly two-hour narrative exhibition beginning in the holographic Atrium. The first part tells a universal story of how culture, technology, science, art, and the natural world have been intertwined throughout human history. Through a one-of-a-kind immersive experience created specifically for the Sphere, guests gain a better understanding of how technology amplifies our human potential.

The Sphere Experience then continues in the main venue with a multi-sensory cinematic experience at an unparalleled scale – Darren Aronofsky's Postcard from Earth is the first production to feature the venue's multi-sensory 4D technologies, a technological leap forward for the creative mind behind Pi, The Fountain, and The Whale. This includes mixed reality (MR) seating with an infrasound haptic system that uses deep vibrations so guests can feel the experience – the rumble of thunder, say, or a soaring rocket launch. The Sphere also uses environmental effects to rouse the senses – the gust of a cool breeze and familiar scents that enhance an already full sensory experience.

In a matter of months, an estimated 750,000 to 1 million people have visited the Sphere in Las Vegas, establishing it as one of the most coveted tourist attractions in the United States. Even West Hollywood, in the Los Angeles area, is considering construction of its own orb-like structure. Still pending review by the Sunset Boulevard Arts & Advertising subcommittee, the West Hollywood billboard would function as an entertainment studio, with two of its three levels housing green rooms and studios for broadcasting and podcasting. Much like Las Vegas' Sphere, the West Hollywood exterior would provide pedestrian-oriented amenities and LED advertising aimed at drawing visitors out of their cars and into the plaza in which the orb would sit.

The recent Aronofsky exhibit at the Las Vegas venue is hardly the first time auteur filmmakers have toyed with VR. Based on authentic accounts, the Academy Award-winning, Mexican-born director Alejandro G. Iñárritu (Birdman; The Revenant) presented his VR experience "CARNE y ARENA (Virtually present, Physically invisible)," employing state-of-the-art technology to create a multi-narrative experience in which the line between subject and bystander is blurred.

Originally premiering in May 2017 at the 70th Cannes Film Festival as the festival's first virtual reality official selection, the immersive installation centers on a six-and-a-half-minute virtual reality sequence for a single person. As the simulation proceeds, the exhibition allows patrons to walk through a vast space and embody a fragment of the interviewees' personal journeys – particularly immigrants' experience on the border between the U.S. and Mexico.

“During the past four years in which this project has been growing in my mind, I had the privilege of meeting and interviewing many Mexican and Central American refugees. Their life stories haunted me, so I invited some of them to collaborate with me on the project. My intention was to experiment with VR technology to explore the human condition in an attempt to break the dictatorship of the frame, within which things are just observed, and claim the space to allow the visitor to go through a direct experience walking in the immigrants’ feet, under their skin, and into their hearts.” - Alejandro G. Iñárritu

Subsequently presented throughout the United States (from Washington, DC to Los Angeles) as a traveling roadshow, the VR exhibition unfolds as a selection of fragmented interviews: a young mother who fled Central America after a gang member threatened her child; a Border Patrol agent who watched life escape from the migrants who didn't quite make it; and a young man who made the crossing at age 9 and ultimately went on to receive a law degree at UCLA.

There are also early discussions and examples of how XR (extended reality, spanning both augmented and virtual reality) can impact the global tourism sector through the metaverse. Early adopters have already started experimenting, and several trends have emerged.

Virtual elements can be layered onto an established business. In the wake of the 2019 fire that damaged the famous cathedral, French startup Histovery produced an augmented exhibition on the history of Notre-Dame de Paris, motivated in part by an increased awareness of the fragility of physical landmarks. To navigate the exhibition, each visitor uses a "HistoPad" touch screen to take an immersive tour that allows interaction with physical elements: giant photographs, 3-D models of statues, replica flooring and stained glass, and audio of Notre-Dame's organs and bells. Effects include animation and a virtual scavenger hunt for younger visitors.

Unreal Engine: Merging VR/AR gaming technologies with commerce 

Suffice it to say that, with advances in the AR/VR space, the curated immersive experience is no longer reserved for Academy Award-winning filmmakers. Growth within the VR space has made the technology accessible not merely to average consumers but to developers across a multitude of media platforms. At the heart of this revolution lies Epic Games' Unreal Engine — from the creators of Fortnite — a powerful and versatile game engine that is significantly changing how VR games are created and experienced.

Epic has built Unreal Engine 5 atop four pillars:

  1. Data aggregation: real-time technology pulls in data from multiple sources, so architects and other design consultants using different CAD (Computer Aided Design) packages will be able to input data and see it synced on a single visual platform.
  2. Open Worlds: the technology can import wider data sources and global datasets without limit, provided there is an API (Application Programming Interface). For instance, this could be anything from geo-location data to weather forecasts or London's data-sharing portal, the London Datastore (a minimal data-pull sketch follows this list).
  3. Collaboration: the software accepts multiple inputs in real time, allowing people to query designs and collaborate using relatively basic devices such as smartphones or iPads.
  4. One platform, many outputs: visualizations can be in the form of images, video, or VR (any VR headset can be used).
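
To make the "Open Worlds" pillar concrete, here is a minimal, hedged sketch of pulling an external dataset over an API and flattening it into records a visualization layer could ingest. The endpoint URL and field names are hypothetical placeholders, and none of this is Unreal Engine's own API.

    # Hypothetical open-data pull: fetch a JSON forecast over HTTP and reshape it into
    # flat records for a downstream visualization importer. The URL and field names are
    # placeholders, not a real service and not part of Unreal Engine.

    import json
    import urllib.request

    WEATHER_API = "https://example.com/api/forecast?city=London"  # hypothetical endpoint

    def fetch_forecast(url):
        """Download a JSON forecast and keep only the fields the scene needs."""
        with urllib.request.urlopen(url, timeout=10) as response:
            payload = json.load(response)
        return [
            {
                "time": entry["timestamp"],
                "temperature_c": entry["temperature"],
                "cloud_cover_pct": entry["cloud_cover"],
            }
            for entry in payload.get("hours", [])
        ]

    if __name__ == "__main__":
        for record in fetch_forecast(WEATHER_API):
            print(record)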

Now pervasive software that unlocks a more egalitarian mode of game and VR development, Unreal Engine includes a tool called Blueprint, a visual scripting system that allows even people who aren't expert coders to create their own VR games. This user-friendly feature doesn't take away from how powerful or adaptable the engine is, reinforcing that Unreal Engine – free to download and use for non-commercial projects – is a good choice for both small, independent game makers and big game companies. Furthermore, Unreal Engine provides extensive support and adaptability for a wide range of VR hardware, including popular headsets like the Oculus Rift, HTC Vive, and PlayStation VR. This broad compatibility ensures that VR game developers can reach a wider audience, regardless of the hardware their players use.

However, Unreal Engine's VR/AR capabilities have unlocked far broader uses in innovation and e-commerce. Its AR capabilities allow for state-of-the-art digital twin technology, a term commonly used for a virtual model designed to accurately reflect a physical object. Digital twins have the potential to deliver more agile and resilient operations, and their potential isn't lost on CEOs.

Interest in digital twins, combined with rapidly advancing supportive technologies, is spurring market estimates for digital-twin investments of more than $48 billion by 2026. McKinsey research also indicates that 70 percent of C-suite technology executives at large enterprises are already exploring and investing in digital twins.

Building and scaling a digital twin requires a three-step approach:

  1. Create a blueprint: A blueprint should define the types of twins an organization will pursue, the order for building them to maximize value and reusability, the way their capabilities will evolve, and their ownership and governance structures.
  2. Build the base digital twin: A project team then builds the base digital twin over the next three to six months. This phase begins with assembling the core data product, which enables the development of visualizations and allows data science professionals to build out one or two initial use cases.
  3. Boost capabilities: Once a digital twin is running, an organization can expand its capabilities by adding more data layers and analytics to support new use cases. At this stage, organizations frequently advance their twins from simply representing assets, people, or processes to providing simulations and prescriptions through the use of AI and advanced modeling techniques.
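
As a rough, hedged illustration of what a base twin plus a boosted capability might look like in code, the sketch below mirrors a physical asset's latest sensor readings and layers a simple rule-based maintenance prescription on top. The asset, sensor names, and thresholds are invented for illustration and are not drawn from iTwin, Unreal Engine, or McKinsey's framework.

    # Toy digital twin: mirrors a physical asset's sensor state (a "base twin") and adds
    # a simple prescriptive rule on top (a minimal "boosted" capability). Asset names,
    # sensors, and thresholds are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class BridgeTwin:
        asset_id: str
        readings: dict = field(default_factory=dict)   # latest value per sensor
        history: list = field(default_factory=list)    # (sensor, value) log

        def ingest(self, sensor: str, value: float) -> None:
            """Update the twin so it reflects the physical asset's current state."""
            self.readings[sensor] = value
            self.history.append((sensor, value))

        def maintenance_advice(self) -> str:
            """Deliberately simple prescription; real twins would use simulation or ML."""
            if self.readings.get("vibration_mm_s", 0.0) > 7.1:
                return "Schedule inspection: vibration above assumed alert threshold."
            return "No action required."

    twin = BridgeTwin(asset_id="bridge-42")
    twin.ingest("vibration_mm_s", 8.3)
    twin.ingest("deck_temperature_c", 21.5)
    print(twin.maintenance_advice())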

The infrastructure engineering software company Bentley Systems leveraged Unreal Engine to develop iTwin, a platform for creating and managing digital twins of infrastructure assets such as roads, bridges, and power plants. An open platform and a transformative tool for fostering innovation, iTwin enables real-time monitoring, data visualization, and predictive maintenance, enhancing operational efficiency and extending asset lifespans.

Unreal Engine's real-time rendering capabilities also make it an ideal tool for architectural visualization, as well as product prototyping. Architects and designers can create immersive virtual environments that allow clients to experience spaces before they are built, rendering photorealistic visuals, simulating lighting conditions, and even incorporating interactive elements like walkthroughs and dynamic object manipulation.
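
As a small, hedged aside on what "simulating lighting conditions" can involve, the sketch below approximates the sun's elevation over the course of a day for a given latitude, the kind of input a time-of-day lighting study might be driven by. It uses standard textbook approximations, the location and date are arbitrary examples, and this is not Unreal Engine code.

    # Approximate solar-elevation sweep for a time-of-day lighting study.
    # Standard declination/hour-angle approximations; illustrative only.

    import math

    def solar_elevation_deg(latitude_deg: float, day_of_year: int, hour: float) -> float:
        """Approximate the sun's elevation angle (degrees) for a given local solar time."""
        declination = -23.45 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
        hour_angle = 15.0 * (hour - 12.0)  # degrees away from solar noon
        lat, dec, ha = map(math.radians, (latitude_deg, declination, hour_angle))
        elevation = math.asin(
            math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
        )
        return math.degrees(elevation)

    # Sweep a summer day at roughly 36.1° N (about Las Vegas' latitude), day 180.
    for hour in range(8, 19):
        print(f"{hour:02d}:00  sun elevation ≈ {solar_elevation_deg(36.1, 180, hour):5.1f}°")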

U2's Sphere residency was hardly the first example of a musical artist incorporating XR technologies into live performance, yet XR integration into live events remains a market that has gone largely untapped. As of 2020, the metaverse accounted for 0.1 percent of live-music revenues, a figure that rose more than tenfold by 2021, thanks in part to COVID shutdown policies. Nevertheless, McKinsey estimates that by 2030 virtual events could account for up to 20 percent of all concert revenues, driven in part by their capacity to accommodate huge audience numbers at reduced cost.

In August 2021, Epic Games launched Fortnite's Rift Tour, featuring Grammy-winning artist Ariana Grande. It proved a perfect match between a sensationally popular metaverse game, with (then) around 350 million registered users, and one of pop's biggest stars. One of the first such Fortnite collaborations, it was particularly significant: it was the first time Ariana Grande had performed in nearly two years and the first concert to allow attendees to participate in mini-games.

The concert was a resounding success. The Rift Tour was viewed by as many as 78 million players (compared with average conventional concert attendance of under 15,000); streams of Grande's songs rose by up to 123 percent during the concert, and other featured artists also saw a streaming boost. While a traditional concert by a top North American performer might rake in less than $1 million, it's estimated that Grande made more than $20 million from her headline performance, which may be remembered as a critical inflection point for the live-entertainment industry. According to McKinsey estimates, virtual concerts have an anticipated income potential of upwards of $800 million by 2025. Taken together with the XR-enabled MICE (meetings, incentives, conferences, and exhibitions) sector, music represents a rich opportunity, expected to reach $7 billion by 2030.

In order to create these combined immersive and commerce experiences, companies will need either to team up with existing studios or to hire a full roster of creators, such as:

  • Augmented Reality Designers
  • Virtual Reality Developers
  • Motion Graphics Designers
  • Video Game Designers
  • Animators

Worky is already deeply embedded in the world of XR, providing support and insight to companies searching the competitive talent pool for top-flight applicants. Using our cultivated relationships, Worky will scope, curate, and pair skilled talent with the right employers in the growing XR marketplace.

Virtual Reality
Extended Reality
Design
Developers
Innovation