
Verizon debuts GPU-based 5G edge services for mobile VR/XR developers


As the 5G era kicks off, all signs point to dramatic improvements in virtual and mixed reality experiences as computing shifts from hardware worn on the body to nearby “edge” cloud servers. Since much of that processing will be visual and shared by many users at once, Verizon has been preparing for the shift by developing new GPU-slicing technology that lets a single graphics processor serve multiple clients simultaneously, an advance the carrier expects will benefit AR, VR, and XR users, as well as real-time enterprise, AI/ML, and gaming applications.

Today, the GPU inside a virtual reality headset serves only the person currently wearing it, and doesn’t feed any computing power back to the network for others to share. In a Verizon-developed prototype running on a live network in Texas, a single GPU delivering computer vision as a service supported eight times as many concurrent users as a dedicated one-user setup, and 80 times as many when used for gaming graphics services.
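Those multipliers follow from simple time-slicing arithmetic: the heavier each user’s workload, the smaller the number of users that can share one GPU’s per-second compute budget. The sketch below is a back-of-the-envelope model with hypothetical per-user costs chosen only to reproduce the quoted ratios; it is not a description of Verizon’s scheduler, which would also have to account for GPU memory, batching, and encode/decode overhead.

```python
# Back-of-the-envelope model of GPU time-slicing at the edge.
# All per-user figures are hypothetical; Verizon has not published workload numbers.

def users_per_gpu(gpu_ms_per_user_per_second: float) -> int:
    """One GPU offers roughly 1,000 ms of compute time per second of wall-clock
    time; divide that budget by how much each user's workload consumes."""
    return int(1000 // gpu_ms_per_user_per_second)

print(users_per_gpu(125.0))  # a heavy per-user vision pipeline  -> 8 users on one GPU
print(users_per_gpu(12.5))   # a lighter per-user graphics slice -> 80 users on one GPU
```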

In the real world, the numbers will obviously vary dramatically based on both the power of each GPU and the demands of the service(s) it’s running. But the core concept — taking maximum advantage of the GPU’s capabilities by managing virtualization across individual users and groups of users — should yield great benefits, assuming Verizon’s 5G network is up to the task of serving that content.

“Creating a scalable, low cost, edge-based GPU compute [capability] is the gateway to the production of inexpensive, highly powerful mobile devices,” explained Verizon technology SVP Nicki Palmer. “This new innovation will lead to more cost effective and user friendly mobile mixed reality devices and open up a new world of possibilities for developers, consumers, and enterprises.”

Verizon plans to offer developers access to eight different 5G edge services at first, including 2D and 3D computer vision, responsive XR lighting to blend digital objects into reality, split local/edge rendering, real-time ray tracing, 3D spatial audio, transcoding, and asset caching for collaborative work. Collectively, the services will enable mixed reality headsets to identify real objects, realistically render digital objects, and provide believable audio cues that blend them together, among other applications.
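To get a feel for how a headset or phone app might consume one of these services, here is a minimal, purely hypothetical client sketch in Python. Verizon has not published an API, so the endpoint URL, request fields, and response shape below are invented solely to illustrate the offload pattern: ship a camera frame to the edge and get structured results back instead of running the model on-device.

```python
# Hypothetical client for an edge computer-vision service. The URL, fields, and
# response format are placeholders, not a real Verizon API.
import requests

EDGE_CV_URL = "https://edge.example.com/v1/vision/detect"  # placeholder endpoint

def detect_objects(jpeg_bytes: bytes, api_key: str) -> list[dict]:
    """Send one camera frame to the edge service and return detected objects,
    e.g. [{"label": "chair", "confidence": 0.91, "box": [x, y, w, h]}]."""
    resp = requests.post(
        EDGE_CV_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"frame": ("frame.jpg", jpeg_bytes, "image/jpeg")},
        timeout=0.05,  # tight budget: the result is only useful while the frame is still current
    )
    resp.raise_for_status()
    return resp.json()["objects"]
```

The short timeout is the point of the architecture: an edge round trip only pays off if the answer comes back fast enough to matter for the frame the user is currently looking at.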

The carrier expects to publicly demonstrate the 5G edge services at Mobile World Congress Los Angeles next week, including augmented reality-aided workspaces, an AR attachment that lets firefighters see in low-visibility environments, and AR-assisted shopping using digital augmentation of physical products. Volumetric 3D scanning and educational VR demos will also be available.

When asked, Verizon wouldn’t confirm which company’s GPUs were being used in its tests, but the company was known to be working with Nvidia on GPU-powered distributed data centers for 5G. Rival AT&T previously said that it was working with Nvidia on 5G edge computing solutions for virtual reality and games, using a 40-GPU RTX server to send interactive VR content over a network with only 5 milliseconds of network delay — around one-sixth the latency of typical 4G connections.
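Those latency numbers are the crux of remote rendering: the network hop plus render, encode, and decode time has to fit inside the motion-to-photon budget, commonly cited at around 20 milliseconds, that keeps VR comfortable. The quick check below uses the 5 ms figure from the demo plus hypothetical per-stage costs; it is illustrative arithmetic, not measurements from either carrier.

```python
# Illustrative latency budget, not measured data from the AT&T or Verizon demos.
edge_network_ms = 5                    # network delay quoted for the RTX server demo
typical_4g_ms = 6 * edge_network_ms    # "one-sixth the latency" implies roughly 30 ms over 4G

render_ms, encode_ms, decode_ms = 7, 3, 3   # hypothetical per-stage costs
motion_to_photon_ms = edge_network_ms + render_ms + encode_ms + decode_ms

print(typical_4g_ms)         # 30 -> 4G alone would consume the entire comfort budget
print(motion_to_photon_ms)   # 18 -> a 5 ms edge hop leaves room for the other stages
```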


