The DeanBeat: RP1 simulates putting 4,000 people together in a single metaverse plaza


RP1 is a metaverse startup that has simulated putting 4,000 people together in a single metaverse plaza, or shard.

In metaverse circles, that’s a pretty good technical achievement, as games like Fortnite and Call of Duty: Warzone pack only 100 or 150 players into the same digital space, dubbed an instance. They can replicate those instances infinitely, but someone in one instance can’t talk to someone in another, except through long delays. That’s why RP1’s work, while still at the prototype stage, could prove important.

Since the metaverse is envisioned as a real-time, synchronous experience, delays in interaction, or latency, are the enemy. Kim Libreri, CTO of Epic Games, talked at our metaverse event in January about the problem of the “sniper and the metaverse.” In a game like Fortnite, players are often grouped together in shards, or single servers, where they can interact (meaning fight) all they want. But if a sniper goes to a tall building or a mountain, and they can see someone far away, they might be scoping someone in another shard, and that’s a no-no when it comes to latency.
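Neither Epic nor RP1 has published its interest-management code, but the shape of the problem is easy to sketch. In the hypothetical TypeScript below, an instanced server only replicates players who share an instance ID, so a target 300 meters away in another instance simply does not exist for the sniper, while a shardless world filters by distance alone. All names and numbers are illustrative assumptions.

```typescript
// Minimal sketch of the "sniper" problem with instanced servers.
// Hypothetical types and numbers; real engines use far more elaborate
// interest management than this.

interface Player {
  id: string;
  instanceId: number; // which shard/instance this player was placed into
  x: number;          // world position in meters
  y: number;
}

const VIEW_RANGE_M = 1000; // a scoped sniper can see this far

// In an instanced architecture, a server only replicates players that
// share its instance, no matter how close they are in world space.
function visibleInInstance(observer: Player, all: Player[]): Player[] {
  return all.filter(
    (p) =>
      p.id !== observer.id &&
      p.instanceId === observer.instanceId && // hard shard boundary
      Math.hypot(p.x - observer.x, p.y - observer.y) <= VIEW_RANGE_M
  );
}

// In a shardless world, only distance matters.
function visibleShardless(observer: Player, all: Player[]): Player[] {
  return all.filter(
    (p) =>
      p.id !== observer.id &&
      Math.hypot(p.x - observer.x, p.y - observer.y) <= VIEW_RANGE_M
  );
}

// The sniper stands 300 meters from a target that lives in another instance.
const sniper: Player = { id: "sniper", instanceId: 1, x: 0, y: 0 };
const target: Player = { id: "target", instanceId: 2, x: 300, y: 0 };

console.log(visibleInInstance(sniper, [sniper, target]).length); // 0: invisible
console.log(visibleShardless(sniper, [sniper, target]).length);  // 1: visible
```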

Sean Mann and Dean Abramson, the founders of RP1, aim to solve this problem by rearchitecting servers so that they can squeeze a lot more people into the same space. Companies like Improbable are trying to do the same thing; it’s something of a metaverse Holy Grail.

“When a lot of people talk about the metaverse, they mention the lack of scalability,” Abramson said. “That’s a big roadblock to actually achieving a true metaverse. They say it’s just not possible, and we have to wait for hardware and Moore’s Law to double multiple times. They say we are a decade or maybe two away from really achieving what everyone believes is kind of the future iteration of the internet that’s shared by everyone on this planet.”

As a company, RP1 believes that it can be done with ultra-efficient software.

“We’ve achieved the ability to maximize not only on a single server, which is what the first demo did,” said Mann. “You were able to see how we can put a massive amount of people in one place. A server can monitor those thousands of people with full fidelity. That’s really important because in gaming, there are a lot of people that don’t do full fidelity. It’s a real hard problem to scale with characters that have finger and hand movements and facial expressions.”
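Mann didn’t give numbers, but a rough, hypothetical back-of-envelope calculation shows why full fidelity is such a hard scaling problem: a pose that includes finger joints and facial expression is an order of magnitude larger than a typical shooter’s position-and-aim update, and naively broadcasting it to 4,000 clients multiplies that out quickly. Every field count and rate below is an assumption for illustration, not RP1’s actual wire format.

```typescript
// Rough, hypothetical back-of-envelope: why "full fidelity" avatars are hard
// to scale. All field counts, sizes and rates are illustrative assumptions.

const BYTES_PER_FLOAT = 4;

// A minimal shooter avatar: position, velocity, view angles.
const shooterFloats = 3 + 3 + 2; // 8 floats, about 32 bytes per update

// A full-fidelity avatar: head and both hands with 6DoF poses,
// ~25 finger joint rotations, and a few dozen facial blendshape weights.
const fullFidelityFloats =
  3 * 7 +  // head and both hands: position (3) + quaternion (4) each
  25 * 4 + // finger joints as quaternions
  52;      // facial blendshape weights (an ARKit-style count, as an example)
// about 173 floats, roughly 692 bytes per update

const tickRate = 20;  // updates per second (assumed)
const players = 4000;

function perClientDownstreamMbps(floatsPerAvatar: number): number {
  // Naive worst case: every client receives every other avatar every tick,
  // with no interest management or compression.
  const bytesPerSecond =
    floatsPerAvatar * BYTES_PER_FLOAT * tickRate * (players - 1);
  return (bytesPerSecond * 8) / 1e6;
}

console.log(perClientDownstreamMbps(shooterFloats).toFixed(1), "Mbps");      // ~20 Mbps
console.log(perClientDownstreamMbps(fullFidelityFloats).toFixed(1), "Mbps"); // ~443 Mbps
```

The gap between those two numbers is why interest management, compression and server-side mixing matter so much at this scale.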

Many visionaries have talked about the metaverse, and the consensus has been that we are a decade or two away, waiting on hardware that can support a large, scalable world through a 3D browser, Mann said.

From Tim Sweeney to Matthew Ball, and now the PwC and McKinsey reports, everyone describes a large persistent world. And it all starts with scalability, or we are just connecting intranets, Mann said. Putting lots of users in a single instance and connecting applications together is a difficult problem to solve.

“Current gaming engines are not designed to support non-compiled, web-deployed experiences (games, social, etc.) at scale and quality. It is going to take a complete rethink of technologies to enable this vision,” Mann said.

For the past 10 years, RP1’s founders have been working on a new paradigm for network server architecture in the turn-based gaming space, starting with online poker. They have a small team of fewer than 10 people.

They realized that the same architecture could be applied with slight modification to solve the problems inherent in prevalent gaming architectures.

The system RP1 is building allows creators and developers to deploy spatial 3D content for gaming, social, digital twin and Internet of Things applications that need to run as real-time, persistent experiences.

“This new architecture allows gaming companies, content creators, and developers to focus on what they do best, and not have to worry about scale, while also enabling them to deploy content directly into a persistent, shardless world that could host hundreds of millions of users moving in and out of different real-time applications seamlessly (aka the metaverse),” said Yin-Chien Yeap, chief client architect for RP1, in an interview. “And thereby bypassing the current model of precompiling and pre-downloading software. All of this is possible at a small fraction of the cost of what cloud service providers like Amazon Web Services and Azure charge.”

A demo

Sean Mann (left) and Dean Abramson are the founders of RP1.

I went into a prototype metaverse world where RP1 demonstrated the tech. The company just completed the first phase of a demo that has 4,000 full-fidelity avatars (six degrees of freedom, inverse kinematics, facial, hand and finger tracking) with spatial 3D audio in a single persistent, shardless virtual reality space about a square kilometer in size. It did this in a VR app accessible via the Oculus Quest 2 (that’s how I logged in) through a browser with no pre-downloads, using one six-year-old server.

Abramson said the audio solution alone is disruptive, as this is a massive challenge in the gaming industry. As I passed by various non-player characters in the demo (each representing a possible unique player), I could hear them talking with unique scripted statements. It was spatial audio, where I could hear them in one ear or the other as I passed by.

I certainly felt a sense of presence thanks to the 3D audio. I could walk around and float above the crowd, but I always heard the sound coming in a directional way. We went from an urban area to a place that looked more like a large plaza, and it got very noisy as RP1 dropped more and more bots into the space. At one point, there were thousands in the simulation with me.
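RP1 hasn’t described its audio pipeline, but because the demo runs in a browser, the client-side effect I heard (a voice shifting from one ear to the other and fading with distance) can be approximated with the standard Web Audio API. The sketch below illustrates that effect for a single remote voice; it is not RP1’s implementation, and the attenuation settings are assumptions.

```typescript
// Illustrative only: directional, distance-attenuated playback of one remote
// voice using the standard Web Audio API. This is not RP1's pipeline; it just
// reproduces the "one ear or the other" effect described above.

const ctx = new AudioContext();

// The listener is the local player; a real client would update position and
// orientation from the headset pose every frame.
ctx.listener.positionX.value = 0;
ctx.listener.positionY.value = 1.6; // head height in meters
ctx.listener.positionZ.value = 0;

function attachVoice(stream: MediaStream, x: number, y: number, z: number) {
  const source = ctx.createMediaStreamSource(stream); // remote speaker's audio
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",    // binaural rendering, so each ear hears a different mix
    distanceModel: "linear", // gain falls to zero at maxDistance
    refDistance: 1,
    maxDistance: 50,         // assumed: inaudible beyond roughly 50 meters
    rolloffFactor: 1,
    positionX: x,
    positionY: y,
    positionZ: z,
  });
  source.connect(panner).connect(ctx.destination);
  return panner; // move panner.positionX/Y/Z as the speaker walks around
}
```

In a real client, each incoming voice stream (or a server-mixed feed) would be attached this way and repositioned every frame as avatars move.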

Yeap also showed me around in the demo. He helped build the space with RP1, and he was very impressed with the technology. He said he wanted to use it to create his own applications.

“The bottleneck is not on the server connectivity,” said Yeap.

The proximity audio is also an important part of the demo, with full spatial audio in a VR space. The tech is not limited to VR spaces, as it could also be done on desktop or mobile devices or game consoles. The demo used WebXR graphics on a Meta Quest 2 VR headset.

“There are tradeoffs,” said Yeap. “As we increase the fidelity of the avatars, the number of avatars drops off. It moves the bottleneck up to the graphics pipe, rather than the network pipe.”

He added, “Sometimes you think that if you draw 400 avatars, the network is going to die because WebRTC cannot handle so many peer-to-peer connections or voice, for instance. But working with the RP1 code, I found that the bottleneck is entirely graphical. For the first time in working with network engines, I ran out of graphics before the network ran out of capacity, which is amazing. And that’s why I love working on it. I feel it has enormous promise in being able to deliver the kind of mass participation the metaverse always claims to be heading toward.”
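Yeap’s point about WebRTC is easy to quantify. In a full peer-to-peer mesh, every client holds a connection to every other client, so the count grows quadratically; relaying everything through a server keeps it linear. The arithmetic below is generic, not a description of RP1’s networking.

```typescript
// Why peer-to-peer voice stops scaling long before a server relay does.
// Generic illustration, not RP1's topology.

function fullMeshConnections(n: number): number {
  // Every client holds a WebRTC connection to every other client.
  return (n * (n - 1)) / 2;
}

function serverRelayConnections(n: number): number {
  // Every client holds a single connection to a server that mixes or forwards.
  return n;
}

for (const n of [100, 400, 4000]) {
  console.log(
    `${n} users: mesh=${fullMeshConnections(n)}, relay=${serverRelayConnections(n)}`
  );
}
// 100 users:  mesh=4950      relay=100
// 400 users:  mesh=79800     relay=400
// 4000 users: mesh=7998000   relay=4000
```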

Abramson said RP1 would probably be able to mix in about five times more audio than it is mixing into the demo now. There will be more benefits once RP1 can add more machines to its demo, he said.

Of course, it’s just a simulation and it’s just a demo. You could say it’s not proven until you really do have 4,000 humans in a small space. But Mann and Abramson think that day will come.

“From the server’s perspective, the bots are sending information like humans would, and the bots are actually more performant than a human because they’re speaking all of the time,” Abramson said.

Next steps

Yin-Chien Yeap is chief client architect at RP1.

The second phase will be linking many servers together to put 100,000 users in a 20-square-kilometer space. It will also be persistent and shardless, with the limiting factor merely being budget, not the limitations of what the hardware and network can do. (The company is hoping to raise money).
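RP1 hasn’t said how those servers will be linked. One generic way to split a 20-square-kilometer shardless world across machines is a grid of cells, each owned by one server, with players handed off as they cross cell boundaries. The sketch below illustrates that idea with arbitrary numbers; it is not RP1’s design.

```typescript
// Hypothetical: splitting a 20-square-kilometer world across servers by grid
// cell. Cell size and world dimensions are arbitrary assumptions.

const WORLD_WIDTH_M = 5000;  // 5 km x 4 km = 20 square kilometers
const WORLD_HEIGHT_M = 4000;
const CELL_SIZE_M = 500;     // each server owns a 500 m x 500 m cell

const cols = Math.ceil(WORLD_WIDTH_M / CELL_SIZE_M);  // 10
const rows = Math.ceil(WORLD_HEIGHT_M / CELL_SIZE_M); // 8, so 80 cells/servers

// Map a world position to the server responsible for simulating it.
function ownerServer(x: number, y: number): number {
  const col = Math.min(cols - 1, Math.floor(x / CELL_SIZE_M));
  const row = Math.min(rows - 1, Math.floor(y / CELL_SIZE_M));
  return row * cols + col;
}

// A player walking east is handed off from one server to the next when they
// cross a cell boundary; in practice both servers would briefly track them so
// the transition looks seamless from the client.
console.log(ownerServer(480, 100)); // server 0
console.log(ownerServer(520, 100)); // server 1
```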

“We will be deploying different experiences to show what the future could look like through an entirely new browser experiencing an unlimited number of applications in social experiences and gaming,” Mann said. “What gets us excited is imagining a user sending a link to where they are in a store, museum, game, work space or social meetup, and a friend or fan could instantly join them to play or spectate from any device (mobile, AR, VR and desktop) without having to pre-download anything.”

The company has been showing the demo to as many big game companies as it can.

“There are other architectural changes that we’d have to make on the server side,” Abramson said. “We haven’t done this yet, but if you were in a stadium, and you were surrounded by 10,000 people, we could actually give you the full audio experience of all 10,000 people, even though we can only ship you the movements of the nearest thousand.”
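That stadium example splits interest into two sets: movement replication capped at the nearest thousand avatars, and audio mixed from everyone within earshot. A hypothetical server-side sketch of that split could look like the following; the Avatar type and attenuation curve are placeholders, not RP1’s code.

```typescript
// Hypothetical per-client, per-tick selection on the server: ship full
// movement for the nearest K avatars, but weight every audible avatar into a
// single mixed audio feed. Placeholder types and gain model.

interface Avatar {
  id: string;
  x: number;
  y: number;
  pose: Float32Array; // full-fidelity pose data, the expensive part to ship
}

function dist(a: Avatar, b: Avatar): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function buildClientUpdate(me: Avatar, crowd: Avatar[], k = 1000) {
  const others = crowd
    .filter((a) => a.id !== me.id)
    .sort((a, b) => dist(me, a) - dist(me, b));

  // Ship movement and pose only for the nearest K avatars...
  const movementUpdates = others.slice(0, k).map((a) => ({
    id: a.id,
    x: a.x,
    y: a.y,
    pose: a.pose,
  }));

  // ...but fold every avatar, even the other 9,000 in the stadium, into one
  // pre-mixed audio stream weighted by distance.
  const audioMixWeights = others.map((a) => ({
    id: a.id,
    gain: 1 / (1 + dist(me, a)), // placeholder attenuation curve
  }));

  return { movementUpdates, audioMixWeights };
}
```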

It’s not clear what would happen if you tried to use the RP1 technology for a game like Fortnite. It depends on how big the map is or how many people you’d want to squeeze into a small space.

“If everybody is shoulder to shoulder, you aren’t going to have much of a battle,” Abramson said. “There are limits to how many people you are going to see on the horizon. You can see 10 kilometers away. Once you get to about 250 meters, people are virtually invisible.”

In any case, Abramson believes that you could have a lot more people in a space than you can currently fit in battle royale games. Part of the question becomes whether you want to. But I can foresee games where you have a big medieval army charging at another one with individualized actions, Braveheart style.

“Everyone would really be on top of each other,” Abramson said.

The next move would be to demonstrate 100,000 people in one space, or a million. All of this is theoretical until RP1 gets more resources to show bigger and bigger demos, but it’s a big dream and this is a start.

“We hope people will understand that the notion that the metaverse is 10 to 20 years away, because we’re just waiting for Moore’s Law to solve the hardware problem, is wrong,” Mann said. “We want people to know that that’s not the real case. The real scenario is that within a year, hopefully, we can roll out a fully scalable system that can handle many people in a shardless architecture.”
