
Afference creates artificial touch sensations with rings on your fingers



I got up early in Las Vegas to check out Afference, and I’m glad I did. In a suite at the Venetian during the CES 2024 tech trade show, founders Jacob Segil and Dustin Tyler showed me how they create artificial touch sensations by putting some wired rings on my fingers.

When they stimulated the rings, I felt a haptic sensation in my fingers. The founders of Afference call the device the Phantom. It can relay tactile information to the brain so that you feel things that aren’t there. It’s a new step in developing a neural interface between the brain and digital electronics.

“For the first time, this is what we call neural haptic,” Segil said in an interview with GamesBeat. “We’re scientists, inventors and entrepreneurs who know how to create artificial touch. We’re actually talking directly to your nerves in your hand, creating artificial sensations.”

How it works

Afference sends electrical signals to your nerves to simulate touch.

Connecting to devices via Bluetooth, Phantom creates artificial sensations without having to press on the skin, as current haptic devices with a similar aim do. With Phantom, rings that encircle the finger communicate with nerves through electrical stimulation. The device delivers information to the nerves, which the brain translates as sensations on your fingers.

Those sensations are tied to digital interactions experienced in spatial computing platforms like mixed reality headsets, creating haptic illusions that effectively allow you to feel things that are not there.

The founders describe the Phantom as the only wearable neural interface that creates tactile sensations in the fingers based on digital interactions. Built on decades of research by neural engineers, the Phantom uses neural stimulation to create artificial touch.
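To make that pipeline concrete, here is a minimal Python sketch of how a tracked contact on a virtual object might be turned into a per-finger stimulation command and packed for a Bluetooth write. Everything here is an assumption for illustration: Afference has not published its protocol, so the byte layout, pulse rate, and comfort-ceiling scaling are invented placeholders, not the company’s actual design.

from dataclasses import dataclass
import struct


@dataclass
class ContactEvent:
    finger: int        # 0 = thumb ... 4 = pinky
    pressure: float    # normalized 0.0-1.0 from the hand-tracking layer


@dataclass
class StimCommand:
    finger: int
    amplitude: float   # fraction of the user's calibrated comfort ceiling
    frequency_hz: int  # pulse rate of the electrical stimulation (placeholder value)
    duration_ms: int   # length of the burst for this frame of contact


def contact_to_stim(event: ContactEvent, comfort_ceiling: float = 0.8) -> StimCommand:
    """Map a tracked virtual contact to a stimulation command under the comfort ceiling."""
    return StimCommand(
        finger=event.finger,
        amplitude=min(max(event.pressure, 0.0), 1.0) * comfort_ceiling,
        frequency_hz=100,
        duration_ms=30,
    )


def encode(cmd: StimCommand) -> bytes:
    """Pack the command into a compact payload (illustrative byte layout only)."""
    return struct.pack("<BfHH", cmd.finger, cmd.amplitude, cmd.frequency_hz, cmd.duration_ms)


payload = encode(contact_to_stim(ContactEvent(finger=1, pressure=0.6)))
print(payload.hex())  # in practice this would be written to the ring over Bluetooth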

The hands-on demo

My hands-on demo with Afference.

They put a virtual reality headset on my head and slipped the rings, with electrodes attached, onto all of my fingers. Then they started a mixed reality demo.

I’ve tried haptic gloves before, but Tyler said there is a fundamental difference: the team learned they don’t have to cover the fingertips with gloves at all.

“We don’t have to make a choice between virtual reality and reality,” Tyler said. “I can do something in reality, grab it, use it, then reach up and click a button in a spatial computing headset and seamlessly mix between them. And the reason for that is because the way we produce the sensation is by essentially working on the nervous system as it goes through the finger.”

That allows for new things. The fingers are free.

“Unlike mechanical vibrators, which cover the fingertip for impression depressors, we have no moving parts,” Tyler said. “This is all electrical. You’ll see the speed of the response is fast. The other thing this offers is we have the opportunity to create brand new sensations that are impossible with physical interfaces. And that means, for gaming in particular, the opportunity to create things that nobody else can. So now a game developer can start to think about brand new worlds, things that we’ve never experienced, or experienced in different ways that aren’t possible with other interfaces.”

A lot of current haptics use mechanical motors that vibrate to simulate a sense of touch. The demo created a tingling in my fingers as I touched virtual objects with my hands, which were tracked and rendered in a Meta Quest 3 headset. The demo showed both temporal and intensity dynamics.

I had to calibrate the wearable first, one hand at a time, to set the feedback level with a slider.
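Conceptually, that calibration step amounts to mapping a slider position into each finger’s usable stimulation range, somewhere between the level you can just detect and the level that stays comfortable. A rough Python sketch of the idea, with made-up threshold values rather than Afference’s:

def calibrate_hand(slider: float, detection_threshold: float = 0.2,
                   comfort_max: float = 0.9) -> dict[int, float]:
    """Return a per-finger amplitude scale from a 0.0-1.0 slider setting."""
    slider = max(0.0, min(1.0, slider))
    level = detection_threshold + slider * (comfort_max - detection_threshold)
    return {finger: level for finger in range(5)}  # same level applied to all five fingers


left_hand = calibrate_hand(0.55)
right_hand = calibrate_hand(0.70)   # calibrated one hand at a time, as in the demo
print(left_hand[0], right_hand[0])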

“Over the next year, we will work to refine the way we interface with that nerve which more mimics natural nervous system responses to tactile inputs. And that’s where we go to give a much broader range of sensations for people to have,” Tyler said.

There are three people in the company now, and several contractors are helping. The tech development has been happening for about a year. The original work started when Tyler and Segil worked together on prosthetics for people with missing limbs. Those patients would often say that if they received electrical stimulation near the shoulder, they could feel it in the missing hand.

“The brain thinks I’m getting signals from the hand,” Tyler said. “By adding sensation, the world becomes real. Then what we realized from virtual reality, or what we know from basic neuroscience, is that if you don’t have touch, if you don’t have visual and tactile synchrony, there’s no reality. That was a huge thing for us to start to think about what’s beyond the prosthetics. And this is where AR, and particularly gaming, became an area to focus on, where the potential applications are much broader.”

During calibration, I saw a logo on the screen and pulled it to the right. I could feel a buzzing or tingling sensation across all five fingers, and they turned the intensity up and down so I had a sense of touch “volume.” I put my hand next to a virtual speaker on the screen to feel the audio translated into a sense of touch on my fingers. It gave me sensations to the beat of a drum. They played songs like Boston’s “More Than a Feeling,” and I could feel the pulses of the drums in my hand.

The music demo is interesting because it happens faster than a motor-based haptics system could manage.

“As we start to collaborate with game studios, the same transform we’re doing here is to take this music and create the sensation that we can use for games,” Segil said. “You take an audio track for a game, and we can port it through our SDK, and then you have an entire haptic track based on the right sound effects that you’ve already built into your game. And so we think that’s going to be a really powerful tool to get integrated with existing content, and then to get collaborations up and running.”
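One plausible way to read that audio-to-haptics transform: compute a short-window energy envelope over the track and emit a haptic pulse wherever the energy jumps, which is roughly where a drum hit lands. The sketch below does exactly that on a synthetic signal; the window size, onset threshold, and overall approach are my assumptions, not a description of Afference’s SDK.

import numpy as np

def audio_to_haptic_track(samples: np.ndarray, sample_rate: int,
                          window_ms: int = 20, jump_ratio: float = 1.8):
    """Return (time_s, intensity) pulses derived from sudden rises in energy."""
    window = max(1, sample_rate * window_ms // 1000)
    n = len(samples) // window * window
    frames = samples[:n].reshape(-1, window)
    energy = np.sqrt((frames ** 2).mean(axis=1))             # RMS per window
    pulses = []
    for i in range(1, len(energy)):
        if energy[i] > jump_ratio * (energy[i - 1] + 1e-9):   # sudden rise = likely onset
            intensity = float(min(1.0, energy[i] / (energy.max() + 1e-9)))
            pulses.append((i * window / sample_rate, intensity))
    return pulses

# Synthetic test: quiet noise with three "drum hits".
rate = 16_000
signal = 0.01 * np.random.randn(rate * 2)
for t in (0.5, 1.0, 1.5):
    i = int(t * rate)
    signal[i:i + 400] += 0.8 * np.sin(np.linspace(0, 40 * np.pi, 400))
print(audio_to_haptic_track(signal, rate)[:5])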

Then they showed me a big red button on the virtual screen that I could press with my virtual hand. Pressing the button gave me feedback in my finger. They said I should feel resistance as I pressed down, and I did. They said they can pattern the stimulation so you feel more than vibration.
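The perceived resistance presumably comes from ramping the stimulation pattern with press depth rather than playing a flat vibration. A toy sketch of that idea, assuming the hand tracking reports a normalized press depth; the curve, pulse rates, and click threshold are illustrative guesses, not Afference’s actual pattern.

def resistance_pattern(depth: float) -> dict:
    """Grow amplitude and pulse rate non-linearly with press depth."""
    depth = max(0.0, min(1.0, depth))
    return {
        "amplitude": depth ** 1.5,               # ease-in curve reads as rising resistance
        "frequency_hz": int(60 + 140 * depth),   # denser pulses as the press deepens
        "clicked": depth > 0.95,                 # distinct burst once the button bottoms out
    }


for d in (0.1, 0.5, 0.9, 1.0):
    print(d, resistance_pattern(d))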

“We describe this sensation like black-and-white TV. There was a time when you knew the world was in color but the TV was black and white. This is where we are in the sensory domain, as opposed to vision. What Afference is going to do is create color, new and different kinds of sensations, using all of our understanding of the neurophysiology,” Segil said.

There was another demo where I could grab blocks with my fingers and use two hands to make them bigger or smaller. Sensations accompanied the touch points.

“We can start to play with digital objects between your physical hands, and track and then overlay these haptic events to make these interactions more and more, call it complex and full,” Segil said.
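The dispatch loop behind that kind of interaction is simple to imagine: for each tracked fingertip, check whether it is within contact distance of the object’s surface and, if so, emit a haptic event for that finger. A small sketch, treating the block as a sphere for simplicity; the contact radius and event format are assumptions.

from math import dist

CONTACT_RADIUS = 0.01  # 1 cm in tracking-space units, illustrative

def haptic_events(fingertips: dict[str, tuple[float, float, float]],
                  block_center: tuple[float, float, float],
                  block_half_size: float):
    """Emit one event per fingertip near the (sphere-approximated) block surface."""
    events = []
    for name, tip in fingertips.items():
        surface_gap = abs(dist(tip, block_center) - block_half_size)
        if surface_gap < CONTACT_RADIUS:
            events.append({"finger": name, "intensity": 1.0 - surface_gap / CONTACT_RADIUS})
    return events

tips = {"L_index": (0.0, 0.0, 0.095), "R_index": (0.0, 0.0, -0.12)}
print(haptic_events(tips, block_center=(0.0, 0.0, 0.0), block_half_size=0.1))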

“And from a gaming perspective, too, you can imagine, as you touch a spot, there’s maybe a different type of feel, depending on whether it’s a dangerous thing or a good thing,” Tyler said. “There’s lots of opportunities that gamers can start to develop and figure out what they want to do.”

The tech has a way to go before the sensations feel fine-grained and unmistakable. But it’s a good start.

The Phantom handles only the sensory feedback side, and it can be integrated with the other parts of a VR system.

“What’s missing in a control system is the feedback to the user,” Tyler said.

Starting a business

The Phantom by Afference.

Back in September, Boulder, Colorado-based Afference raised $1.5 million in a pre-seed round led by Konvoy Ventures to fuel development for the wearable neural interface platform. The idea is to create an artificial sensation platform and a neurostimulation wearable, the Phantom, that provides sensation across XR platforms including VR headsets, mixed reality headsets, and mobile augmented reality.

Segil, CEO at Afference, sees a shift coming in the way we interact with digital content. The artificial sensation platform allows digital interactions to be experienced as seamlessly as physical interactions, said Tyler, chief scientific officer.

The inspiration behind the development of Phantom stemmed from the founders recognizing the significance of touch in today’s digital era and how spatial computing has the potential to transform the way society interacts with and consumes digital information.

The artificial sensation platform is currently compatible with VR headsets, mixed reality headsets, augmented reality glasses, Android mobile phones, and laptop computers.

Development kits including the Phantom wearable device and Afference SDK will be available in early 2024 through Afference’s first contact program. Then, strategic partnerships will be formed to integrate Afference technology across spatial computing platforms throughout 2024. The company plans to raise a new round of funding.

Segil previously founded multiple medical device companies including Point Designs and MITA, which was later acquired by Stryker. He has a doctorate in mechanical engineering from the University of Colorado Boulder.

Tyler works on Afference’s science and technical front. He leads research, implements proprietary neurostimulation techniques, and helps oversee strategic planning and financing for Afference. He is a professor of biomedical engineering at Case Western Reserve University, a career research scientist with the Department of Veterans Affairs, and the founder and director of the Human Fusions Institute.

Afference is built on a long-standing research collaboration between the founders. Tyler has spent decades researching how to interface with nerves in order to create artificial sensations. The first applications of this work were for people with limb loss and showcased the ability to restore a sense of touch. In 2021, Tyler’s work created artificial sensations through wearable technology which could be used by anyone. This breakthrough led to the formation of Afference in June 2022.

The company expects consumer products could be ready in early 2025. Phantom works with VR headsets, mixed reality headsets, augmented reality glasses, iOS and Android mobile phones, and laptop computers.

Advisers include Aaron Luber, director of Google Labs; Chris Ulrich, a haptics expert and CTO of Immersion; and Josh Duyan, cofounder of CTRL Labs.

There are other companies out there doing touch feedback, like Ultraleap and Emerge. Afference wants to create a five-finger wearable for gaming and entertainment, and also a single-ring wearable for other kinds of applications.

Afference is creating wearable neural interfaces.

“We think we’re critical for creating efficient hand tracking, or efficient hand-based interactions. Digital dexterity is the phrase we’re using, where our goal is that you can be as dexterous in the 3D space as you are in the physical space,” Segil said. “That’s our goal.”

The name “Afference” refers to the side of the nervous system that carries information back to the brain, Segil said.

“Our whole company is about new feedback with the sense of touch,” Segil said. “We’re going to let the headsets and everyone else figure out the hand tracking and those things. We leverage it so that you have this natural interface, a seamless interface, and then we provide afferent information from the senses.”

I can think of a lot of uses for this, like touch feedback when you play a virtual piano with your fingers. If you’re holding a virtual sword, it should feel heavier in your hands than something like a fork.

“We think we can help enable a new computing era. But we aren’t dependent on that for success.”



