Virtual touch, real feelings: Decoding haptic technology beyond smartphones


Haptics and vibration are often used interchangeably when we talk about smartphones, but they are quite distinct. Haptics conveys information through touch, such as the satisfying click you feel when you press a virtual button on your screen or the realistic recoil you experience when you pull the trigger on a gaming controller. Vibration, on the other hand, is simply an alert that grabs your attention, like when your phone buzzes for an incoming call.
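
To make the distinction concrete, here is a minimal sketch of how the two ideas surface in Android's public vibration APIs (Android is used purely as one familiar example; the article itself is not platform-specific): a haptic click is tied to a specific touch event and confirms it, while a plain vibration just shakes the phone to get noticed.

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator
import android.view.HapticFeedbackConstants
import android.view.View

// Haptic feedback: a crisp, event-tied click that confirms a virtual button press.
fun confirmButtonPress(button: View) {
    button.performHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY)
}

// Plain vibration: a generic buzz whose only job is to grab the user's attention.
fun buzzForNotification(vibrator: Vibrator) {
    vibrator.vibrate(VibrationEffect.createOneShot(250L, VibrationEffect.DEFAULT_AMPLITUDE))
}
```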

Vibration rarely goes beyond notifying users of things on their phones, but haptics is a different story: the technology has a myriad of potential applications beyond phones and controllers. For example, in 2017, the Royal Institution, a charity that fosters public engagement with science, demonstrated how haptics can create the illusion of touch without any physical contact, using nothing but forces and vibrations.

In this article, we will explore what haptic technology is, how it can enhance our interaction with the digital world beyond smartphones and controllers, and how it can shape the future of mixed reality, a field that blends the physical and virtual realms.

What is haptic technology?

Haptic technologies are a fascinating way of enhancing our interaction with the digital world. They can simulate the sense of touch by applying different kinds of forces, vibrations, and motions to a person using the tech. This can make us feel more immersed in virtual reality, gaming, or remote control applications. For example, if you touch a virtual button on a screen, you might feel a click or a buzz. If you drive a car in a game, you might feel the steering wheel resist your movement or shake when you hit an obstacle.

There are different types of haptic technologies that can produce different kinds of feedback. Some use small motors that spin or move back and forth to create vibrations; these are called eccentric rotating mass (ERM) motors and linear resonant actuators (LRAs), respectively. Others use thin materials that bend or contract when an electric voltage is applied; these are called piezoelectric (piezo) actuators. The choice of which haptic technology to use depends on several factors, such as the desired effect, the cost, and the space available.
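
As a rough sketch of how that choice plays out in software (assuming an Android 11 or newer device, again just as one concrete platform), an app can ask the system whether the actuator supports amplitude control and crisp haptic primitives, which LRAs and piezo actuators typically do, and fall back to a plain buzz on a basic ERM motor:

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Sketch only: use a sharp click primitive when the actuator can render it,
// otherwise fall back to a short fixed-strength buzz.
fun playClick(vibrator: Vibrator) {
    val effect = if (
        vibrator.hasAmplitudeControl() &&
        vibrator.areAllPrimitivesSupported(VibrationEffect.Composition.PRIMITIVE_CLICK)
    ) {
        // Fast actuators (LRAs, piezos) start and stop quickly enough for a crisp click.
        VibrationEffect.startComposition()
            .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK)
            .compose()
    } else {
        // Slower ERM motors get a brief, simple buzz instead.
        VibrationEffect.createOneShot(30L, VibrationEffect.DEFAULT_AMPLITUDE)
    }
    vibrator.vibrate(effect)
}
```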

Haptic technologies can be used in various ways, such as:

Haptic gloves that allow users to feel virtual objects and manipulate them with their hands.

Haptic suits that simulate sensations such as temperature, pressure, and pain on the whole body.

Haptic displays that project tactile ‘images’ onto the skin using ultrasound waves.

Mid-air haptics: A new branch of haptic technology

The haptics tech demonstrated by the Royal Institution uses a relatively new branch of the technology called mid-air haptics. This uses focused ultrasound waves to exert pressure on the skin, creating the feeling of touch in mid-air without any physical contact or wearable devices. Imagine feeling the shape, texture, and motion of a virtual object with your bare hands, or a gentle breeze on your skin, and interacting with objects that support the technology as if they were real.

By using an array of ultrasound transducers (or ‘speakers’), mid-air haptics can generate a focal point of high pressure in the air, which can be moved and modulated to create different sensations. Mid-air haptics can also be combined with other modalities, such as vision and sound, to create rich multisensory feedback. The potential applications of this technology are numerous, such as enhancing virtual and augmented reality, creating touchless interfaces, and providing novel forms of entertainment and education.
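
As a rough sketch of the focusing principle (not Ultraleap's or anyone else's actual implementation), the snippet below computes the phase advance each transducer in a small flat array would need so that its wave arrives in step with all the others at the chosen focal point; the 40 kHz carrier frequency and the array geometry are assumptions for illustration:

```kotlin
import kotlin.math.PI
import kotlin.math.sqrt

data class Point3(val x: Double, val y: Double, val z: Double)

const val SPEED_OF_SOUND = 343.0   // m/s in air at roughly 20 °C
const val CARRIER_FREQ = 40_000.0  // Hz; 40 kHz assumed, common for airborne ultrasound transducers

fun distance(a: Point3, b: Point3): Double {
    val dx = a.x - b.x
    val dy = a.y - b.y
    val dz = a.z - b.z
    return sqrt(dx * dx + dy * dy + dz * dz)
}

// Phase advance (radians, in [0, 2*PI)) to apply to each transducer's drive signal
// so that every wave arrives at the focal point in phase, adding up into a spot of
// high acoustic pressure that the skin can feel.
fun focusingPhases(transducers: List<Point3>, focus: Point3): List<Double> {
    val wavelength = SPEED_OF_SOUND / CARRIER_FREQ  // about 8.6 mm at 40 kHz
    return transducers.map { t ->
        val path = distance(t, focus)               // metres from emitter to focal point
        val lag = 2 * PI * (path / wavelength)      // phase the wave loses travelling that path
        lag % (2 * PI)                              // lead the drive by the same amount
    }
}

fun main() {
    // Toy 4 x 4 flat array with 10 mm pitch, focusing 20 cm above its centre.
    val pitch = 0.010
    val array = (0 until 4).flatMap { i -> (0 until 4).map { j -> Point3(i * pitch, j * pitch, 0.0) } }
    val focus = Point3(1.5 * pitch, 1.5 * pitch, 0.20)
    focusingPhases(array, focus).forEachIndexed { k, phase ->
        println("transducer %2d -> phase %.2f rad".format(k, phase))
    }
}
```

Only the relative phases matter, so steering the focal point amounts to recomputing these values on the fly; the ultrasound is then typically modulated at a much lower rate, on the order of a couple of hundred hertz, so that the skin's mechanoreceptors can actually perceive the pressure point.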

Who’s currently working on mid-air haptics?

One of the leading companies in mid-air haptics is Ultraleap, which was founded in 2013 as Ultrahaptics by Tom Carter, a PhD student at the University of Bristol. Ultraleap combines hand tracking and mid-air haptics to enable natural and intuitive interaction with digital content. It has developed several products and platforms, such as STRATOS, Gemini, and TouchFree, that are used by various industries and partners, such as DS Automobiles, Ocean Outdoor, Aquarium of the Pacific, and Qualcomm. Ultrahaptics became Ultraleap after it acquired Leap Motion, a company that specialises in hand-tracking technology, in 2019.

Another company working on mid-air haptics is Hosiden, a Japanese manufacturer of electronic components and devices. Hosiden has partnered with Ultraleap to bring mid-air haptics to the cars of the future, using Ultraleap’s technology to build touchless infotainment interfaces that let drivers and passengers control functions such as music volume, temperature, and navigation.

How can mid-air haptics enhance mixed/extended reality?

When Apple unveiled the Vision Pro headset at WWDC amid much fanfare, it demonstrated a slew of capabilities that had arguably never been seen on any headset before. The headset offered functionalities such as smartphone apps, calling, and more, delivered directly to the eyes at a stunning resolution. It seemed that Apple had left no stone unturned. However, there was one glaring omission – haptics. To the disappointment of some, the Vision Pro did not support haptics in any way, mainly because it did not use controllers. Thus, beyond visual feedback, there was no way to physically feel the elements one interacted with or the games one played.

While Apple doesn’t seem to have any plans in this direction, mid-air haptics can play an important role in creating highly immersive experiences in the future across realities – whether it’s virtual, augmented, or mixed. For example, the technology can enable users to feel virtual objects, menus, buttons, and feedback in MR/XR environments. It can also create realistic sensations of wind, rain, and other natural phenomena in MR/XR scenarios.

But perhaps the biggest advantage is that mid-air haptics does not require any wearable or graspable devices, which could be crucial for making MR/XR headsets less cumbersome. It can also reduce the need for controllers or gloves, which limit the naturalness and freedom of movement.

Applications of mid-air haptics

With mid-air haptics, users can explore a wide range of applications in MR/XR. They can learn and practise skills in a safe and realistic way, such as performing surgery, playing musical instruments, or painting. They can also enjoy more immersive and engaging games and movies, where they can feel the actions and emotions of the characters more vividly. Moreover, they can communicate and interact with others in a more natural and expressive way, such as by shaking hands, hugging, or high-fiving.

Other potential applications include:

Education: Mid-air haptics can facilitate learning and exploration, by enabling students to feel and manipulate abstract concepts and phenomena, such as molecular structures, sound waves, and magnetic fields.

Healthcare: Mid-air haptics can improve the quality and safety of medical procedures, by providing haptic guidance and feedback for surgeons, nurses, and patients, such as palpation, needle insertion, and wound dressing.

Accessibility: Mid-air haptics can assist people with visual impairments, by providing spatial and directional cues, such as navigation, obstacle detection, and object recognition.

By creating sensations in the air, mid-air haptics can bridge the gap between the virtual and the real world, and open up new possibilities for the future of MR/XR.


