Artificial Intelligence

Spray-on smart skin uses AI to interpret hand movements


Scientists have devised a new material that can be sprayed onto the back of human hands to track their movements, according to a new study published in the journal Nature Electronics.

The researchers have built a prototype that recognises simple objects by touch and can even do predictive two-handed typing on an invisible keyboard.

A tiny electrical network senses the stretching and bending of the skin as the hand moves, and artificial intelligence (AI) then interprets those signals to identify the movements.

The algorithm was able to type “No legacy is so rich as honesty”, a line from William Shakespeare, and “I am the master of my fate, I am the captain of my soul”, from William Ernest Henley’s poem “Invictus”.

The researchers say this technology could have applications in fields ranging from gaming and sports to telemedicine and robotics.


Read more: Language was born in the hands.


Electronic devices that can identify the movement and intended tasks of the human hand already exist. But these are often bulky and require large amounts of data to be collected for each user and task to train the algorithm. Unsurprisingly, widespread adoption has been limited.

Now, researchers have created a sprayable electrically sensitive mesh network – made up of millions of silver nanowires coated in gold and embedded in polyurethane – which conforms to the wrinkles and folds of the human finger and stays on unless washed away with soap and water.

“As the fingers bend and twist, the nanowires in the mesh get squeezed together and stretched apart, changing the electrical conductivity of the mesh. These changes can be analysed to tell us precisely how a hand or a finger or a joint is moving,” explains senior author Zhenan Bao, Professor of Chemical Engineering at Stanford University in the US.

A lightweight Bluetooth module is also attached to wirelessly transmit the signal changes, which machine learning then interprets.

The changing patterns in conductivity are mapped to specific tasks and gestures, such as typing the letter X on a keyboard, and the algorithm learns to recognise them. Once the algorithm is suitably trained, the physical keyboard is no longer needed: the hand movements alone are enough.
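The idea of mapping conductivity patterns to gesture labels can be sketched in a few lines. This is a hypothetical illustration only: the data, gesture names, and the nearest-centroid classifier are all stand-ins, not the neural network the Stanford team actually trained on nanomesh signals.

```python
# Hypothetical sketch: classifying a window of conductivity readings by
# comparing it against per-gesture "centroid" patterns learned from examples.

def centroid(windows):
    """Average several signal windows (lists of conductivity readings)."""
    n = len(windows)
    return [sum(vals) / n for vals in zip(*windows)]

def classify(window, centroids):
    """Return the gesture label whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(window, centroids[label]))

# Toy training data: conductivity windows recorded while typing two letters.
training = {
    "X": [[0.9, 0.1, 0.4], [1.0, 0.2, 0.5]],
    "O": [[0.2, 0.8, 0.1], [0.1, 0.9, 0.2]],
}
centroids = {label: centroid(ws) for label, ws in training.items()}

print(classify([0.95, 0.15, 0.45], centroids))  # → X
```

Once the centroids are learned, classification needs only the signal window itself, which is why the physical keyboard can be discarded after training.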


Read more: Promise and problems: how do we implement artificial intelligence in clinical settings and ensure patient safety?


The machine learning scheme is also more computationally efficient than existing technologies.

“We brought the aspects of human learning that rapidly adapt to tasks with only a handful of trials, known as ‘meta-learning.’ This allows the device to rapidly recognise arbitrary new hand tasks and users with a few quick trials,” says first author Kyun Kyu “Richard” Kim, a post-doctoral scholar in Bao’s lab.
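The core of the meta-learning idea, rapid adaptation to a new user from only a few trials, can be sketched as re-estimating per-gesture prototypes from a handful of calibration samples. Everything below is assumed for illustration (the gesture names, the two-feature signals, and the prototype scheme); the paper's actual meta-learning method is more sophisticated.

```python
# Hypothetical sketch: adapting to a new user with a few quick trials by
# building per-gesture prototypes from that user's own calibration samples
# (a prototypical-network-style simplification of meta-learning).

def adapt(few_shot_samples):
    """Build per-gesture prototypes from a few labelled trials per gesture."""
    prototypes = {}
    for label, windows in few_shot_samples.items():
        n = len(windows)
        prototypes[label] = [sum(v) / n for v in zip(*windows)]
    return prototypes

def predict(window, prototypes):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda lab: dist(window, prototypes[lab]))

# A new user supplies only two quick trials per gesture.
new_user = {
    "pinch": [[0.8, 0.2], [0.9, 0.3]],
    "fist":  [[0.1, 0.9], [0.2, 1.0]],
}
protos = adapt(new_user)
print(predict([0.85, 0.25], protos))  # → pinch
```

Because adaptation only re-estimates prototypes rather than retraining a full model, a handful of trials per gesture suffices, which mirrors the efficiency claim above.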

Diagram showing the device on the hand and forearm.
Spray-on sensory system consisting of printed, biocompatible nanomesh directly connected to a wireless Bluetooth module and further trained through meta-learning. Credit: Kyun Kyu “Richard” Kim, Bao Group, Stanford U.

“Moreover, it’s a surprisingly simple approach to this complex challenge that means we can achieve faster computational processing time with less data because our nanomesh captures subtle details in its signals,” Kim adds.

Because it is sprayed on, the device can conform to a hand of any size or shape, and could be used to recognise sign language or even to identify objects by tracing their surfaces by hand. In the future, it could also potentially be adapted to the face to capture subtle emotional cues, which may enable new approaches to computer animation or even virtual meetings.




