Google – researchers have discovered “e-textile”

22 May 2020 | By Horst Buchwald

New York, 22.5.2020

You are invited to a brainstorming session at Google. The topic: find the smartest alternative to fingers and mice for activating, controlling and using a smartphone or other device that is integrated into clothing. You can’t come up with an answer that fast? Or can you? The Google research team has found a new method – it’s called “e-textile”.

It is intended to enable users to control electronic devices by flicking or twisting the drawstrings of their hoodies. The researchers invented a smart cord that can detect six types of gesture: twisting, flicking, sliding, pinching, grabbing and patting. But that’s not all: each of these six actions can be performed at different speeds and in different directions, which opens up a wide range of possibilities.

Researcher Alex Olwal says: “Textiles have the potential to help technology fit into our everyday environments and objects by improving aesthetics, comfort and ergonomics” and continues: “Advances in materials and flexible electronics have made it possible to integrate sensors and displays into soft form factors such as jackets, clothes and blankets”.

A closer look reveals eight sensor threads woven into the smart cord, each generating its own electric field. The sensors can then detect objects, such as a user’s hand, when they disturb that field.

Different types of interaction with the textile cause different types of sensor reactions. The cord can sense proximity, but also contact area, contact time, roll and pressure; therefore the technology can distinguish between a finger and a hand gripping the braid, or between a pinch and a grasp.
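The sensing principle described above can be sketched in a few lines. This is an illustrative model, not Google's implementation: the thread count (eight) comes from the article, but the resting baseline, units and threshold are invented for the example.

```python
# Illustrative sketch (not Google's actual pipeline): eight sensing
# threads each report a capacitance-like value; a nearby hand disturbs
# a thread's field and shifts its reading away from the resting baseline.

BASELINE = [100.0] * 8   # resting reading per thread (arbitrary units, assumed)
THRESHOLD = 5.0          # minimum deviation that counts as contact (assumed)

def touched_threads(reading):
    """Return the indices of threads whose field is being disturbed."""
    return [i for i, (r, b) in enumerate(zip(reading, BASELINE))
            if abs(r - b) > THRESHOLD]

# A hand gripping the middle of the braid disturbs threads 3-5;
# the contact area (number of threads) helps distinguish a finger
# from a whole hand, as the article describes.
reading = [100, 100, 101, 88, 85, 90, 100, 100]
print(touched_threads(reading))  # [3, 4, 5]
```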

To let the cord recognize this wider range of actions, the Google researchers trained a machine-learning model. A group of 12 participants produced 864 gesture samples to feed the algorithm.

The group was deliberately given no instructions on how to perform the actions, to ensure that the model could identify specific gestures even though individual styles varied from person to person, from different hand sizes to clockwise versus counterclockwise motion. According to the team, the algorithm can now detect gestures with an accuracy of 94%.

The gestures map to intuitive results: in a video released along with the announcement, the team showed how sliding a cord connected to a speaker could change the song, how twisting it clockwise could increase the volume, and how pinching could start and stop the audio. In another example, the cord was used to scroll through a web page, with the user twisting it clockwise to scroll down and counterclockwise to scroll back up.
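The demos above amount to a mapping from recognized gestures to player actions, which can be sketched like this. The gesture names mirror the article; the player class and its methods are invented for illustration.

```python
# Hypothetical gesture-to-action mapping, mirroring the demos described
# above. The DemoPlayer API is illustrative, not Google's interface.

def handle_gesture(player, gesture, direction=None):
    if gesture == "flick":       # sliding/flicking skips to the next track
        player.next_track()
    elif gesture == "twist":     # clockwise raises volume, ccw lowers it
        player.change_volume(+5 if direction == "cw" else -5)
    elif gesture == "pinch":     # pinching toggles play/pause
        player.toggle_playback()

class DemoPlayer:
    def __init__(self):
        self.track, self.volume, self.playing = 0, 50, False
    def next_track(self):
        self.track += 1
    def change_volume(self, delta):
        self.volume = max(0, min(100, self.volume + delta))
    def toggle_playback(self):
        self.playing = not self.playing

p = DemoPlayer()
handle_gesture(p, "twist", "cw")  # volume 50 -> 55
handle_gesture(p, "pinch")        # starts playback
print(p.volume, p.playing)        # 55 True
```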

Google’s new intelligent string also includes fiber optics, so that each action has visual feedback to help the user see how a gesture leads to an action. The visual signals, the researchers say, give the technology a certain intuitiveness.

Although the technology is still in its infancy, Olwal and his team have already developed prototypes with intelligent cables. In addition to hoodie strings and web browsing, Olwal mentioned the testing of the technology for “e-textile USB-C headphones” to control media playback on a phone and an interactive cable to control intelligent speakers.

“We hope to advance textile user interfaces and inspire the use of micro-interaction for future portable interfaces and smart fabrics,” said Olwal, “where eye-free access and casual, compact and efficient input are beneficial.”

The search giant’s AI team also ran a study of how users handle the smart cord, comparing it with scrolling on a trackpad and with the remote-control buttons on headphone cables. The results showed that twisting the e-textile is faster than existing headphone button controls and comparable in speed to a touch interface, and that users tend to prefer the smart cord over headphone controls. Conventional buttons require users to find one specific spot, so pressing the wrong button is costly, whereas an e-textile cord can be manipulated anywhere along its length.

The Google research team has only mentioned building prototypes, but perhaps the technology giant will want to incorporate the technology into fashion items in the future. The company has shown interest in e-textiles since 2015, when it launched Project Jacquard to explore the potential of conductive threads for creating touch-sensitive panels in garments.
