Meta Is Creating A Real-Life Glove Capable Of Holding Virtual Objects and Performing ‘Emoji Handshakes’

Meta, formerly Facebook, is working on a glove that will allow users to interact with virtual reality objects.

Although the company says the work is still in the “early stages of research,” the haptic gloves appear capable of recreating the sensations of texture, pressure, and vibration when interacting with a digital environment.

Meta’s ultimate goal is to pair the gloves with a virtual reality headset to simulate experiences such as playing in a concert or sitting at a poker table, and the gloves may also work with augmented reality glasses in the future.

Facebook has been working on its smart eyewear, Project Aria, for some time, but it appears that the debut date has been pushed back into 2021.

“Hands have enormous significance in tackling the interaction challenge in AR and VR,” said Sean Keller, research director of Meta Reality Labs.

“We use our hands to communicate with one another, to learn about the world, and to act in it. If we can bring full hand presence into AR and VR, we can take advantage of a lifetime of motor learning.

“Without having to learn a new method of engaging with the world, people could touch, feel, and manipulate virtual objects just like real objects.”

Meta will have to use a combination of aural, visual, and tactile cues to fool the brain into believing the virtual environment is genuine.

The gloves will need to be “stylish, comfortable, inexpensive, durable, and totally adjustable,” according to Meta. That is a difficult set of requirements to meet in practice.

The gloves are made up of hundreds of tiny actuators – small motors – that must work in unison, but today’s actuators are too large, expensive, and hot to be practical. Soft actuators that change shape as the wearer moves could in theory replace them, but they do not yet exist.
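
To give a rough sense of the coordination problem, here is a minimal sketch in Python of how per-fingertip pressure targets might be fanned out to an array of actuators on each control tick. The `Actuator` class, the weighting, and the update rule are hypothetical illustrations, not based on any Meta hardware or API.

```python
from dataclasses import dataclass

@dataclass
class Actuator:
    """One tiny motor (or soft air pocket) at a known spot on the glove."""
    finger: str          # e.g. "index"
    position: float      # 0.0 = fingertip, 1.0 = base of the finger
    level: float = 0.0   # current drive level, 0.0-1.0

def update_glove(actuators, pressure_targets):
    """Drive every actuator toward the pressure requested for its finger.

    pressure_targets maps finger name -> desired pressure (0.0-1.0),
    as computed by the contact/physics simulation for the current frame.
    """
    for act in actuators:
        target = pressure_targets.get(act.finger, 0.0)
        # Concentrate force near the fingertip, where contact is felt most.
        weight = 1.0 - 0.5 * act.position
        # Move gradually toward the target so the feedback feels smooth.
        act.level += 0.2 * (target * weight - act.level)

# Example: simulate gripping a virtual cube between thumb and index finger.
glove = [Actuator("index", p / 4) for p in range(5)] + \
        [Actuator("thumb", p / 4) for p in range(5)]
for _ in range(30):  # a few control ticks
    update_glove(glove, {"index": 0.8, "thumb": 0.8})
print({f"{a.finger}@{a.position:.2f}": round(a.level, 2) for a in glove})
```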

The company is still researching and prototyping the technology, with a focus on weight and speed, and is building the software needed to accurately simulate real-world physics.

“If I pick up a cube, I already have ideas about what kind of material it is and how hefty it might be,” explains Meta researcher Jess Hartcher-O’Brien.

“I grasp it, I verify the material, so I’m integrating visual cues about its material properties with haptic feedback from the first impact. When I try to manage the thing, my brain detects frictional forces and inertia, allowing me to determine how dense or heavy it is. My visual system is updating itself in response to the movement of my arm. Proprioception tells me where my arm is in space, how fast it’s moving, and how my muscles are working.”

This is also where technology like hand tracking, already built into Oculus headsets, comes in, supplying the positional data needed to deliver feedback to precisely the right spot on the hand. In the future, Meta might pair virtual buttons with a “haptic click,” or offer “haptic emoji handshakes” when users greet people they know in the metaverse.
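
As a rough illustration of how hand-tracking data could drive that kind of feedback, the sketch below checks whether a tracked fingertip has pushed past a virtual button’s surface and, if so, fires a single vibration pulse. The function name, coordinates, and thresholds are hypothetical and are not part of any Oculus SDK.

```python
# Hypothetical sketch: turning tracked fingertip positions into a "haptic click".
# Coordinates are in metres in the headset's tracking space.

BUTTON_SURFACE_Z = 0.40   # depth of the virtual button's face
CLICK_TRAVEL = 0.005      # how far past the surface counts as a press (5 mm)

def haptic_click_needed(fingertip_z, was_pressed):
    """Return (pulse, pressed) for one tracking frame.

    pulse is True exactly once, on the frame the fingertip crosses the
    click threshold, so the glove can play a short, sharp vibration.
    """
    pressed = fingertip_z > BUTTON_SURFACE_Z + CLICK_TRAVEL
    pulse = pressed and not was_pressed
    return pulse, pressed

# Example: a fingertip moving toward, through, and back off the button.
pressed = False
for z in (0.380, 0.395, 0.402, 0.407, 0.407, 0.390):
    pulse, pressed = haptic_click_needed(z, pressed)
    if pulse:
        print(f"click! fingertip at z={z:.3f} m")
```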

Mark Zuckerberg, the CEO of Meta, has consistently championed the metaverse as Facebook’s future, especially in light of a slew of problems involving the Facebook app and its sister platforms, such as Instagram.

According to Mr. Zuckerberg, an “embodied internet” will focus on “engag[ing] more organically” with the behaviors we already have, such as reaching for our phones as soon as we wake up.

It remains to be seen whether the firm can successfully govern the metaverse, or at least its own corner of it. In a leaked document, Meta’s CTO Andrew Bosworth stated that the company’s products should have “nearly Disney standards of safety,” but that virtual reality can be a “toxic environment” that may drive “mainstream customers away totally.”

He did say, though, that controlling users’ words and behavior “at any meaningful scale” is “practically impossible.”