
Meta Outlines Its Latest Advances in Wrist-Controller Inputs for AR and VR Interaction


Meta’s provided a glimpse into the future of digital interaction via wrist-based control, which is likely to form a key part of its coming AR and VR expansions.

Meta’s been working for some time on a wrist controller that relies on surface electromyography (sEMG) to detect muscle signals at the wrist and translate them into digital commands. Now, it’s published a new research paper in Nature which outlines its latest advances on this front.

Which could be the foundation of the next stage.

Meta EMG controls

As explained by Meta:

“Our teams have developed advanced machine learning models that are able to transform neural signals controlling muscles at the wrist into commands that drive people’s interactions with [AR] glasses, eliminating the need for traditional – and more cumbersome – forms of input.”

Those “more cumbersome” methods include keyboards, mice and touchscreens, the current main forms of digital interaction, which Meta says can be limiting, “especially in on-the-go scenarios.” Gesture-based systems that use cameras or inertial sensors can also be restrictive, due to the potential for disruptions within their field of view, while “brain–computer or neuromotor” interfaces that can be enabled via sensors detecting brain activity are also often invasive, or require large-scale, complex systems to activate.

EMG control requires little disruption, and aligns with your body’s natural movement and behaviors in a subtle way.

Which is why Meta’s now looking to incorporate this into its AR system:

“You can type and send messages without a keyboard, navigate a menu without a mouse, and see the world around you as you engage with digital content without having to look down at your phone.”

Meta says that its latest EMG controller recognizes your intent to perform a variety of gestures, “like tapping, swiping, and pinching – all with your hand resting comfortably at your side.”
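
To make that idea concrete, here’s a minimal, hypothetical sketch in Python of what turning a window of multichannel sEMG into one of those gesture events might look like. The channel count, window size, RMS features, and linear classifier are all illustrative assumptions; Meta’s actual system uses large machine learning models trained on data from many participants.

```python
# A minimal, hypothetical sketch: classifying one window of multichannel
# surface-EMG (sEMG) into a discrete gesture event. The channel count, window
# size, RMS features, and linear classifier are illustrative assumptions only.
import numpy as np

N_CHANNELS = 16   # assumed number of electrodes around the wrist
WINDOW = 200      # assumed samples per decoding window
GESTURES = ["rest", "tap", "swipe", "pinch"]

def featurize(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel, a classic simple sEMG feature."""
    return np.sqrt((window ** 2).mean(axis=1))

def decode(window: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> str:
    """Linear classifier over RMS features (a stand-in for a trained network)."""
    logits = featurize(window) @ weights + bias
    return GESTURES[int(np.argmax(logits))]

# Toy usage with random parameters; real weights would come from training.
rng = np.random.default_rng(0)
emg = rng.standard_normal((N_CHANNELS, WINDOW))
weights = rng.standard_normal((N_CHANNELS, len(GESTURES)))
bias = np.zeros(len(GESTURES))
print(decode(emg, weights, bias))
```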

The device can also recognize handwriting movements, translating them directly into text.
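
As a rough illustration of how that kind of decoding can work, here’s a hedged Python sketch in the style of greedy CTC decoding, a common approach for transcribing continuous handwriting or speech: per-frame character probabilities are collapsed by merging repeats and dropping “blank” frames. Whether Meta’s handwriting model uses exactly this scheme is an assumption, and the vocabulary here is a toy.

```python
# Hedged sketch: collapsing per-frame character scores into text, in the style
# of greedy CTC decoding. That Meta's handwriting model works exactly this way
# is an assumption; the vocabulary and inputs below are toy examples.
import numpy as np

BLANK = 0
VOCAB = ["_", "h", "e", "l", "o"]  # index 0 is the CTC blank (assumed vocab)

def greedy_ctc_decode(frame_logits: np.ndarray) -> str:
    """Pick the best symbol per frame, merge repeats, then drop blanks."""
    best = frame_logits.argmax(axis=1)
    out, prev = [], None
    for idx in best:
        if idx != prev and idx != BLANK:
            out.append(VOCAB[idx])
        prev = idx
    return "".join(out)

# Toy frame-level scores spelling "hello" (blanks separate the repeated "l").
T, V = 9, len(VOCAB)
logits = np.full((T, V), -1.0)
for t, idx in enumerate([1, 2, 0, 3, 3, 0, 3, 4, 0]):  # h e _ l l _ l o _
    logits[t, idx] = 1.0
print(greedy_ctc_decode(logits))  # -> "hello"
```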

And its latest model has produced solid results:

“The sEMG decoding models performed well across people without person-specific training or calibration. In open-loop (offline) evaluation, our sEMG-RD platform achieved greater than 90% classification accuracy for held-out participants in handwriting and gesture detection, and an error of less than 13° s⁻¹ on wrist angle velocity decoding […] To our knowledge, this is the highest level of cross-participant performance achieved by a neuromotor interface.”
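
The “held-out participants” phrasing describes cross-participant generalization: the model is scored on people whose data never appeared in training. A minimal sketch of that protocol (leave-one-subject-out, with a deliberately simple nearest-centroid stand-in model and synthetic data) might look like this:

```python
# Sketch of held-out-participant (leave-one-subject-out) evaluation: the model
# is trained on all participants except one, then scored on the excluded one.
# Data, features, and the nearest-centroid "model" are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
participants = {
    f"p{i}": (rng.standard_normal((50, 8)),   # 50 windows x 8 features
              rng.integers(0, 4, 50))         # 4 gesture classes
    for i in range(5)
}

def train(X, y):
    """Per-class mean features: a nearest-centroid stand-in for a real model."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def accuracy(model, X, y):
    classes = np.array(sorted(model))
    centroids = np.stack([model[c] for c in classes])
    dists = ((X[:, None, :] - centroids) ** 2).sum(axis=-1)
    return (classes[dists.argmin(axis=1)] == y).mean()

for held_out in participants:
    X_tr = np.vstack([X for p, (X, _) in participants.items() if p != held_out])
    y_tr = np.hstack([y for p, (_, y) in participants.items() if p != held_out])
    X_te, y_te = participants[held_out]
    acc = accuracy(train(X_tr, y_tr), X_te, y_te)
    print(f"{held_out} held-out accuracy: {acc:.1%}")
```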

To be clear, Meta is still developing its AR glasses, and there’s no concrete information on exactly how their controls will work. But it increasingly seems like a wrist-based controller will be part of the package when Meta moves to the next stage of its AR glasses project.

The current plan is for Meta to begin selling its AR glasses to consumers in 2027, when it’s confident that it will be able to create wearable, fashionable AR glasses for a reasonable price.

And with wrist control enabled, that could change the way that we interact with the digital world, and spark a whole new age of online engagement.

Indeed, Meta CEO Mark Zuckerberg has repeatedly noted that smart glasses will eventually overtake smartphones as the key interactive surface.

So get ready to keep an eye out for recording lights on people’s glasses, as their hands twitch at their sides, because that increasingly looks to be where we’re headed with the next stage of wearable development.
