Meta is testing a feature that could be a major improvement to its Quest VR headsets: the ability to navigate and tap on virtual objects with just your hands, no controllers required. The idea is that you can use just your fingers in the air to do things you already know from your smartphone, like swiping up and down a page, tapping a button to activate it, or typing on an onscreen keyboard.
The new experimental feature, called “Direct Touch,” arrives with the Quest v50 software update, which is rolling out now. When the update finally reached my headset after a few weeks of waiting, I turned the feature on right away.
With hand tracking enabled, the Quest 2 uses its outward-facing cameras to track your hands, which appear as black shadows inside the headset. (CEO Mark Zuckerberg’s Direct Touch video, which appears to have been captured on a Quest Pro, shows more hand and arm detail.) You use those shadows to judge when your hand will “touch” a menu or window floating in front of you, and when you make “contact” with something via Direct Touch, it lights up or begins to scroll. Scrolling is choppy, but it generally responds better than I expected.
Direct Touch typing, on the other hand, stinks. When you tap on a text-input field in the UI, the Quest onscreen keyboard appears beneath the window, and you can “press” individual keys to type. But with nothing to rest your hands or fingers on, it’s hard to tell where or what you’re actually typing. (Picture the iPad’s onscreen keyboard, but without any feedback, and without the glass.) Even as I hopelessly hunt and peck my way through a single word, the UI sometimes decides I tapped a different key than the one I meant. Thankfully, the keyboard suggests words as you type, which occasionally helps.
Between the bad typing and the good scrolling, the Quest web browser may be the best showcase for the Direct Touch controls. If I mistype a search, the search engine will usually correct me, and both swiping up and down and tapping on links work fairly well. For some reason, The Verge’s homepage in the Quest browser won’t scroll past our list of Top Stories, but tapping on any of the six stories I can see works better than I expected.
Most of the built-in Quest apps I tried were at least usable with Direct Touch. But many apps from the Quest Store, including Meta’s own Horizon Worlds VR social network, haven’t been updated to work with just your hands; they wouldn’t even open without a controller. I wasn’t expecting apps like Beat Saber to get better now that I didn’t have a controller, but I wanted to at least be able to poke around in them.
As it stands, it’s clear why Direct Touch is labeled an experiment. With every midair poke, I can’t quite trust that my hand will actually “touch” the part of the Quest’s UI I’m aiming for, which makes using it for more than a few minutes at a time tiresome. Holding my arms out in midair just to navigate the interface gets old after a while, too. Meta’s other controller-free hand gestures, which involve pinching, are generally more reliable, though I find them less intuitive.
That said, I still think the concept of Direct Touch is cool. Even when none of my taps land the way I expect and my words per minute drop by 99 percent, scrolling and tapping on virtual surfaces in my VR headset makes me feel like I’m living out some sort of sci-fi fantasy. And when Direct Touch works as intended, using my hands is much more convenient than reaching for the Quest’s controllers. I realize that’s a big caveat, but being able to put on the headset and scroll through things with just my hands removes a lot of the friction of using the Quest. (Even so, I make sure the controllers are close by, because Direct Touch is that finicky.)
It’s also easy to imagine where this technology could go, especially if Meta’s still-to-come AR glasses materialize. When you’re wearing glasses in public, you probably won’t want to carry a controller or two when you could just use your hands. Apple is plausibly exploring these kinds of interactions, too, since the company’s long-rumored mixed reality headset is reported to let users type on onscreen keyboards and, like Meta’s hardware, be controlled with hands in the air.
For now, I’ll mostly keep using the Quest’s controllers. But if I just need to quickly check something on my headset, I might leave the controllers down and try to do it with my hands. Even if it takes three times as long, it’s a lot cooler.