Tech hacks the nervous system to bring touch to virtual reality
The era of virtual reality seems to have finally arrived. VR headsets dominated the convention halls of this year’s CES. Apple’s Vision Pro and Microsoft’s HoloLens aim to blend virtual and real spaces into a single augmented reality. CEOs are waxing poetic about the metaverse’s potential to reimagine work, play, socialization, and just about everything else.
While the technology is impressive, it also focuses almost exclusively on two of our senses: sight and sound. But humans experience the world through a panoply of senses beyond our eyes and ears — some say as many as 20. One that is often neglected or absent entirely from these virtual worlds is touch.
“There is no reality to mixed or augmented reality until there’s touch,” Dustin Tyler, the chief science officer of Afference, told Freethink.
Tyler co-founded Afference with Jacob Segil, its CEO, to deepen the virtual experience by bringing realistic tactile feedback to it. Their prototype, called the Phantom, is a neural haptic device that doesn’t simulate touch with pokes or vibrations. Instead, it hacks your nervous system.
All in your head
Tyler has spent 20 years developing neurostimulation technology as a neural engineer researching prosthetics. He notes that while some advanced prosthetics can restore fine motor skills to people who have lost limbs, the experience still isn’t the same. Even when wearers see themselves, say, gripping an apple, the action can feel disconnected.
“Every person said something like, ‘I want to hold my wife’s hand.’ It’s emotional, right? It’s not only a function; it’s an actual emotional connection. Connection matters,” Tyler said.
In 2012, Tyler and his research team created an implanted sensory interface to add the sensation of touch to prosthetic fingers. Since then, they’ve continued working on ways to improve the function and feeling of prosthetics. “People would say, ‘It’s my hand grabbing that. I have my hand back,’” Tyler said. “Touch changes all of that.”
Building on Tyler’s research, Afference’s Phantom prototype aims to bring touch into VR and spatial computing. The device is a palmless, fingerless glove. It basically looks like something Billy Idol would rock during the cyberpunk apocalypse (a feature we place firmly in the “pro” column).
When the Phantom is worn, the user’s fingers pass through rings that rest on the bottom joint of each finger. It’s these rings that stimulate the sense of touch — or what Afference calls “haptic illusions” — by conducting electrical signals through the nerves. These signals convince your nervous system that it feels the object you’re interacting with in the virtual space.
Hence the name Tyler and Segil chose for their company: Afference comes from afferent, which describes nerves that carry impulses toward the central nervous system.
“We’re directly communicating with the brain,” Tyler said. “This technology allows us a lot more flexibility and opportunity. You can gain information from the virtual world. You’re not holding anything, but you can learn about an object by interrogating it with your tactile experience.”
By altering the complexity of these signals, the technology can subtly change the sensations a user experiences. It can be something as simple as the click of a button or something more complex, such as discerning the ripeness of avocados based on their firmness or squishiness. The Phantom can also render more abstract sensations, such as the beat of a virtual speaker crying “more, more, more.” The pulse of Idol’s “Rebel Yell” will even intensify or weaken based on the user’s proximity to it in the virtual space.
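Afference hasn’t published how the Phantom maps a virtual scene onto stimulation, but the proximity effect Tyler describes can be pictured as a simple distance-based scaling of a beat. The sketch below is purely illustrative: the function names, falloff curve, and numbers are assumptions for the sake of the example, not Afference’s actual design.

```python
import math

def pulse_intensity(distance_m: float, max_intensity: float = 1.0,
                    falloff_m: float = 3.0) -> float:
    """Fade intensity toward zero as virtual distance grows (assumed falloff curve)."""
    return max_intensity / (1.0 + (distance_m / falloff_m) ** 2)

def beat_envelope(t_s: float, bpm: float = 120.0) -> float:
    """A simple 0-to-1 pulse that rises and falls with the track's beat."""
    phase = (t_s * bpm / 60.0) % 1.0
    return max(0.0, math.cos(2 * math.pi * phase))

def haptic_level(t_s: float, distance_m: float) -> float:
    """Beat shaped by proximity: stronger pulses the closer the user is to the speaker."""
    return beat_envelope(t_s) * pulse_intensity(distance_m)
```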
Hooked on a feeling
At the moment, the Phantom is in the proof-of-concept phase. Afference demoed the prototype publicly at this year’s CES, where it won the Best of Innovation XR Technologies & Accessories award. The next steps include raising seed funding to build capacity and finding partners to experiment with and develop use cases.
“We think this is a game changer for making spatial computing a reality,” Tyler said. “If you really want to engage in a virtual or alternate reality, you have to feel when you touch something. That’s what we’re bringing, and without it, we don’t think [spatial computing] is going to be very successful.”
He points out that while we often focus on sight and sound, it’s touch that has made technologies more relatable in the past. Clicking a mouse made interacting with computer operating systems feel more intuitive. Vibrations in smartphones make actions like typing or rotating the screen feel natural and engaging. And if virtual and augmented realities are to become standard ways for people to work, play, and socialize, they’ll need haptic feedback reimagined to match their high-definition worlds.
“What’s amazing to me is that your brain begins to understand that you really need that physical environment,” Tyler added. “Tiny things make all the difference.”