In pursuit of more immersive technology, scientists and engineers have developed ever more realistic graphics, wearable devices, and forms of sensory feedback. The Human-Computer Integration Lab, directed by Assistant Professor Pedro Lopes, has explored many different ways of using electrical muscle stimulation (EMS) to control a user’s body, enhancing the realism of virtual and augmented reality and creating new interfaces for controlling tech – and allowing tech to control us. The group’s inventions have found new ways of manipulating arms, hands, and fingers, but one important body part has eluded their control: the head.

By controlling head movement, a device can direct its user’s point of view, create realistic sensations of motion, and enable new control mechanisms based on simple gestures such as a nod or a head shake. But head orientation is governed by a complex network of neck muscles, a tricky system to manipulate through electrodes placed on the skin. Previous devices designed to move the head have relied on bulky headgear that is both expensive and restrictive.

In a paper presented at the 2022 CHI conference, PhD student Yudai Tanaka introduced a more elegant solution that he calls “electrical head actuation” (EHA), opening up new opportunities for applications that manipulate head orientation. Tanaka’s interactive demo for the project, which used VR and EHA to simulate the G-forces of a roller coaster ride, received the People’s Choice Best Demo Award at the conference, one of the most prestigious venues in human-computer interaction.

“I was interested in not just actuating the limbs, but in how we can use these actuation techniques to guide a user’s point of view, or what they are looking at,” Tanaka said. “The dominant organs that decide our point of view are the eyes. But it’s very difficult to directly move the eye muscles, because they are beneath our skull. Instead, if you can actuate the user’s head, it might be possible to also redirect where the user is looking. That was the starting point.”

But first, Tanaka needed to figure out how to stimulate the right muscles to produce reliable head motion in different directions. The neck contains a layered system of 12 main and 6 minor muscles controlling head movement, so finding the precise placement of the EMS electrodes required many, many trials. While the laboratory was closed during the early months of the COVID-19 pandemic, Tanaka tested different electrode positions on himself for hours each day until he found the right combination.

The final setup involves placing five electrodes on the back of the neck and two on the front. The soft electrode pads are unobtrusive and easy to hide – as Tanaka and his co-authors Jun Nishida and Lopes write in the paper, “a simple turtleneck would cover our system’s EMS electrodes entirely.” Once calibrated, the EMS can move the user’s head up and down, left and right, and in combinations in between. Tanaka also developed a way to use VR/AR headsets, or even AirPods, as sensors to monitor head position and adjust the strength of the stimulation to produce the desired movement.
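The paper’s control code isn’t published, but the closed loop described above, in which a sensor reports head orientation and the system scales the stimulation until the head reaches its target, can be sketched in a few lines of Python. Everything in this sketch (the function names, gains, and left/right channel mapping) is an illustrative assumption rather than the authors’ implementation:

```python
# Illustrative sketch only: the paper does not publish its control code.
# A simple proportional controller steers head yaw toward a target by scaling
# EMS intensity on opposing neck-muscle channels. All names here
# (read_head_yaw, set_ems_intensity, gains, channel labels) are hypothetical.
import time

KP = 0.8             # proportional gain (assumed)
MAX_INTENSITY = 1.0  # normalized safety ceiling on stimulation (assumed)
TOLERANCE_DEG = 2.0  # stop stimulating once within this error band

def actuate_toward(target_yaw_deg, read_head_yaw, set_ems_intensity):
    """Drive head yaw toward target_yaw_deg, then switch stimulation off."""
    while True:
        error = target_yaw_deg - read_head_yaw()  # e.g. yaw from a headset IMU
        if abs(error) < TOLERANCE_DEG:
            set_ems_intensity("turn_left", 0.0)
            set_ems_intensity("turn_right", 0.0)
            return
        # Larger error -> stronger stimulation, capped for safety.
        intensity = min(KP * abs(error) / 90.0, MAX_INTENSITY)
        if error > 0:  # assumed sign convention: positive error means turn left
            active, idle = "turn_left", "turn_right"
        else:
            active, idle = "turn_right", "turn_left"
        set_ems_intensity(active, intensity)
        set_ems_intensity(idle, 0.0)
        time.sleep(0.02)  # roughly a 50 Hz control loop
```

Feedback from the headset closes the loop: as the head nears the target, the stimulation ramps down on its own, which is one way a calibrated system could adapt to different users and movement speeds.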

With the functionality settled, Tanaka set about creating and testing applications that use the system. In one, users take a mixed reality fire safety training course in which the head actuator directs their point of view to the location of a fire extinguisher or the emergency exit route. Others include an interface where the user controls their computer’s volume by tilting their head up or down, realistic sensations of “getting punched” while playing a VR boxing game, and synchronization of head movements between two people, which could also help them look at the same things at the same time.
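To give a sense of how lightweight these gesture interfaces can be, here is one way the head-tilt volume control might be wired up. The sensor and volume functions are assumed stand-ins, not APIs from the paper:

```python
# Hypothetical sketch of the head-tilt volume interface described above;
# read_head_pitch, get_volume, and set_volume are assumed stand-ins.
def update_volume(read_head_pitch, get_volume, set_volume,
                  threshold_deg=15.0, step=0.05):
    """Nudge the volume whenever the head pitches past a threshold."""
    pitch = read_head_pitch()  # positive = tilted up (assumed convention)
    if pitch > threshold_deg:
        set_volume(min(get_volume() + step, 1.0))   # tilt up: louder
    elif pitch < -threshold_deg:
        set_volume(max(get_volume() - step, 0.0))   # tilt down: quieter
```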

“If someone is an expert on a task that uses a large-scale interface, it might be useful to feel how that expert is looking at specific things at specific times through your neck,” Tanaka said.

While the thought of a device intervening in your head movement might be off-putting to some, Tanaka said that the response in user studies and in his conference demos (he’ll bring EHA to SIGGRAPH 2022 this summer as well) has been mostly positive.

“All our electrical muscle stimulation systems have been designed with the goal to assist users while keeping them in control, not removing control,” Tanaka said. “For instance, our systems can detect whether a user is moving on their own, and can turn off their assistance when the user moves against the actuation, allowing the user to stay in control. In fact, over the last three years, my colleague Jun and my advisor Pedro have published five papers on how to measure and preserve the user’s agency during electrical muscle stimulation – it is an area we are really passionate about because we believe that’s the direction of the HCI community: human-computer integration systems.”
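The safeguard Tanaka describes, detecting when a user moves against the actuation and yielding, could look roughly like this in spirit; the signal names and thresholds below are assumptions for illustration, not the lab’s published method:

```python
# Assumed illustration of the agency-preserving behavior described above:
# if the measured head velocity opposes the commanded direction, treat it as
# the user resisting and switch the stimulation off.
def user_is_resisting(commanded_direction, measured_yaw_velocity_deg_s,
                      resist_threshold_deg_s=10.0):
    """commanded_direction is +1 or -1; the velocity is signed sensor output."""
    opposing = commanded_direction * measured_yaw_velocity_deg_s < 0
    return opposing and abs(measured_yaw_velocity_deg_s) > resist_threshold_deg_s

def control_step(commanded_direction, measured_yaw_velocity_deg_s,
                 set_ems_intensity, intensity):
    """Apply stimulation only while the user is not pushing back against it."""
    if user_is_resisting(commanded_direction, measured_yaw_velocity_deg_s):
        set_ems_intensity(0.0)  # yield: the user keeps control
    else:
        set_ems_intensity(intensity)
```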
