Ken Nakagaki (MIT) - Designing Actuated Tangible UIs that blend the Joy of Physical Interaction, Robotic Actuation, and Digital Computation
- April 8, 2021 at 1:00pm - 2:00pm
- Live Stream
Speaker: Ken Nakagaki, PhD Candidate, MIT Media Lab
Ken Nakagaki is an interaction designer and HCI (Human-Computer Interaction) researcher from Japan. As a PhD Candidate in the Tangible Media Group at the MIT Media Lab, he focuses on inventing novel user interface technologies that seamlessly integrate dynamic digital information and computational aids into everyday physical tools and materials. He is passionate about creating novel physical, embodied experiences with such interfaces through curiosity-driven tangible prototyping.
Before joining the Media Lab, he received Master's and Bachelor's degrees in interaction design from Keio University. His research has been presented at top HCI conferences (ACM CHI, UIST, TEI), including nine full paper publications, two of which received an Honorable Mention Award. His work has also been demonstrated at international exhibitions and museums, including the Ars Electronica Festival and Laval Virtual. He has received numerous awards, including MIT Technology Review's Innovators Under 35 Japan, the Japan Media Arts Festival, and the James Dyson Award.
Abstract: Designing Actuated Tangible UIs that blend the Joy of Physical Interaction, Robotic Actuation, and Digital Computation
Squishing clay between fingers, twisting string into geometric shapes, tinkering with the gears of a mechanical toy -- physical objects fill our world and invite us to touch and interact with them using our hands. These tangible actions and sensations evoke a sense of joy and curiosity that stimulates our ingenuity, imagination, and creativity. However, mainstream HCI focuses on Graphical User Interfaces (GUIs), which do not provide the joy of interaction we experience with physical tools and materials, as the information is trapped behind a flat screen.
My research in Actuated Tangible User Interfaces (A-TUIs) envisions the emerging future of physical environments that incorporate dynamic actuation (e.g., shape change and motion) enabled by advanced robotic hardware, closed-loop control, and interactive computing technologies. These interfaces enrich our interactions with digital information and our physical environments through motion and transformation. My research has investigated interdisciplinary approaches across engineering, human cognition, and interaction design to advance A-TUIs’ capability to weave themselves into the fabric of everyday life. Such research includes:
- Shape-shifting surfaces that can dynamically render material properties -- from clay to liquid -- in response to touch.
- Transforming robotic strings that can fit within hands, wrap around bodies, and form expressive shapes.
- Table-top swarm robots that actuate interchangeable external mechanical shells to afford new functions, forms, and haptic outputs.
In my talk, I will discuss my research approaches and relevant projects that reconceptualize the relationship between humans and machines -- a future relationship rooted in the joy of tangibility.
Host: Blase Ur