Date & Time:
April 8, 2021 1:00 pm – 2:00 pm
Location:
Live Stream

Designing Actuated Tangible UIs that blend the Joy of Physical Interaction, Robotic Actuation, and Digital Computation

Squishing clay between fingers, twisting string into geometric shapes, tinkering with the gears of a mechanical toy — physical objects fill our world and invite us to touch and interact with them using our hands. These tangible actions and sensations evoke a sense of joy and curiosity that stimulates our ingenuity, imagination, and creativity. However, mainstream HCI focuses on Graphical User Interfaces (GUIs), which do not provide the joy of interaction we experience with physical tools and materials, as the information is trapped behind a flat screen.

My research in Actuated Tangible User Interfaces (A-TUIs) envisions the emerging future of physical environments that incorporate dynamic actuation (e.g., shape change and motion) enabled by advanced robotic hardware, closed-loop control, and interactive computing technologies. These interfaces enrich our interactions with digital information and our physical environments through motion and transformation. My research has investigated interdisciplinary approaches across engineering, human cognition, and interaction design to advance A-TUIs’ capability to weave themselves into the fabric of everyday life. Such research includes: 

– Shape-shifting surfaces that can dynamically render material properties — from clay to liquid — in response to touch.
– Transforming robotic strings that can fit within hands, wrap around bodies, and form expressive shapes.
– Tabletop swarm robots that actuate interchangeable external mechanical shells to afford new functions, forms, and haptic outputs.

In my talk, I will discuss my research approaches and relevant projects that reconceptualize the relationship between humans and machines — a future relationship rooted in the joy of tangibility.

Host: Blase Ur

Ken Nakagaki

Assistant Professor of Computer Science

Ken Nakagaki is an interaction designer and HCI (Human-Computer Interaction) researcher from Japan. He joined the University of Chicago’s Computer Science Department as an Assistant Professor and founded the Actuated Experience Lab, or AxLab, in 2022.

His research focuses on inventing and designing novel user interface technologies that seamlessly integrate dynamic digital information and computational aids into everyday physical tools and materials. He is passionate about creating novel physically embodied experiences via these interfaces through curiosity-driven tangible prototyping processes. At AxLab, he pursues research in actuated and shape-changing user interface technologies to design the future of user experiences.

Before joining UChicago, he received his Ph.D. from the MIT Media Lab, where he was advised by Prof. Hiroshi Ishii and focused his research on Actuated Tangible User Interfaces. Ken has presented at top HCI conferences (ACM CHI, UIST, TEI) and led demonstrations of his work at international exhibitions and museums, including the Ars Electronica Festival and Laval Virtual. He has received numerous awards, including the MIT Technology Review Innovators Under 35 Japan & Asia Pacific, the Japan Media Arts Festival, and the James Dyson Award.

Related News & Events

FabRobotics: The Fusion of 3D Printing and Mobile Robots

Feb 27, 2024

High School Students In The Collegiate Scholars Program Get To Know Robots

Nov 14, 2023

Five UChicago CS students named to Siebel Scholars Class of 2024

Oct 02, 2023

UChicago Computer Scientists Design Small Backpack That Mimics Big Sensations

Sep 11, 2023

Computer Science Class Shows Students How To Successfully Create Circuit Boards Without Engineering Experience

May 17, 2023

UChicago CS Researchers Shine at CHI 2023 with 12 Papers and Multiple Awards

Apr 19, 2023

New Prototypes AeroRigUI and ThrowIO Take Spatial Interaction to New Heights – Literally

Apr 18, 2023

Computer Science Displays Catch Attention at MSI’s Annual Robot Block Party

Apr 07, 2023

UChicago, Stanford Researchers Explore How Robots and Computers Can Help Strangers Have Meaningful In-Person Conversations

Mar 29, 2023

Asst. Prof. Rana Hanocka Receives NSF Grant to Develop New AI-Driven 3D Modeling Tools

Feb 28, 2023

UChicago and NYU Research Team Finds Edtech Tools Could Pose Privacy Risks For Students

Feb 21, 2023

Assistant Professor Chenhao Tan Receives Sloan Research Fellowship

Feb 15, 2023