Karen Liu, a specialist in computer animation, discusses her rapidly evolving field, physics-based simulation, and how it is helping robots become more physically aware of the world around them.
Stanford’s Karen Liu is a computer scientist who works in robotics.
She hopes that someday machines might take on caregiving roles, like helping medical patients get dressed and undressed each day. That quest has given her special insight into just how monumental a challenge such seemingly simple tasks pose. After all, she points out, it takes a human child several years to learn to dress themselves — imagine, then, what it takes to teach a robot to help a person who is frail or physically compromised.
Liu is among a growing coterie of scientists who are promoting “physics-based simulations” to speed up the learning process for robots. Rather than building actual robots and refining them as they go, she uses computer simulations to improve how robots sense the physical world around them and make intelligent decisions amid real-world changes and perturbations, such as those involved in getting dressed for the day.
To do that, a robot must understand the physical characteristics of human flesh and bone, as well as a person’s movements and underlying intentions, so it can recognize when a garment is or is not going on as expected.
The stakes are high: a misstep could mean physical harm to the patient, as Liu tells Stanford Engineering’s The Future of Everything podcast, hosted by bioengineer and Stanford Institute for Human-Centered Artificial Intelligence Associate Director Russ Altman.