Design and Create

Building and refining AI applications that enhance human capabilities


Research Mission

HAI seeks to develop new human-centered design methods and tools so that AI agents and applications can communicate with, collaborate with, and augment people more effectively, making their work better and more enjoyable. These breakthroughs will enable great progress in healthcare, education, sustainability, automation, and countless other domains.

AI has the potential to replace people in their jobs. But AI also has the potential to educate, train, and augment people, making them better at their tasks and activities. AI can improve the quality of an individual’s work, resulting in better writing, design, healthcare, communication, teaching, and art.

People are social animals; machines are not. To achieve broad acceptance, AI systems must conform to the often-implicit cultural conventions that underlie human interaction and communication. When should such systems “listen” and when should they “speak up”? If they require a shared resource, how can they balance their own needs with those of others? If humans are asked to rely on machine guidance to augment their decisions (and perhaps override their intuition), they may need to understand the strengths and weaknesses of the AI.

The advances and considerations developed in our other areas of focus, together with research in design methods, will help us create systems with more appropriate communication capabilities. This underlying research will be combined with the use of AI in important application domains, such as education, healthcare, and sustainability, where the new design methods and tools can be leveraged and evaluated.


Funded Research Projects

Dynamic Artificial Intelligence-Therapy for Autism on Google Glass

Dennis Wall, Tom Robinson and Terry Winograd

Children with autism (ASD) struggle to recognize facial expressions, make eye contact, and engage in social interactions. Wearable tools have the potential to address these challenges. Tapping into this potential, we have prototyped an AI tool for automatic facial expression recognition that runs on Google Glass through an Android app, delivering social emotion cues to children with autism as they interact with family members in their natural environment. With the HAI grant, we will refine the system, evaluate its efficacy, and ready it for deployment.
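The core loop of such a system — per-frame facial features in, an expression label and a simple social cue out — can be sketched as follows. This is a toy nearest-centroid stand-in, not the authors' actual model: the feature values, labels, and cue symbols are invented for illustration, and a real Glass prototype would derive features from a trained face-analysis model.

```python
import math

# Illustrative centroids for three expressions in a made-up 3-D feature
# space (e.g. mouth-corner lift, brow raise, eye openness). These numbers
# are assumptions for the sketch, not values from the actual system.
CENTROIDS = {
    "happy":   [0.9, 0.1, 0.8],
    "sad":     [0.1, 0.2, 0.3],
    "neutral": [0.5, 0.5, 0.5],
}

def classify_expression(features):
    """Return the expression label whose centroid is nearest (Euclidean)."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

def emotion_cue(features):
    """Map one frame's features to a short cue shown on the heads-up display."""
    return {"happy": ":)", "sad": ":(", "neutral": "--"}[classify_expression(features)]
```

In practice each video frame would be processed this way in real time, with the cue rendered unobtrusively in the wearer's peripheral display.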

Learning Haptic Feedback for Motion Guidance

Julie Walker, Andrea Zanette, Mykel Kochenderfer and Allison Okamura

Haptics is a promising method for providing guidance to users during human-machine interaction, particularly through wearable or ungrounded devices. We plan to apply modeling and reinforcement learning to optimize ungrounded and wearable haptic guidance. We hope these methods will improve the ability of humans and intelligent systems to communicate effectively during tasks such as robotic surgery, teleoperation, and collaborative object manipulation.
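The idea of learning haptic guidance policies can be illustrated with a minimal tabular Q-learning sketch. Everything here is an assumption for illustration — a 1-D position space, a simulated user who always steps in the cued direction, and hypothetical function names — whereas the actual project targets far richer human response models and continuous wearable-device signals.

```python
import random

def train_haptic_policy(n_positions=7, target=3, episodes=300,
                        alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    """Tabular Q-learning over (user position, haptic cue direction).

    The simulated user deterministically follows the cue; the reward
    favors positions close to the target. Both are toy stand-ins for a
    real human-in-the-loop response model.
    """
    rng = random.Random(seed)
    actions = [-1, +1]  # cue left, cue right
    q = {(s, a): 0.0 for s in range(n_positions) for a in actions}

    for _ in range(episodes):
        s = rng.randrange(n_positions)
        for _ in range(20):  # steps per episode
            # epsilon-greedy action selection
            if rng.random() < epsilon:
                a = rng.choice(actions)
            else:
                a = max(actions, key=lambda x: q[(s, x)])
            s2 = min(max(s + a, 0), n_positions - 1)  # user follows the cue
            r = -abs(s2 - target)                      # reward: closeness to target
            best_next = max(q[(s2, x)] for x in actions)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
            if s == target:
                break
    return q

def cue(q, s):
    """Greedy haptic cue direction at position s under the learned policy."""
    return max([-1, +1], key=lambda a: q[(s, a)])
```

After training, the greedy policy cues the user toward the target from either side — the same shape of problem the project studies, scaled down to a table.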