Ambient Intelligence, Human Impact | Stanford HAI

News

Date: May 07, 2025
Topics: Healthcare, Computer Vision

Ehsan Adeli is an assistant professor of psychiatry and behavioral sciences at Stanford.

Health care providers struggle to catch early signals of cognitive decline. The innovative computer vision tools of AI researcher and computational neuroscientist Ehsan Adeli may offer a solution.

Inside eight apartments at a housing community for seniors in Yuma, Arizona, a camera smaller than a sticky note sits on a shelf in the living room. With the consent of residents, it captures their movements, behaviors, and facial expressions throughout the day. On the back end, an algorithm monitors the footage for troubling changes: Have they started watching television for 10 hours straight? Wobbling as they walk? Frowning and waving their hands more often?
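The back-end monitoring described above can be illustrated with a minimal sketch: compare each day's behavior metrics against a resident's own baseline and flag metrics that deviate sharply. The metric names (`tv_hours`, `gait_sway_cm`, `frown_events`), the threshold, and the data below are hypothetical illustrations, not the team's actual system.

```python
from statistics import mean, stdev

def flag_changes(history, today, z_threshold=2.0):
    """Flag today's metrics that deviate sharply from a resident's
    own recent baseline (a simple per-metric z-score test)."""
    flags = []
    for metric, value in today.items():
        baseline = [day[metric] for day in history]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(value - mu) / sigma > z_threshold:
            flags.append(metric)
    return flags

# Hypothetical per-day metrics for one resident over a baseline week
history = [
    {"tv_hours": 3.5, "gait_sway_cm": 2.1, "frown_events": 4},
    {"tv_hours": 4.0, "gait_sway_cm": 2.3, "frown_events": 5},
    {"tv_hours": 3.2, "gait_sway_cm": 2.0, "frown_events": 3},
    {"tv_hours": 3.8, "gait_sway_cm": 2.2, "frown_events": 6},
    {"tv_hours": 3.6, "gait_sway_cm": 2.4, "frown_events": 4},
]
today = {"tv_hours": 10.0, "gait_sway_cm": 2.2, "frown_events": 5}
print(flag_changes(history, today))  # → ['tv_hours']
```

A real system would of course derive such metrics from video with trained models and use far more robust statistics; the point here is only the pattern of comparing a person against their own history rather than a population norm.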

Behaviors like these—which can indicate memory loss, depression, mobility challenges, or irritability—are among the early signs of cognitive decline, which can lead to dementia. But they’re often invisible to health care providers, who base diagnoses on self-reported questionnaires, brief assessments during rare visits, and reports from caregivers who may not notice subtle shifts. 

With the experiment in Yuma, Ehsan Adeli, an assistant professor of psychiatry and behavioral sciences at Stanford, hopes to change that. His pioneering research uses computer vision to analyze patients’ movements in videos, from daily activities to minute gestures, in order to flag worrisome symptoms for clinicians. Catching those symptoms early can allow for interventions and support that would otherwise have to wait until the disease is further along.

“Our hope is that this will potentially revolutionize the early diagnosis of cognitive decline, Alzheimer’s disease, and related dementias,” Adeli says.

Among the model’s abilities: recognizing actions such as adjusting the bed, or putting slipper socks and compression sleeves on the patient. Courtesy of Ehsan Adeli and the Partnership in AI-Assisted Care team.

Adeli’s work is part of a growing field called ambient intelligence, which embeds sensors in everyday environments and uses artificial intelligence to interpret the data. His first related project, an ongoing collaboration with the Clinical Excellence Research Center (CERC), uses computer vision, a type of AI, to analyze videos of patient interactions in order to improve care at Stanford Hospital.

Ambient intelligence is also a leading priority for the Stanford Institute for Human-Centered Artificial Intelligence (HAI), which partially supports Adeli’s research. As an HAI-affiliated faculty member, Adeli shares the institute’s vision of a future in which technology augments human potential and enriches lives, ultimately contributing to a more sustainable and compassionate society.

Adeli’s research with seniors brings his innovative approach into homes for the first time. What are known as neuropsychiatric symptoms—such as mood changes, confusion, and wandering—are proven early predictors of cognitive problems. Sensors commonly used in health care, such as sleep and temperature monitors, can’t track these effectively. Wearable sensors are too onerous and can result in missing data if patients remove them.

“That’s why the type of technology we are developing—specifically, computer vision—is key,” Adeli says. “There are few instances of using passive camera data to understand behavior, let alone relating them to clinical outcomes.”

“Our hope is that this will potentially revolutionize the early diagnosis of cognitive decline, Alzheimer’s disease, and related dementias.”

— Ehsan Adeli

(Photo by Jess Alvarenga)

Adeli and his collaborators began conceiving the technology two years ago with support from, among others, the Jaswa Innovator Award for Early Career Innovators from the Department of Psychiatry and Behavioral Sciences. 

With input from practitioners, Adeli is building personalized dashboards that track patients’ key behaviors over time, allowing doctors to notice gradual changes. 
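A dashboard that surfaces gradual change needs a trend signal rather than a single-day alarm. One minimal, hypothetical way to express that is a least-squares slope fitted to weekly averages of a tracked behavior; the metric and numbers below are invented for illustration:

```python
def weekly_trend(values):
    """Least-squares slope of a metric across consecutive weeks:
    a simple gradual-change signal a dashboard might surface."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Hypothetical weekly averages of daily walking minutes
walking_minutes = [42, 41, 39, 37, 34, 31]
slope = weekly_trend(walking_minutes)
print(round(slope, 2))  # clearly negative slope (≈ -2.23): activity is declining
```

A clinician scanning the dashboard would see the downward trend at a glance, even though no single week looks alarming on its own, which is exactly the kind of gradual shift the article says caregivers tend to miss.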

“This would be a kind of contactless vital sign monitoring for human behavior,” he says. “If we can detect these signs early on, medications and behavioral therapies could be used to delay adverse effects and have prolonged, higher quality of life.”

Once Adeli and his team establish the feasibility of the technology, they hope to launch a clinical trial that compares its efficacy to reigning methods of diagnosing cognitive decline. With support from Stanford’s psychiatry department, Adeli has built a “Living Lab,” which resembles a typical living room and will be equipped with more than 20 contactless sensors. His goal is to find sensors that can supplement cameras in people’s homes while maintaining privacy. For example, sensors in a mattress could track sleep patterns, or those embedded in a bathroom floor could capture movement. 

Ambient intelligence tools will soon be commonplace in communal areas of senior homes, predicts Bryan Ziebart, president of Insight Living, which manages the Yuma facility. He believes that quality of life and health outcomes will improve as a result.

“If you have 100 residents, you don’t have 100 caregivers,” he says. “Leveraging computer vision around affect, gait, and emotional state will be a core part of how communities operate in the future.” 

Along with HAI collaborators James Landay and Fei-Fei Li and CERC collaborators Arnold Milstein and Vankee Lin, Adeli is launching a complementary pilot project with the National University of Singapore focused on detecting neuropsychiatric symptoms as precursors of dementia. His team, which includes Sarah Billington, is also testing and designing ambient intelligence tools for general senior care that go beyond tracking cognitive decline.

For Adeli, the issue is personal. Years ago, he witnessed his wife’s grandmother, who suffered from dementia, gradually lose her ability to perform daily activities. 

“It was heartbreaking, and that’s partly why I am passionate about the technology we are developing,” he says. “I truly hope it can help millions of families like mine—offering early detection, timely intervention, and ultimately, a chance to preserve the health and independence of their loved ones.”

Ehsan Adeli is an assistant professor of psychiatry and behavioral sciences and, by courtesy, of computer science at Stanford. He directs the Stanford Translational AI Lab.

This story was first published on Stanford Momentum.



Related News

From Privacy to ‘Glass Box’ AI, Stanford Students Are Targeting Real-World Problems
Nikki Goth Itoi
Feb 27, 2026
News

An Amazon-backed fellowship will support 10 Stanford PhD students whose work explores everything from how we communicate to understanding disease and protecting our data.


America's 250 Greatest Innovators: Celebrating The American Dream
Forbes
Feb 11, 2026
Media Mention

HAI Co-Director Fei-Fei Li named one of America's top 250 greatest innovators, alongside fellow Stanford affiliates Rodney Brooks, Carolyn Bertozzi, Daphne Koller, and Andrew Ng.


AI Can’t Do Physics Well – And That’s a Roadblock to Autonomy
Andrew Myers
Jan 26, 2026
News

QuantiPhy is a new benchmark and training framework that evaluates whether AI can numerically reason about physical properties in video images. QuantiPhy reveals that today’s models struggle with basic estimates of size, speed, and distance but offers a way forward.
