What Your Phone Knows Could Help Scientists Understand Your Health

Stanford scientists have released an open-source platform that lets health researchers study the “screenome” – the digital traces of our daily lives – while protecting participants’ privacy.
Numerous sensors allow smartphones to silently witness everything we do, says Ian Kim, a postdoctoral fellow in psychology at Stanford University. They count each smartphone owner’s steps, measure their sleep, record where they are, log their every tap, swipe and scroll, recognize their faces, and capture screenshots of what they’re looking at as they go about their lives.
Collectively, these digital traces constitute an individual’s “screenome” – a term coined by Kim’s advisor, Nilam Ram, a professor of communication and of psychology at Stanford’s School of Humanities and Sciences.
Now, Kim, Ram, and their colleagues have released Stanford Screenomics, an open-source Android-based platform that allows health researchers to collect screenome data at scale while preserving study participants’ privacy.
As described in a recent Nature Health paper, the team hopes that knowing what people see and do on their phones will yield insights into how the digital world intersects with, reflects, and influences their physical and mental health.
Until now, working with digital trace data has raised technical and privacy challenges that demanded software engineering and infrastructure expertise, Kim says. By releasing a comprehensive, turnkey open-source platform, the team hopes to help health researchers more easily explore a range of questions about how we shape and are shaped by our digital environments, and to consider the possibility of delivering time- and context-specific interventions.
“We want to understand people’s digital lives and help them negotiate their way through these environments in beneficial ways,” Ram says.
Developing a Comprehensive Screenomics Platform
Five years ago, Ram’s team started collecting screenomic data to see what it would reveal about human behavior. They set up an app to capture screenshots and also gathered other data streams, including favorite apps, GPS location, certain words being typed, and step counts.
This work yielded several papers with insights into how smartphone use patterns relate to week-to-week fluctuations in mental health, and even to changes in the days and hours before a mental health crisis, Ram says. The team has also used screenomic data to study how switching between applications reflects the ways people create meaning for themselves; the connectedness between young adults and their parents; how exposure to nature on the screen supports well-being; and how individuals obtain information during extreme weather events.
Having shown that screenome data opens up new ways to study individuals’ daily lives, Ram invited Kim to develop a more comprehensive, open-source screenomics platform that would allow researchers to collect more than 20 different data types simultaneously. “Our goal was to create a tool that is as flexible as it is powerful,” Kim says.
The result: the Stanford Screenomics platform, which allows researchers to easily customize their projects and collect the data they need. Researchers interact with a front-end console where they can configure study settings using a drag-and-drop interface with binary (on/off) controls, monitor the progress of data collection through a dashboard, and store data securely in HIPAA-compliant databases. For study participants, a data collection app runs unobtrusively in the background as they go about their daily lives.
“Configuring the settings requires no coding or technical expertise,” Kim says. “And all of the backend infrastructure (database, server, and storage) is provisioned automatically.”
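As a rough illustration of what those binary controls amount to, a study's data-collection settings might reduce to a handful of on/off flags like the sketch below. This is a hypothetical example, not the platform's actual API: the class name ScreenomeStudyConfig, its fields, and the interval value are assumptions made for illustration, though the toggles mirror the on/off controls and participant pause option the researchers describe.

```kotlin
// Hypothetical sketch of how a screenomics study configuration might look
// once a researcher flips the console's on/off controls. All names and
// fields are illustrative assumptions, not the platform's real API.
data class ScreenomeStudyConfig(
    val studyId: String,
    // Binary toggles for individual data streams (a subset of the 20+ types).
    val captureScreenshots: Boolean = true,
    val screenshotIntervalSeconds: Int = 5,
    val logAppUsage: Boolean = true,
    val recordGpsLocation: Boolean = false,
    val countSteps: Boolean = true,
    // Participants can pause collection at sensitive moments (e.g., banking).
    val participantPauseEnabled: Boolean = true,
)

fun main() {
    // A researcher studying screen time and physical activity might enable
    // screenshots and step counts while leaving location tracking off.
    val config = ScreenomeStudyConfig(studyId = "screen-time-and-activity-pilot")

    println("Active streams for ${config.studyId}:")
    if (config.captureScreenshots)
        println(" - screenshots every ${config.screenshotIntervalSeconds}s")
    if (config.logAppUsage) println(" - app usage logs")
    if (config.countSteps) println(" - step counts")
    if (config.recordGpsLocation) println(" - GPS location")
}
```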
Privacy, Security, and Screenomics
By design, the platform handles highly sensitive data, Kim says. That includes when and where someone opens an app, what they look at and for how long, what they are doing (walking, sitting, driving), their location, and what they do next.
Because this kind of tracking raises significant privacy concerns, Ram says, “we have taken great care with the ethics surrounding the data, using confidentiality and privacy protocols that are much, much stricter than the policies used by corporations.”
The Stanford Screenomics platform has multiple levels of privacy protection. Researchers using the platform need not only Institutional Review Board (IRB) approval but also Google Play Store approval. The informed consent form is thorough: Study participants have to read and acknowledge their understanding of each data type, how frequently it is collected, and how it will be used. The data collection app also has a conspicuous pause button, giving people the freedom to turn data collection off when doing a financial transaction or having a private text chat. They can also turn the data collection app off entirely if they so choose.
Screenomics and Health
With a focus on preventive medicine, Kim himself is using the platform to study how digital environments shape health behaviors. For example, what is the interplay between screen time and physical habits, and what does that reveal about opportunities for clinical intervention? What specific digital triggers lead to improvements in health outcomes? How can interventions be personalized to ensure long-term adherence?
“The next frontier is integrating AI to turn raw screenome data into actionable insights,” Kim says. “Eventually, we hope to move beyond observation and provide real-time, personalized, adaptive support.”
Contributing Stanford authors include: Nick Haber, assistant professor at the Stanford Graduate School of Education; Thomas N. Robinson, Irving Schulman, MD Endowed Professor in Child Health; Byron Reeves, Paul C. Edwards Professor of Communication; Nathan Kline, adolescent screenomics study coordinator; and Jack Boffa, Stanford class of 2025.
This research was supported in part by the National Heart, Lung, and Blood Institute of the National Institutes of Health and the Stanford Institute for Human-Centered AI.