HAI Weekly Seminar with Nina Vasan and Sara Johansen
A Psychiatrist’s Perspective on Social Media Algorithms and Mental Health
The algorithms that determine the people and content we see on social media platforms are designed to keep us watching. These platforms have massive reach: more than 3 billion people are active on social media globally. With the opportunity to effect change at this scale comes an even greater responsibility to act ethically and compassionately. As both physicians and industry consultants to leading social media companies, we offer our perspectives on the ways that social media can both support and harm mental health and wellbeing. Social media can offer individual and collective benefits in creative expression, rapid spread of information, and social connection; yet active engagement on social media platforms is correlated with increased symptoms of depression and anxiety, exacerbation of self-harm behavior, and, in some cases, suicide and accidental death. Though social media platforms are taking steps toward greater accountability, there is an opportunity for proactive solutions grounded in evidence-based mental health interventions and ethical design principles.
We propose that social media offers a unique intervention point with the potential to impact the health of billions of people globally. We present safety frameworks for social media and explore how algorithms can be used to design interventions that support the most vulnerable groups, considering not only whether an intervention is helpful but how and for whom. We also discuss empathic design frameworks that seek to emphasize the human qualities of compassion, empathy, and mutual support on these platforms.
If you have questions for the speakers, please email them at brainstormlaboratory@gmail.com.
