Federated Learning is a machine learning approach where a model is trained across multiple decentralized devices or servers that hold local data, without the data ever leaving its original location. Instead of collecting all data in one central location, each device trains the model on its own data and only shares model updates (like adjusted weights) with a central server, which aggregates these updates to improve the global model. The approach improves privacy and security by keeping sensitive data (think health care records, financial information) localized.
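The train-locally-then-aggregate loop described above can be sketched as a toy federated averaging round; the linear model, client data, and hyperparameters here are illustrative assumptions, not part of any specific system:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Client-side step: train on local data; only the updated weights leave the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """Server-side step: average client updates, weighted by local dataset size."""
    sizes = [len(y) for _, y in clients]
    updates = [local_update(global_w, X, y) for X, y in clients]
    return np.average(updates, axis=0, weights=sizes)

# Simulate three clients, each holding its own private slice of data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):          # 20 communication rounds
    w = federated_round(w, clients)
print(np.round(w, 2))        # converges toward true_w without pooling any raw data
```

Note that only model weights cross the network; each client's `(X, y)` stays local, which is the privacy property the definition highlights.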