Embeddings are numerical representations that convert complex data (like words, images, or other objects) into vectors of numbers that capture their meaning or characteristics. The core idea is that similar items end up close together in this mathematical space—for example, the words "dog" and "puppy" would have similar embeddings, while "dog" and "toast" would be far apart.
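The closeness of embeddings is typically measured with cosine similarity. A minimal sketch below illustrates the idea using made-up 4-dimensional vectors (real embedding models produce vectors with hundreds or thousands of dimensions; these toy values are assumptions chosen only to make the example work):

```python
from math import sqrt

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|), ranges from -1 to 1
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings, not output of any real model
embeddings = {
    "dog":   [0.8, 0.6, 0.1, 0.0],
    "puppy": [0.7, 0.7, 0.2, 0.1],
    "toast": [0.0, 0.1, 0.9, 0.8],
}

# "dog" and "puppy" point in nearly the same direction (similarity near 1),
# while "dog" and "toast" point in very different directions (similarity near 0)
print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))
print(cosine_similarity(embeddings["dog"], embeddings["toast"]))
```

In practice, libraries such as NumPy or a vector database handle this computation at scale, but the geometric intuition is the same: semantic similarity becomes a measurable distance.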
Explore Similar Terms:
Domino: Discovering Systematic Errors with Cross-Modal Embeddings
Regional Negative Bias in Word Embeddings Predicts Racial Animus – but Only via Name Frequency
Negative Associations in Word Embeddings Predict Anti-black Bias across Regions – but Only via Name Frequency