What is a Hyperparameter? | Stanford HAI
What is a Hyperparameter?

A hyperparameter is a parameter whose value is set before a machine learning model's training process begins. Unlike model parameters, which are learned automatically from data during training, hyperparameters must be chosen by the user, either manually or through optimization techniques such as grid search. Common examples include the learning rate, batch size, and number of layers in a neural network.
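The distinction can be made concrete with a minimal sketch (a hypothetical illustration, not code from Stanford HAI): below, the weight `w` is a model parameter learned by gradient descent, while the learning rate `lr` is a hyperparameter fixed before training; a simple grid search then compares several learning rates and keeps the best one.

```python
def train(lr, steps=50):
    """Fit w to minimize (w - 3)^2 by gradient descent.

    w is a model parameter (learned during training);
    lr is a hyperparameter (set before training starts).
    """
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)   # derivative of (w - 3)^2 with respect to w
        w -= lr * grad       # gradient descent update
    return w, (w - 3) ** 2   # learned parameter and final loss

# Grid search: a basic hyperparameter-optimization technique.
# Try each candidate learning rate and keep the one with the lowest loss.
best_lr, best_loss = None, float("inf")
for lr in [0.001, 0.01, 0.1, 0.5]:
    _, loss = train(lr)
    if loss < best_loss:
        best_lr, best_loss = lr, loss

print(best_lr, best_loss)
```

In practice the same idea scales up: libraries such as scikit-learn provide `GridSearchCV` for searching hyperparameter grids with cross-validation instead of a single training run.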


Hyperparameter mentioned at Stanford HAI

Explore Similar Terms:

Parameter | Training Data | Optimization Algorithm

See Full List of Terms & Definitions

Stanford CRFM Introduces PubMedGPT 2.7B
Michihiro Yasunaga, Tony Lee, Percy Liang, Elliot Bolton, David Hall, Chris Manning
Dec 15 | News | Healthcare

The new 2.7B-parameter language model, trained on biomedical literature, delivers an improved state of the art for medical question answering.
TextGrad: AutoGrad for Text
Federico Bianchi, James Zou
Jun 19 | News | Machine Learning

Scholars develop a new framework that optimizes compound AI systems by backpropagating large language model feedback.
SEAMS Affiliate Program
Oct 28

SEAMS: Self-improving, Efficient and Accelerated Models and Systems