Model calibration

Domain: Machine Learning; Explainable AI (XAI)

Part of speech: noun phrase

Definition: The property or process by which a model’s predicted probabilities are aligned with observed outcome frequencies, so that stated confidence levels correspond to empirical likelihoods [Pavlovic 2025].

Example in context: “In this work, we perform a comprehensive investigation into machine learning model calibration across 7 open access engineering mechanics datasets.” [Mohammadzadeh et al. 2023]

Synonyms: probability calibration; confidence calibration

Related terms: reliability diagram; temperature scaling; calibration error; uncertainty estimation
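The "calibration error" related term can be made concrete with a small sketch. The function below is an illustrative (not canonical) implementation of expected calibration error (ECE): predictions are binned by confidence, and the gap between each bin's mean confidence and its empirical accuracy is averaged, weighted by bin size. The function name and toy data are assumptions for illustration only.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Illustrative expected calibration error (ECE):
    bin predictions by confidence, then average the per-bin gap
    between mean confidence and empirical accuracy, weighted by
    the fraction of samples in each bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(confidences[in_bin].mean() - correct[in_bin].mean())
            ece += in_bin.mean() * gap
    return ece

# A perfectly calibrated toy set: 80% stated confidence, 80% accuracy,
# so the stated confidence matches the empirical likelihood and ECE ~ 0.
conf = [0.8] * 10
hits = [1] * 8 + [0] * 2
print(round(expected_calibration_error(conf, hits), 3))  # → 0.0
```

A reliability diagram plots the same per-bin quantities (mean confidence vs. empirical accuracy) instead of summarizing them into a single number.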
