Feature importances

Domain: Machine Learning

Part of speech: noun phrase

Definition: Quantitative scores that indicate how much each feature contributes to a model’s predictions or predictive performance; commonly computed by tree-based models and used in other interpretability workflows [scikit-learn documentation].

Example in context: “TabNet translates the local and global interpretability as feature importances.” [Borsos et al. 2023]

Synonym: variable importances

Related terms: feature importance scores, permutation importance, SHAP values (related interpretability method)
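The related terms above can be illustrated with a minimal sketch, assuming scikit-learn is available; the synthetic dataset and model settings are illustrative only. It shows the two most common ways to obtain feature importances: the impurity-based `feature_importances_` attribute of tree ensembles, and model-agnostic permutation importance.

```python
# Minimal sketch of feature importances with scikit-learn (illustrative data/model).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=200, n_features=5,
                           n_informative=3, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Impurity-based importances, built into tree ensembles; normalized to sum to 1.
impurity_importances = model.feature_importances_

# Permutation importance: the drop in score when one feature's values are shuffled.
perm = permutation_importance(model, X, y, n_repeats=5, random_state=0)
permutation_importances = perm.importances_mean
```

Impurity-based scores are cheap but can be biased toward high-cardinality features; permutation importance is slower but applies to any fitted model.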
