Adaptive Gradient Algorithm

Adaptive Gradient Algorithm (AdaGrad)             Machine Learning; Deep Learning

noun phrase

Definition: A gradient-descent optimization algorithm that rescales the gradient for each parameter individually, effectively assigning each parameter its own adaptive learning rate [Bakanach 2023/2024].
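The per-parameter rescaling described in the definition can be sketched as follows. This is a minimal illustrative implementation, not taken from the cited sources; the function name, default learning rate, and epsilon value are the author's assumptions.

```python
import numpy as np

def adagrad_update(param, grad, accum, lr=0.1, eps=1e-8):
    """One AdaGrad step (illustrative sketch).

    accum holds the running sum of squared gradients, one entry per
    parameter. Dividing by its square root means parameters with a
    history of large gradients receive smaller effective learning
    rates, giving each parameter its own adaptive step size.
    """
    accum = accum + grad ** 2
    param = param - lr * grad / (np.sqrt(accum) + eps)
    return param, accum

# Example: minimize f(x) = x^2, whose gradient is 2x.
param = np.array([5.0])
accum = np.zeros(1)
for _ in range(500):
    grad = 2.0 * param
    param, accum = adagrad_update(param, grad, accum, lr=0.5)
```

Because the accumulator only grows, the effective learning rate shrinks monotonically over training, which is why AdaGrad is often paired with, or replaced by, decaying-average variants such as RMSProp (see Related terms).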

Example in context: “This optimizer is characterized by the combination of an adaptive gradient algorithm with root mean square propagation and is especially suited to problems involving large amounts of data.” [Mattia et al. 2024]

Synonyms: AdaGrad

Related terms:  gradient descent; Adam; RMSProp; learning rate scheduling
