Gradient clipping

Domain: Deep Learning

noun phrase

Definition: A technique that limits gradient values (or gradient norm) during training to prevent exploding gradients and stabilize optimization [TensorFlow/Keras optimizer docs].

Example in context: "Tempered sigmoid functions, however, control the gradient norm and reduce the amount of actual gradient clipping that takes place during DP-Training." [Ponomareva et al. 2023]

Related terms: clip-by-norm, clip-by-value, exploding gradients
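The two related variants, clip-by-value and clip-by-norm, can be sketched in a few lines of NumPy. This is a minimal illustration of the two clipping rules, not the TensorFlow/Keras implementation cited above; the function names are my own:

```python
import numpy as np

def clip_by_value(grad, clip):
    # Element-wise clipping: each component is limited to [-clip, clip].
    # Note this can change the gradient's direction.
    return np.clip(grad, -clip, clip)

def clip_by_norm(grad, max_norm):
    # Rescale the whole gradient if its L2 norm exceeds max_norm;
    # the direction is preserved, only the magnitude shrinks.
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        return grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])           # L2 norm = 5.0
print(clip_by_value(g, 2.0))       # [2. 2.]  (direction changed)
print(clip_by_norm(g, 2.0))        # [1.2 1.6] (norm rescaled to 2.0)
```

Clip-by-norm is usually preferred for taming exploding gradients because it keeps the update direction intact, whereas clip-by-value distorts it component by component.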
