Rectified linear unit (ReLU)

Domain: Deep Learning

noun

Definition 1: An element-wise activation function used in neural networks, typically defined as f(x) = max(0, x): it returns zero for negative inputs and the input itself for non-negative inputs [Bakanach 2023/2024].

Example in context: “For the hidden layers, we employed ReLU activation functions, which give the model non-linearity.” [Choudhury et al. 2024]
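The element-wise definition in sense 1 can be sketched in a few lines of plain Python (the function name `relu` is illustrative):

```python
def relu(values):
    """Element-wise ReLU: zero for each negative input, the input itself otherwise."""
    return [max(0.0, v) for v in values]

print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # [0.0, 0.0, 0.0, 1.5, 3.0]
```

Note that negative values are clipped to zero while non-negative values pass through unchanged, which is the source of the non-linearity mentioned in the quote above.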

Definition 2: A neural-network element or layer that applies this activation to its input representation [Bakanach 2023/2024].

Example in context: “A ReLU layer is applied to the hidden representation, and a normalization layer is applied to the output of the up-projection layer before summing it with the initial input embedding via a residual connection.” [Philippy et al. 2024]
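The quoted passage describes a bottleneck-style module in which ReLU appears as a layer (sense 2). A minimal NumPy sketch of such a structure is given below; the function and weight names are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

def bottleneck_block(x, W_down, W_up, eps=1e-5):
    # Illustrative sketch: down-projection -> ReLU layer -> up-projection
    # -> normalization layer -> residual connection to the input.
    h = x @ W_down                                # down-projection
    h = np.maximum(h, 0.0)                        # ReLU layer on the hidden representation
    h = h @ W_up                                  # up-projection back to the input width
    h = (h - h.mean()) / np.sqrt(h.var() + eps)   # normalization layer
    return x + h                                  # residual connection
```

Here `np.maximum(h, 0.0)` applies ReLU element-wise to the hidden representation, exactly as in sense 1; the surrounding projections, normalization, and residual add are what make it a "ReLU layer" within a larger module.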

Synonym: ReLU activation (sense 1)

Related terms: activation function, non-linearity, leaky ReLU, GELU, hidden layer, neural network layer
