Candidate sampling                                  Deep Learning

noun phrase

Definition: A family of training techniques used to reduce the computational cost of learning with very large output spaces by selecting a subset of candidate classes for each update instead of evaluating the full set of possible outputs. In deep learning, candidate sampling functions as an umbrella term covering approximation methods such as sampled softmax and noise-contrastive estimation (NCE); negative sampling is closely related but is usually treated as a more specific method rather than a full synonym [TensorFlow documentation].
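The idea above can be illustrated with a minimal sampled-softmax sketch in NumPy. This is an assumption-laden toy, not a production implementation: the class count, hidden size, and uniform proposal distribution are all invented for illustration (real systems typically use a log-uniform or frequency-based proposal `q`), and the gradient step is omitted. It shows the core trick: compute a softmax cross-entropy over the true class plus a small sampled candidate set, with a log-proposal correction on the logits, instead of over all classes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: 10,000 output classes, hidden size 16.
num_classes, hidden = 10_000, 16
W = rng.normal(scale=0.1, size=(num_classes, hidden))  # output embeddings
b = np.zeros(num_classes)                              # output biases

def sampled_softmax_loss(h, target, num_sampled=20):
    """Cross-entropy over the target class plus a sampled candidate subset,
    instead of the full num_classes-way softmax."""
    # Draw negative candidates from a uniform proposal distribution q.
    negatives = rng.choice(num_classes, size=num_sampled, replace=False)
    negatives = negatives[negatives != target]          # drop accidental hits
    candidates = np.concatenate(([target], negatives))  # target at index 0

    # Logits only for the sampled candidates: O(num_sampled), not O(num_classes).
    logits = W[candidates] @ h + b[candidates]

    # Subtract log q(candidate) so the sampled loss approximates the full
    # softmax loss; with a uniform q this is just a constant shift.
    logits -= np.log(1.0 / num_classes)

    # Numerically stable log-softmax over the candidate set.
    logits -= logits.max()
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[0]

h = rng.normal(size=hidden)
loss = sampled_softmax_loss(h, target=42)
```

Each update thus touches roughly `num_sampled + 1` rows of `W` rather than all 10,000, which is the entire point of candidate sampling when the output vocabulary is large.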

Example in context: “The candidate sampling based criterion such as Noise Contrastive Estimation (NCE) and Sampled-softmax are usually used to train the EBR models due to the computational difficulty to evaluate partition functions by summing over the entire vocabulary of large scale item corpus.” [Chen et al. 2022]

Related terms: sampled softmax; noise-contrastive estimation (NCE); negative sampling
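To contrast with sampled softmax above, here is a minimal sketch of the related negative sampling objective (word2vec-style), under the assumption that pair scores are given: instead of a multiclass softmax over candidates, each candidate gets an independent binary logistic loss that pushes the true pair's score up and the sampled negatives' scores down. The function name and inputs are hypothetical.

```python
import numpy as np

def negative_sampling_loss(score_pos, scores_neg):
    """Binary logistic loss over one positive pair and k sampled negatives:
    -log sigmoid(s_pos) - sum_i log sigmoid(-s_neg_i)."""
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    scores_neg = np.asarray(scores_neg, dtype=float)
    return -np.log(sig(score_pos)) - np.sum(np.log(sig(-scores_neg)))

# A well-separated positive incurs a smaller loss than a mis-ranked one.
good = negative_sampling_loss(4.0, [-4.0, -3.0])
bad = negative_sampling_loss(-4.0, [4.0, 3.0])
```

Because each term is an independent binary classification, no partition function over candidates is needed at all, which is why negative sampling is usually treated as a more specific method rather than a synonym for candidate sampling.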
