Word embedding

Word embedding                                                 NLP

noun phrase

Definition: A representation of a word as a dense, low-dimensional vector of real numbers, learned from distributional patterns in text, such that semantic and often syntactic relationships are reflected in the relative positions of vectors in a continuous space. In standard usage, word embedding refers to the representation itself, whereas word vector is a near-synonymous label for an individual vector [IBM].

Example in context: The basic idea is that each word should be represented by a low dimensional feature vector, also known as a word embedding. [Pereira et al. 2024]
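The idea that semantic relationships are reflected in vector positions can be illustrated with a minimal sketch. The 3-dimensional vectors below are invented for illustration only; real embeddings (e.g. word2vec or GloVe) typically have 50-300 dimensions and are learned from large corpora. Cosine similarity is a standard way to compare word vectors.

```python
import numpy as np

# Hypothetical toy embeddings, chosen by hand for illustration;
# real word vectors are learned, not assigned.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v (1.0 = same direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words sit closer together in the space:
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # the related pair is more similar
```

With these toy vectors, "king" and "queen" point in nearly the same direction, while "apple" points elsewhere, so the first similarity score is higher; this is the geometric intuition behind the definition above.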

Synonym: word vector, vector representation of words

Related terms: distributional semantics, static embedding
