Word embedding

Domain: NLP

Part of speech: noun phrase

Definition: A representation of a word as a real-valued vector in a continuous vector space, where semantically or contextually similar words tend to be located closer to one another. Word embeddings are a specific subtype of embeddings used for lexical representation in NLP [Google Cloud Generative AI Glossary].
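
Illustration: the sketch below (Python with NumPy; the vocabulary and vector values are toy numbers invented for this example, not taken from any trained model) shows the core property in the definition: semantically similar words have vectors that lie closer together in the space, here measured with cosine similarity.

    import numpy as np

    # Toy 4-dimensional word embeddings with hand-picked values.
    # Real embeddings (e.g. word2vec, GloVe) typically have 100-300
    # dimensions and are learned from large corpora.
    embeddings = {
        "king":  np.array([0.8, 0.6, 0.1, 0.2]),
        "queen": np.array([0.7, 0.7, 0.1, 0.3]),
        "apple": np.array([0.1, 0.2, 0.9, 0.8]),
    }

    def cosine_similarity(u, v):
        # Cosine of the angle between two vectors; 1.0 means same direction.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Semantically related words score higher than unrelated ones:
    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.36

Cosine similarity is the usual proximity measure in this setting because it compares vector direction rather than magnitude.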

Example in context: “We propose new static word embeddings optimised for sentence semantic representation.” [Wada et al. 2025]

Synonym: word vector

Related terms: embedding; sentence embedding; token embedding; semantic representation
