Sequence-to-sequence task

Domain: NLP

Part of speech: noun phrase

Definition: A task that maps an input token sequence to an output token sequence, such as translation or question answering [Google Machine Learning Glossary]. 
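To make the definition concrete, here is a minimal sketch of a sequence-to-sequence mapping for a toy translation task. The word list and the lookup-table "model" are invented for illustration only; real seq2seq systems learn this mapping with an encoder-decoder network.

```python
def translate(tokens):
    # Hypothetical word-level lookup standing in for a trained seq2seq model.
    lexicon = {"the": "der", "cat": "Katze", "sleeps": "schläft"}
    return [lexicon.get(t, t) for t in tokens]

source = ["the", "cat", "sleeps"]   # input token sequence
target = translate(source)          # output token sequence
print(target)  # ['der', 'Katze', 'schläft']
```

Note that in a genuine sequence-to-sequence task the input and output sequences need not have the same length, which is why encoder-decoder architectures generate the output one token at a time rather than mapping word for word.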

Example in context: "BERT (Bidirectional Encoder Representations from Transformers), a pre-trained language model, is based on the transformer architecture and is optimized for sequence-to-sequence tasks such as language modeling and machine translation." [Baswani et al. 2023]

Synonyms: seq2seq

Related terms: encoder-decoder, machine translation
