Big data

Domain: Data Engineering

noun phrase

Definition: A term for datasets whose characteristics – especially volume, variety, velocity, and/or variability – require scalable architectures for efficient storage, manipulation, and analysis, rather than traditional data-processing approaches. This wording closely follows the NIST Big Data Interoperability Framework [NIST].

Example in context: “However, big data is usually assumed to just be available when doing a scientific study or developing language technology tools, and the judgement “too little data” can mercilessly decide over the construction of an MT program, inclusion in predominant writing programs (MS Word etc) as well as whole platforms (Android, iOS).” [Wiechetek et al. 2022]

Related terms: large-scale data, high-volume data, volume/velocity/variety (3Vs)
