Paper
5 October 2021
A review of self-encoding language models for bidirectional representation
Yuetian Chen
Proceedings Volume 11911, 2nd International Conference on Computer Vision, Image, and Deep Learning; 119110O (2021) https://doi.org/10.1117/12.2604536
Event: 2nd International Conference on Computer Vision, Image and Deep Learning, 2021, Liuzhou, China
Abstract
This paper presents a survey and implementation of BERT language models, analyzing and summarizing their strengths and limitations on relevant tasks and the reasons behind them. On this basis, a family of BERT-like models that address these limitations is collected: ERNIE 2.0 (Baidu), which adds and refines pre-training tasks, and MT-DNN (Microsoft), which introduces multi-task learning in downstream fine-tuning. By comparing, with controlled variables, the changes these models make to the native model on specific tasks, and the impact of pre-training on real-world applications sharing the same language model architecture, we summarize potential directions and characteristics of future technological iterations in language modeling for natural language processing.
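The self-encoding (autoencoding) objective the survey centers on is masked-token prediction, where the model fills in a masked position using context on both sides. The following is a minimal illustrative sketch of that behavior, assuming the Hugging Face transformers library and the publicly released bert-base-uncased checkpoint; both are assumptions for illustration, not artifacts of the paper itself.

import torch
from transformers import BertTokenizer, BertForMaskedLM

# Load a public BERT checkpoint (assumed here; the paper does not prescribe it).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# The [MASK] token is predicted from BOTH left and right context,
# which is what makes BERT-style representations bidirectional.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and read off the most probable token.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_idx].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # expected output: "paris"

Because the prediction at the masked position attends to tokens on both its left and right, a left-to-right autoregressive model could not exploit the right-hand context here; this contrast motivates the autoencoding family the paper reviews.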
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yuetian Chen "A review of self-encoding language models for bidirectional representation", Proc. SPIE 11911, 2nd International Conference on Computer Vision, Image, and Deep Learning, 119110O (5 October 2021); https://doi.org/10.1117/12.2604536
KEYWORDS
Data modeling, Performance modeling, Instrument modeling, Transformers, Computer programming, Neural networks, Software development