Abstract: In recent years, Natural Language Processing (NLP) has gradually become a prominent research topic, and Transformer-based pretrained models (most notably BERT) have achieved ...
Abstract: Recently, pre-trained Transformer [1]-based language models such as BERT [2] and GPT [3] have shown great improvements on many Natural Language Processing (NLP) tasks. However, these models ...