
Grammatical-Error-Correction Sequence Tagging System with a Pretrained BERT-like Transformer Encoder

 

Stephen Cheng

Intro

This post introduces a simple and efficient GEC (Grammatical Error Correction) sequence tagger built on a Transformer encoder. The model is pre-trained on synthetic data and then fine-tuned in two stages: first on errorful parallel corpora, and second on a combination of errorful and error-free parallel corpora. In addition, custom token-level transformations are designed to map input tokens to target corrections. The original paper can be found here.
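To make the token-level transformations concrete, below is a minimal Python sketch of how basic edit tags could be applied to a source sentence. The tag names ($KEEP, $DELETE, $APPEND_{t}, $REPLACE_{t}) follow the paper's basic edit operations; the full system also uses g-transformations (e.g. case and verb-form changes) and a much larger tag vocabulary, and the apply_tags helper here is illustrative rather than the original implementation.

```python
# Illustrative sketch of applying token-level edit tags to a source sentence.
# Only the basic edit operations are shown; g-transformations are omitted.

def apply_tags(tokens, tags):
    """Apply one token-level tag per source token and return the corrected tokens."""
    output = []
    for token, tag in zip(tokens, tags):
        if tag == "$KEEP":
            output.append(token)
        elif tag == "$DELETE":
            continue  # drop the source token
        elif tag.startswith("$APPEND_"):
            output.append(token)
            output.append(tag[len("$APPEND_"):])  # insert a new token after the current one
        elif tag.startswith("$REPLACE_"):
            output.append(tag[len("$REPLACE_"):])  # substitute the source token
        else:
            output.append(token)  # unknown tag: keep the token unchanged
    return output

# Example: "She go to school" -> "She goes to school ."
tokens = ["She", "go", "to", "school"]
tags = ["$KEEP", "$REPLACE_goes", "$KEEP", "$APPEND_."]
print(" ".join(apply_tags(tokens, tags)))
```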

The GEC sequence tagging system thus consists of three training stages: pretraining on synthetic data, fine-tuning on an errorful parallel corpus, and finally fine-tuning on a combination of errorful and error-free parallel corpora.

The GEC sequence tagging system incorporates a pre-trained Transformer encoder; among the candidates, the encoders from XLNet and RoBERTa outperform three other cutting-edge Transformer encoders (ALBERT, BERT, and GPT-2).
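As a rough illustration of how such a pre-trained encoder can be used for tagging, the sketch below wires a RoBERTa encoder (loaded via Hugging Face Transformers, which is an assumption; the original code uses its own training setup) to a linear classification head that predicts one edit tag per subword token. The class name, tag-vocabulary size, and single-head design are simplifications, not the original architecture.

```python
# Minimal sketch: pretrained Transformer encoder + linear tagging head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class GECTagger(nn.Module):
    def __init__(self, encoder_name="roberta-base", num_tags=5000):  # num_tags is an assumed placeholder
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.tag_head = nn.Linear(hidden, num_tags)  # one logit vector per subword token

    def forward(self, input_ids, attention_mask):
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                      # (batch, seq_len, hidden)
        return self.tag_head(hidden_states)      # (batch, seq_len, num_tags)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = GECTagger()
batch = tokenizer(["She go to school"], return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
predicted_tags = logits.argmax(dim=-1)  # tag index per subword; mapped back to words downstream
```

Swapping "roberta-base" for an XLNet checkpoint only changes the encoder backbone; the tagging head stays the same, which is what makes comparing encoders straightforward.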

Use Case

The original code of the GEC sequence tagging model is available here, along with more details on how to run it.

Oct 19, 2020
