Abstract: This paper examines the advances in NLP achieved by transformer-based models, with a special focus on BERT (Bidirectional Encoder Representations from Transformers), which has already ...