
Explosion in 2019: Our Year in Review

Explosion

The spaCy v2.1 update fixed outstanding bugs on the tracker, gave the docs a huge makeover, improved both speed and accuracy, made installation significantly easier and faster, and added some exciting new features, like ULMFiT/BERT/ELMo-style language model pretraining. A few days later, on March 20, we upgraded Prodigy to v1.8 to support spaCy v2.1.
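As a minimal sketch of how that pretraining feature is invoked (not from the original post): spaCy v2.1 exposed it through its pretrain command, here called from Python. The corpus path "texts.jsonl" and output directory "./pretrained" are placeholder names, and en_vectors_web_lg is spaCy's large English vectors package.

    # Minimal sketch, assuming spaCy v2.1 is installed: run
    # language-model pretraining over a JSONL corpus of raw texts.
    # "texts.jsonl" and "./pretrained" are placeholder paths.
    from spacy.cli import pretrain

    pretrain("texts.jsonl", "en_vectors_web_lg", "./pretrained")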


Major trends in NLP: a review of 20 years of ACL research

NLP People

The 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) is starting this week in Florence, Italy. In particular, pre-trained word embeddings such as Word2Vec, FastText and BERT allow NLP developers to jump to the next level. In IEEE Computational Intelligence Magazine – vol.
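As a minimal sketch of what working with such pretrained embeddings looks like in practice, assuming spaCy's en_core_web_md model (which ships with word vectors) is installed; the example words are illustrative, not from the article:

    # Minimal sketch: inspecting pretrained word vectors with spaCy.
    # Assumes the model was installed via:
    #   python -m spacy download en_core_web_md
    import spacy

    nlp = spacy.load("en_core_web_md")
    doc = nlp("linguistics research")

    # Each token carries a dense vector learned from large corpora.
    print(doc[0].vector.shape)        # e.g. (300,)
    # Cosine similarity between the two tokens' vectors.
    print(doc[0].similarity(doc[1]))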
