
Explosion in 2019: Our Year in Review

Explosion

The update fixed outstanding bugs on the tracker, gave the docs a huge makeover, improved both speed and accuracy, made installation significantly easier and faster, and added some exciting new features, like ULMFiT/BERT/ELMo-style language model pretraining.

✨ Mar 20: A few days later, we upgraded Prodigy to v1.8 to support spaCy v2.1.


Multi-domain Multilingual Question Answering

Sebastian Ruder

Reading comprehension assumes a gold paragraph is provided. Standard approaches for reading comprehension build on pre-trained models such as BERT. Using BERT for reading comprehension involves fine-tuning it to predict (a) whether a question is answerable and (b) whether each token is the start or end of an answer span.
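
For concreteness, here is a minimal sketch of such a fine-tuning setup in PyTorch with the Hugging Face transformers library (an assumption, not the exact setup described in the article): one linear layer produces per-token start/end logits, and a second classifies answerability from the pooled [CLS] representation. The class name SpanQAHead and the example question/context pair are illustrative.

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class SpanQAHead(nn.Module):
    """Hypothetical BERT reading-comprehension head: span prediction + answerability."""
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.span_logits = nn.Linear(hidden, 2)   # per-token start and end logits
        self.answerable = nn.Linear(hidden, 2)    # answerable vs. unanswerable

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Split the two-channel projection into start and end logits per token.
        start_logits, end_logits = self.span_logits(out.last_hidden_state).split(1, dim=-1)
        # Classify answerability from the pooled [CLS] vector.
        has_answer = self.answerable(out.pooler_output)
        return start_logits.squeeze(-1), end_logits.squeeze(-1), has_answer

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
enc = tokenizer("Who created spaCy?", "spaCy was created by Explosion.",
                return_tensors="pt")
model = SpanQAHead()
start, end, answerable = model(enc["input_ids"], enc["attention_mask"])
```

At inference time, the predicted span is typically the (start, end) pair with the highest combined logit score, returned only if the answerability classifier predicts that an answer exists.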
