
The State of Multilingual AI

Sebastian Ruder

Research models such as BERT and T5 have become much more accessible, while the latest generation of language and multi-modal models demonstrate increasingly powerful capabilities. As early as 2011, Bender [2] highlighted the need for language independence in NLP (Linguistic Issues in Language Technology, 6(3), 1–26).


Multi-domain Multilingual Question Answering

Sebastian Ruder

Reading comprehension assumes a gold paragraph is provided. Standard approaches build on pre-trained models such as BERT: the model is fine-tuned to predict (a) whether a question is answerable and (b) whether each token is the start or the end of an answer span.
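To make the span-prediction setup concrete, here is a minimal sketch using Hugging Face's transformers library. The model name, question, and paragraph are illustrative assumptions, not from the original post; the post describes the general recipe, not this specific code.

```python
# Sketch of BERT-based reading comprehension (span prediction).
# Assumes the Hugging Face `transformers` library; all inputs below
# are illustrative.
import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

question = "Where was BERT developed?"
context = "BERT was developed by researchers at Google in 2018."

# Encode question and gold paragraph as one sequence:
# [CLS] question [SEP] context [SEP]
inputs = tokenizer(question, context, return_tensors="pt")

# The QA head produces one start logit and one end logit per token;
# fine-tuning trains these against the gold answer span. In the
# SQuAD 2.0 setting, answerability is handled by letting the model
# point both start and end at the [CLS] token.
outputs = model(**inputs)
start_idx = torch.argmax(outputs.start_logits)
end_idx = torch.argmax(outputs.end_logits)

answer_tokens = inputs["input_ids"][0][start_idx : end_idx + 1]
print(tokenizer.decode(answer_tokens))
```

Note that this sketch uses an untuned QA head, so its output is arbitrary until the model is fine-tuned on a reading-comprehension dataset such as SQuAD.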
