
The State of Transfer Learning in NLP

Sebastian Ruder

From shallow to deep: Over the last few years, state-of-the-art models in NLP have become progressively deeper. Up to two years ago, the state of the art on most tasks was a 2-3 layer deep BiLSTM, with machine translation being an outlier at 16 layers (Wu et al., …


Multi-domain Multilingual Question Answering

Sebastian Ruder

A domain can be seen as a manifold in a high-dimensional variety space consisting of many dimensions such as socio-demographics, language, genre, sentence type, etc. (Plank et al., 2016). … Natural Questions (NQ; Kwiatkowski et al., 2016), and BookTest (Bajgar et al., 2018; Gupta et al., …


ML and NLP Research Highlights of 2021

Sebastian Ruder

2021 saw many exciting advances in machine learning (ML) and natural language processing (NLP). …
