
The State of Transfer Learning in NLP

Sebastian Ruder

This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. In the span of little more than a year, transfer learning in the form of pretrained language models has become ubiquitous in NLP and has contributed to the state of the art on a wide range of tasks. However, transfer learning is not a recent phenomenon in NLP.


ML and NLP Research Highlights of 2021

Sebastian Ruder

2021 saw many exciting advances in machine learning (ML) and natural language processing (NLP). If CNNs are pre-trained the same way as transformer models, they achieve competitive performance on many NLP tasks [28]. Popularized by GPT-3 [32], prompting has emerged as a viable alternative input format for NLP models.
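To make the idea concrete, here is a minimal, illustrative sketch (not code from the post) of prompting: a classification example is rendered as a text-completion prompt, and a "verbalizer" maps the label words the model is expected to generate back to task labels. The template, example text, and function names are all hypothetical.

```python
# Illustrative sketch: casting sentiment classification as a
# text-completion prompt, the input format popularized by GPT-3.

PROMPT_TEMPLATE = (
    "Review: {review}\n"
    "Sentiment (positive or negative):"
)

# A "verbalizer" maps the label words the model is expected to
# generate back to the task's class labels.
VERBALIZER = {"positive": 1, "negative": 0}

def build_prompt(review: str) -> str:
    """Render a classification example as a prompt string."""
    return PROMPT_TEMPLATE.format(review=review)

def parse_completion(completion: str) -> int:
    """Map the model's generated continuation back to a label."""
    word = completion.strip().lower().split()[0]
    return VERBALIZER[word]

prompt = build_prompt("A delightful, sharply written film.")
label = parse_completion(" positive")  # -> 1
```

The key design point is that no task-specific head is trained: the task is expressed entirely in the input text, and the model's generated continuation is interpreted as the prediction.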


Multi-domain Multilingual Question Answering

Sebastian Ruder

…, Natural Questions (NQ; Kwiatkowski et al., 2019), among many others; BookTest (Bajgar et al., 2016); narratives written by crowd workers such as MCTest (Richardson et al., 2013) and ROCStories (Mostafazadeh et al., 2016); and DROP (Dua et al., 2019).


Selective Classification Can Magnify Disparities Across Groups

The Stanford AI Lab Blog

Across a range of applications from vision [1, 2, 3] and NLP [4, 5], even simple selective classifiers, relying only on model logits, routinely and often dramatically improve accuracy by abstaining. This makes selective classification a compelling tool for ML practitioners [6, 7].
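A minimal sketch of the kind of simple selective classifier described above (an assumption about the setup, not the paper's code): the model predicts from its logits but abstains whenever the softmax confidence falls below a threshold, and accuracy is then measured only on the covered (non-abstained) examples.

```python
# Sketch of a confidence-threshold selective classifier.
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def selective_predict(logits, threshold=0.9):
    """Return (predicted_class, confidence), or (None, confidence) to abstain."""
    probs = softmax(logits)
    conf = max(probs)
    pred = probs.index(conf)
    return (pred if conf >= threshold else None, conf)

def selective_accuracy(batch_logits, labels, threshold=0.9):
    """Accuracy on covered examples, plus the fraction covered."""
    correct = covered = 0
    for logits, y in zip(batch_logits, labels):
        pred, _ = selective_predict(logits, threshold)
        if pred is not None:
            covered += 1
            correct += int(pred == y)
    coverage = covered / len(labels)
    accuracy = correct / covered if covered else float("nan")
    return accuracy, coverage
```

Raising the threshold typically trades coverage for accuracy; the post's point is that this average improvement can mask disparities, since the errors concentrated among abstentions are not spread evenly across groups.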