
Multi-domain Multilingual Question Answering

Sebastian Ruder

… Natural Questions (NQ; Kwiatkowski et al., 2019) and BookTest (Bajgar et al., 2016), among many others, or narratives written by crowd workers such as MCTest (Richardson et al., 2013) and ROCStories (Mostafazadeh et al., 2016) …


The State of Transfer Learning in NLP

Sebastian Ruder

From shallow to deep: Over the last few years, state-of-the-art models in NLP have become progressively deeper. Up to two years ago, the state of the art on most tasks was a 2-3 layer deep BiLSTM, with machine translation being an outlier at 16 layers (Wu et al., 2016) …



ML and NLP Research Highlights of 2021

Sebastian Ruder



Selective Classification Can Magnify Disparities Across Groups

The Stanford AI Lab Blog
