
The State of Multilingual AI

Sebastian Ruder

Developing models that work for more languages is important in order to offset the existing language divide and to ensure that speakers of non-English languages are not left behind, among many other reasons. Winata, G., et al. In Findings of the Association for Computational Linguistics: ACL 2022 (pp. 2340–2354).


How to Improve User Experience (and Behavior): Three Papers from Stanford's Alexa Prize Team

The Stanford AI Lab Blog

In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5370–5381, Florence, Italy. Association for Computational Linguistics. Sheryl Brahnam. Interact 2005 workshop Abuse: The darker side of Human-Computer Interaction, pages 62–67. John Suler.



Trending Sources


The State of Transfer Learning in NLP

Sebastian Ruder

Early approaches such as word2vec (Mikolov et al., 2013) learned a single representation for every word. Later approaches then scaled these representations to sentences and documents (Le and Mikolov, 2014; Conneau et al., 2017). LM pretraining: many successful pretraining approaches are based on variants of language modelling (LM) (Peters et al., 2018).
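The excerpt above refers to word2vec-style word representations, where each word gets a single fixed vector and related words end up close together. As a minimal sketch only (toy hand-picked vectors, not a trained model), comparing such embeddings by cosine similarity looks like this:

```python
import numpy as np

# Toy 4-dimensional embeddings standing in for learned word2vec vectors.
# The numbers are illustrative assumptions, not output of any trained model.
embeddings = {
    "king":  np.array([0.9, 0.1, 0.8, 0.2]),
    "queen": np.array([0.85, 0.15, 0.75, 0.25]),
    "apple": np.array([0.1, 0.9, 0.2, 0.8]),
}

def cosine(u, v):
    """Cosine similarity, the standard way to compare word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine(embeddings["king"], embeddings["queen"])
sim_fruit = cosine(embeddings["king"], embeddings["apple"])
# With these toy vectors, the related pair scores higher than the unrelated one.
```

In a real setting the vectors would come from a trained model (e.g. gensim's `Word2Vec`), but the comparison step is the same.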


Major trends in NLP: a review of 20 years of ACL research

NLP People

The 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) is starting this week in Florence, Italy. On the other hand, a new and rather disruptive mechanism for sequential processing – attention – has been introduced into the sequence-to-sequence (seq2seq) framework of Sutskever et al. (2014).
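The attention mechanism mentioned in the excerpt can be sketched in a few lines: each input position is scored against a query, the scores are normalized with a softmax, and the output is the resulting weighted average. This is a hedged, minimal NumPy illustration of dot-product attention, not any specific paper's full model:

```python
import numpy as np

def attention(query, keys, values):
    """Minimal dot-product attention: weight each value by how well its
    key matches the query, using a softmax over the similarity scores."""
    scores = keys @ query                    # one similarity score per position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values, weights

# Toy inputs: three key/value pairs in two dimensions (illustrative only).
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
query = np.array([1.0, 0.0])

context, weights = attention(query, keys, values)
# `weights` is a probability distribution over input positions;
# `context` is the attention-weighted mix of the values.
```

Positions whose keys align with the query receive higher weight, which is what lets a seq2seq decoder focus on relevant parts of the input instead of compressing everything into one fixed vector.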
