
Peer Review Has Improved My Papers

Ehud Reiter

BLEU survey: better presentation of results. Another example is my 2018 paper, which presented a structured survey of the validity of BLEU; it was published in the Computational Linguistics journal. In short, by insisting that we do a proper evaluation, the reviewers massively improved our paper.


The Seven Trends in Machine Translation for 2019

NLP People

Hundreds of researchers, students, recruiters, and business professionals came to Brussels this November to learn about recent advances, and share their own findings, in computational linguistics and Natural Language Processing (NLP). According to what was discussed at WMT 2018, that might not be the case, at least not anytime soon.




The three levels of NLP for your business

NLP People

According to Gartner’s hype cycle, NLP reached the Peak of Inflated Expectations in 2018, and many businesses see it as a “go-to” solution for generating value from the 80% of business-relevant data that arrives in unstructured form. The folks here often split into two camps: the mathematicians and the linguists.


Selective Classification Can Magnify Disparities Across Groups

The Stanford AI Lab Blog

Adina Williams, Nikita Nangia, and Samuel Bowman. A broad-coverage challenge corpus for sentence understanding through inference. In Association for Computational Linguistics (ACL), pp. 1112–1122, 2018. ↩ Yonatan Geifman and Ran El-Yaniv. SelectiveNet: A deep neural network with an integrated reject option. ↩


Instruction fine-tuning for FLAN T5 XL with Amazon SageMaker Jumpstart

AWS Machine Learning Blog

[1] “Scaling Instruction-Finetuned Language Models.” arXiv preprint arXiv:2210.11416 (2022). [2] Rajpurkar, Pranav, Robin Jia, and Percy Liang. “Know What You Don’t Know: Unanswerable Questions for SQuAD.” Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers).


The State of Transfer Learning in NLP

Sebastian Ruder

Given enough data, a large number of parameters, and enough compute, a model can do a reasonable job. This is the lesson of the pretrained language models (Peters et al., 2018; Akbik et al., 2018; Baevski et al., 2019; Wang et al., 2018; Ruder et al., 2017) of recent years.


A Gentle Introduction to GPTs

Mlearning.ai

It combines techniques from computational linguistics, probabilistic modeling, and deep learning to make computers intelligent enough to grasp the context and the intent of language. GPT-3 is a successor to the earlier GPT-2 (released in February 2019) and GPT-1 (released in June 2018) models.