
Major trends in NLP: a review of 20 years of ACL research

NLP People

The 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) is starting this week in Florence, Italy. … The universal linguistic principle behind word embeddings is distributional similarity: a word can be characterized by the contexts in which it occurs. … "Sequence to sequence learning with neural networks."
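The distributional-similarity principle the excerpt states can be made concrete with a toy sketch (an assumed illustration, not from the article): represent each word by counts of its neighboring words, and compare those count vectors.

```python
from collections import Counter
from math import sqrt

# Toy corpus (illustrative, not from the article).
corpus = "the cat sat on the mat the dog sat on the rug".split()

def context_vector(word, tokens, window=2):
    """Count the words appearing within `window` positions of `word`."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == word:
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    counts[tokens[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "cat" and "dog" occur in similar contexts, so their vectors end up similar.
sim = cosine(context_vector("cat", corpus), context_vector("dog", corpus))
```

Real word embeddings learn dense vectors rather than raw counts, but the underlying signal is the same: words in similar contexts get similar representations.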

NLP 52

The State of Transfer Learning in NLP

Sebastian Ruder

Early approaches such as word2vec ( Mikolov et al., 2013 ) learned a single representation for every word independent of its context. … This goes back to layer-wise training of early deep neural networks ( Hinton et al., 2006 ; Bengio et al., ). … Instead, we train layers individually to give them time to adapt to the new task and data.
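The idea of training layers individually can be sketched as a gradual-unfreezing schedule (a minimal illustration with made-up layer names; it only models which layers are trainable at each epoch, not the training itself):

```python
# Hypothetical model layers, bottom to top.
layers = ["embeddings", "encoder_1", "encoder_2", "classifier"]

def unfreezing_schedule(layers, epochs):
    """Top-down gradual unfreezing: at epoch t, the top t+1 layers train,
    giving each newly unfrozen layer time to adapt before the next."""
    schedule = []
    for epoch in range(epochs):
        start = max(0, len(layers) - 1 - epoch)
        schedule.append(layers[start:])
    return schedule

schedule = unfreezing_schedule(layers, epochs=4)
# Epoch 0 trains only the top layer; by the last epoch all layers train.
```

In a real framework this would translate to toggling per-layer gradient flags (e.g. a `requires_grad` attribute) according to the schedule.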

NLP 75

Parsing English in 500 Lines of Python

Explosion

I wrote this blog post in 2013, describing an exciting advance in natural language understanding technology. It would be relatively easy to provide a beam-search version of spaCy… But I think the gap in accuracy will continue to close, especially given advances in neural network learning.
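The beam search mentioned above can be sketched generically (a toy illustration, not spaCy's implementation): instead of committing to the single best action at each step, keep the k highest-scoring partial analyses.

```python
import heapq

def beam_search(start, expand, steps, k=3):
    """Generic beam search. `expand(state)` yields (next_state, score_delta)
    pairs; at each step only the k best-scoring states survive."""
    beam = [(0.0, start)]
    for _ in range(steps):
        candidates = []
        for score, state in beam:
            for nxt, delta in expand(state):
                candidates.append((score + delta, nxt))
        beam = heapq.nlargest(k, candidates)  # keep the k highest scores
    return beam

# Toy expansion function (illustrative): extend a string with 'a' or 'b',
# where 'b' scores higher.
def expand(s):
    return [(s + "a", 1.0), (s + "b", 2.0)]

best = beam_search("", expand, steps=3, k=2)
```

In a transition-based parser, the states would be partial parses and `expand` would apply shift/reduce actions scored by the model; greedy parsing is the special case k=1.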

Python 45