Dude, Where’s My Neural Net? An Informal and Slightly Personal History

Lexalytics

This book effectively killed off interest in neural networks at that time, and Rosenblatt, who died shortly thereafter in a boating accident, was unable to defend his ideas. Around this time a new graduate student, Geoffrey Hinton, decided that he would study the now-discredited field of neural networks.

From text to dream job: Building an NLP-based job recommender at Talent.com with Amazon SageMaker

AWS Machine Learning Blog

Founded in 2011, Talent.com is one of the world’s largest sources of employment. The two feature sets are standard engineered features and fine-tuned Sentence-BERT (SBERT) embeddings. The triple-tower architecture comprises three parallel deep neural networks, with each tower processing one feature set independently.
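
The snippet below is a minimal PyTorch sketch of what such a triple-tower setup can look like, assuming one independent feed-forward tower per feature set and fusion by concatenation; the layer sizes, feature dimensions, and sigmoid scoring head are illustrative assumptions, not Talent.com’s implementation.

```python
# Hedged sketch of a "triple-tower" recommender: three parallel networks,
# each processing one feature set independently, fused into a relevance score.
import torch
import torch.nn as nn

class Tower(nn.Module):
    """One independent feed-forward tower over a single feature set."""
    def __init__(self, in_dim, hidden_dim=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, out_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class TripleTowerRecommender(nn.Module):
    # Assumed dims: 768-d SBERT embeddings plus two engineered feature sets.
    def __init__(self, dims=(768, 128, 32)):
        super().__init__()
        self.towers = nn.ModuleList(Tower(d) for d in dims)
        self.scorer = nn.Linear(64 * len(dims), 1)

    def forward(self, feature_sets):
        # Each tower sees only its own feature set; outputs are concatenated.
        encoded = [tower(x) for tower, x in zip(self.towers, feature_sets)]
        return torch.sigmoid(self.scorer(torch.cat(encoded, dim=-1)))

model = TripleTowerRecommender()
score = model([torch.randn(4, 768), torch.randn(4, 128), torch.randn(4, 32)])
```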

Unsupervised Cross-lingual Representation Learning

Sebastian Ruder

In particular, I cover unsupervised deep multilingual models such as multilingual BERT. Cross-lingual learning in the transfer learning taxonomy (Ruder, 2019). Methods from domain adaptation have also been applied to cross-lingual transfer (Prettenhofer & Stein, 2011; Wan et al., 2015; Artetxe et al., 2019; Wu et al., …
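
As a concrete illustration of what “unsupervised deep multilingual models such as multilingual BERT” look like in use, here is a minimal sketch with the Hugging Face transformers library (not code from the post): a single shared encoder embeds sentences from different languages into the same vector space, which is what makes zero-shot cross-lingual transfer possible.

```python
# Minimal sketch: sentence embeddings from multilingual BERT via mean pooling.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

sentences = ["The weather is nice today.", "Das Wetter ist heute schön."]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state        # (batch, seq_len, 768)

# Average over non-padding tokens to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# English and German paraphrases land close together in the shared space.
print(torch.cosine_similarity(embeddings[0], embeddings[1], dim=0))
```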

Introducing spaCy v2.1

Explosion

In 2011, deep learning methods were proving successful for NLP, and techniques for pretraining word representations were already in use. A range of techniques for pretraining further layers of the network were proposed over the years, as the deep learning hype took hold. Clearly, we couldn’t use a model such as BERT or GPT-2 directly.
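
The “pretraining word representations” mentioned here refers to the by-then-standard trick of reusing pretrained word vectors as the first layer of a task network and training only the further layers on the downstream task. A generic PyTorch sketch of that idea (placeholder vectors and dimensions, not spaCy’s internals):

```python
# Generic sketch: initialise an embedding layer from pretrained word vectors,
# then learn only the task-specific layers on top of it.
import torch
import torch.nn as nn

vocab_size, embed_dim, n_classes = 10_000, 300, 2
pretrained_vectors = torch.randn(vocab_size, embed_dim)   # stand-in for real vectors

embedding = nn.Embedding.from_pretrained(pretrained_vectors, freeze=False)
classifier = nn.Linear(embed_dim, n_classes)              # trained from scratch

token_ids = torch.randint(0, vocab_size, (8, 20))         # a batch of 8 texts, 20 tokens each
pooled = embedding(token_ids).mean(dim=1)                 # average the word vectors per text
logits = classifier(pooled)
```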
