
Introducing Our New Punctuation Restoration and Truecasing Models

AssemblyAI

Each stage leverages a deep neural network that treats its task as a sequence labeling problem, but at a different granularity: the first network operates at the token level and the second at the character level. Training Data: We trained this neural network on a total of 3.7 billion words.
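To make the two granularities concrete, here is a minimal PyTorch sketch of the setup described above: one tagger labels each token with the punctuation mark that should follow it, and a second tagger labels each character with a casing decision. The label sets, vocabulary sizes, and architecture are illustrative assumptions, not AssemblyAI's production models.

```python
import torch
import torch.nn as nn

PUNCT_LABELS = ["", ",", ".", "?"]   # assumed punctuation label set
CASE_LABELS = ["lower", "upper"]     # per-character casing decision

class SequenceTagger(nn.Module):
    """Bidirectional LSTM tagger: one label per input position."""
    def __init__(self, vocab_size, n_labels, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.classify = nn.Linear(2 * dim, n_labels)

    def forward(self, ids):                   # ids: (batch, seq_len)
        hidden, _ = self.encoder(self.embed(ids))
        return self.classify(hidden)          # (batch, seq_len, n_labels)

# Stage 1 labels token ids; stage 2 labels character ids.
punct_model = SequenceTagger(vocab_size=30000, n_labels=len(PUNCT_LABELS))
case_model = SequenceTagger(vocab_size=128, n_labels=len(CASE_LABELS))

tokens = torch.randint(0, 30000, (1, 12))    # a 12-token utterance
chars = torch.randint(0, 128, (1, 60))       # its 60 characters
punct_tags = punct_model(tokens).argmax(-1)  # punctuation after each token
case_tags = case_model(chars).argmax(-1)     # casing for each character
```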


Embed, encode, attend, predict: The new deep learning formula for state-of-the-art NLP models

Explosion

Over the last six months, a powerful new neural network playbook has come together for Natural Language Processing. Most NLP problems can be reduced to machine learning problems that take one or more texts as input. However, most NLP problems require understanding of longer spans of text, not just individual words.
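The post's four-step formula maps directly onto a small model. Below is a minimal PyTorch sketch of the embed, encode, attend, predict pipeline for text classification; the specific choices (a BiLSTM encoder, a single learned attention vector) are assumptions for illustration, not the post's exact recipe.

```python
import torch
import torch.nn as nn

class EmbedEncodeAttendPredict(nn.Module):
    def __init__(self, vocab_size, n_classes, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)          # embed: ids -> vectors
        self.encode = nn.LSTM(dim, dim, batch_first=True,
                              bidirectional=True)           # encode: vectors -> context matrix
        self.attend = nn.Linear(2 * dim, 1)                 # attend: matrix -> one vector
        self.predict = nn.Linear(2 * dim, n_classes)        # predict: vector -> label

    def forward(self, ids):
        vectors = self.embed(ids)                  # (batch, seq, dim)
        matrix, _ = self.encode(vectors)           # (batch, seq, 2*dim)
        weights = torch.softmax(self.attend(matrix), dim=1)
        summary = (weights * matrix).sum(dim=1)    # weighted sum over positions
        return self.predict(summary)               # (batch, n_classes)

model = EmbedEncodeAttendPredict(vocab_size=10000, n_classes=3)
logits = model(torch.randint(0, 10000, (2, 20)))   # two 20-token texts
```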



The State of Transfer Learning in NLP

Sebastian Ruder

This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. In the span of little more than a year, transfer learning in the form of pretrained language models has become ubiquitous in NLP and has contributed to the state of the art on a wide range of tasks. However, transfer learning is not a recent phenomenon in NLP.
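In practice, the "pretrained language model" recipe means loading publicly released weights and fine-tuning them on a downstream task. A minimal sketch using the Hugging Face transformers library (one common implementation of the pattern, not necessarily the tutorial's own code) looks like this:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load a pretrained language model with a fresh classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Fine-tune: gradients flow into the pretrained weights.
batch = tokenizer(["transfer learning is ubiquitous"], return_tensors="pt")
labels = torch.tensor([1])
loss = model(**batch, labels=labels).loss
loss.backward()
```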


Pytorch vs Tensorflow: A Head-to-Head Comparison

Viso.ai

Artificial Neural Networks (ANNs) have been demonstrated to be state-of-the-art in many supervised learning tasks, but programming an ANN manually can be challenging. Frameworks such as PyTorch and TensorFlow provide neural network units, cost functions, and optimizers for assembling and training neural network models.
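Those three ingredients are visible in a few lines of PyTorch (one of the two frameworks compared; TensorFlow's Keras API is analogous). The toy data and hyperparameters below are illustrative only:

```python
import torch
import torch.nn as nn

# The framework supplies each ingredient named above:
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))  # network units
loss_fn = nn.CrossEntropyLoss()                                       # cost function
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)               # optimizer

x = torch.randn(32, 4)                 # toy batch: 32 four-feature examples
y = torch.randint(0, 3, (32,))         # toy labels
for _ in range(100):                   # a short training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```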


Some papers I liked at ACL 2016

Hal Daumé III

P16-1152: Artem Sokolov; Julia Kreutzer; Christopher Lo; Stefan Riezler. Learning Structured Predictors from Bandit Feedback for Interactive NLP. This was perhaps my favorite paper of the conference because it's trying to do something new and hard and takes a nice approach. Why do I like this? P16-2096: Dirk Hovy; Shannon L.


Some picks from NAACL 2016

Hal Daumé III

I like this paper because the problem is great (I think NLP should really push in the direction of helping learners learn and teachers teach!), the dataset is nice (if a bit small), and the approach makes sense. A Latent Variable Recurrent Neural Network for Discourse Relation Language Models, by Ji, Haffari and Eisenstein.


ML and NLP Research Highlights of 2020

Sebastian Ruder

The selection of areas and methods is heavily influenced by my own interests; the selected topics are biased towards representation and transfer learning and towards natural language processing (NLP). This is less of a problem in NLP, where unsupervised pre-training involves classification over thousands of word types.
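That "classification over thousands of word types" is just the language-modeling objective: at every position the model predicts the next token with a softmax over the whole vocabulary. A minimal sketch (a tiny LSTM language model, purely illustrative and not any model from the post):

```python
import torch
import torch.nn as nn

VOCAB = 30000  # "thousands of word types": the LM's output classes

class TinyLM(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, VOCAB)   # softmax over the full vocabulary

    def forward(self, ids):
        hidden, _ = self.rnn(self.embed(ids))
        return self.out(hidden)            # (batch, seq, VOCAB) logits

model = TinyLM()
ids = torch.randint(0, VOCAB, (8, 32))     # a toy batch of token ids
logits = model(ids[:, :-1])                # predict each next token
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), ids[:, 1:].reshape(-1))
```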
