
Modular Deep Learning

Sebastian Ruder

d) Hypernetwork: A small separate neural network generates module parameters conditioned on metadata. Instead of learning module parameters directly, they can be generated by an auxiliary model (a hypernetwork) conditioned on additional information and metadata.
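As a minimal sketch of this idea (the sizes, variable names, and NumPy setup are illustrative assumptions, not taken from the article): a small hypernetwork maps a task embedding to the flattened parameters of a bottleneck adapter, so one shared generator can produce a different module per task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: model width, adapter bottleneck, task-embedding width.
d_model, d_bottleneck, d_embed = 16, 4, 8

# Hypernetwork weights: a single linear map from a task embedding to the
# flattened adapter parameters (down- and up-projection matrices).
n_params = d_model * d_bottleneck * 2
W_hyper = rng.standard_normal((d_embed, n_params)) * 0.01

def generate_adapter(task_embedding):
    """Generate (down, up) adapter matrices from a task embedding."""
    flat = task_embedding @ W_hyper
    down = flat[: d_model * d_bottleneck].reshape(d_model, d_bottleneck)
    up = flat[d_model * d_bottleneck :].reshape(d_bottleneck, d_model)
    return down, up

def adapter_forward(x, down, up):
    # Bottleneck adapter with ReLU and a residual connection.
    return x + np.maximum(x @ down, 0) @ up

# Generated parameters differ per task, but W_hyper is shared across tasks.
task_emb = rng.standard_normal(d_embed)
down, up = generate_adapter(task_emb)
y = adapter_forward(rng.standard_normal(d_model), down, up)
```

In practice the hypernetwork is trained end-to-end with the base model, and the conditioning input can encode the task, language, or layer.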


ACL 2022 Highlights

Sebastian Ruder

The initiative focuses on making Computational Linguistics (CL) research accessible in 60 languages and across all modalities, including text/speech/sign language translation, closed captioning, and dubbing. Another highlighted approach uses an LM to generate relevant knowledge statements in a few-shot setting.



The State of Multilingual AI

Sebastian Ruder

Developing models that work for more languages is important in order to offset the existing language divide and to ensure that speakers of non-English languages are not left behind, among many other reasons. The post also covers writing system and speaker metadata for 2,800+ language varieties.


All Languages Are NOT Created (Tokenized) Equal

Topbots

Language Disparity in Natural Language Processing: this digital divide in natural language processing (NLP) is an active area of research. 70% of research papers published in a computational linguistics conference evaluated only English (Association for Computational Linguistics).
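One concrete source of the tokenization disparity the title alludes to is encoding length: byte-level tokenizers operate on UTF-8 bytes, and scripts outside basic Latin use several bytes per character. A small stdlib-only illustration (the example strings are my own, not from the article):

```python
# Compare character counts with UTF-8 byte counts for two scripts.
english = "hello"
hindi = "नमस्ते"  # "namaste" in Devanagari

en_chars, en_bytes = len(english), len(english.encode("utf-8"))
hi_chars, hi_bytes = len(hindi), len(hindi.encode("utf-8"))

print(en_chars, en_bytes)  # 5 characters, 5 bytes
print(hi_chars, hi_bytes)  # 6 characters, 18 bytes
```

A byte-level vocabulary therefore spends roughly three times as many base units per character on Devanagari text, which translates into more tokens, higher cost, and shorter effective context for those languages.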


Accelerate hyperparameter grid search for sentiment analysis with BERT models using Weights & Biases, Amazon EKS, and TorchElastic

AWS Machine Learning Blog

Ana has had several leadership roles at startups and large corporations such as Intel and eBay, leading ML inference and linguistics-related products. Ana has a Masters in Computational Linguistics and an MBA from Haas/UC Berkeley, and has been a visiting scholar in Linguistics at Stanford.
