Continual Adapter Tuning (CAT): A Parameter-Efficient Machine Learning Framework that Avoids Catastrophic Forgetting and Enables Knowledge Transfer from Learned ASC Tasks to New ASC Tasks
Marktechpost
MARCH 27, 2024
Continual Learning (CL) poses a significant challenge for Aspect Sentiment Classification (ASC) models due to Catastrophic Forgetting (CF), wherein learning new tasks leads to a detrimental loss of previously acquired knowledge. Continual Adapter Tuning (CAT) addresses this by inserting lightweight, task-specific adapter modules into a frozen pre-trained BERT backbone; these adapters allow BERT to be fine-tuned for specific downstream tasks while retaining most of its pre-trained parameters.
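To make the adapter idea concrete, the sketch below shows bottleneck-style adapter tuning on top of a frozen BERT encoder in PyTorch. The `Adapter` and `AdapterTunedClassifier` classes, the bottleneck size, the three-way sentiment head, and the choice to adapt only the [CLS] representation are illustrative assumptions for this article, not the exact CAT architecture described in the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add."""

    def __init__(self, hidden_size: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the frozen backbone's representation intact.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


class AdapterTunedClassifier(nn.Module):
    """Frozen BERT encoder plus a task-specific adapter and classification head."""

    def __init__(self, bert: nn.Module, hidden_size: int = 768, num_labels: int = 3):
        super().__init__()
        self.bert = bert
        # Freeze all pre-trained BERT parameters; only the adapter and head are trained.
        for p in self.bert.parameters():
            p.requires_grad = False
        self.adapter = Adapter(hidden_size)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = self.adapter(out.last_hidden_state[:, 0])  # adapt the [CLS] vector
        return self.classifier(pooled)


# Hypothetical usage: one small adapter per ASC task, while the BERT weights stay fixed.
model = AdapterTunedClassifier(BertModel.from_pretrained("bert-base-uncased"))
trainable = [p for p in model.parameters() if p.requires_grad]
print(sum(p.numel() for p in trainable), "trainable parameters")
```

Because only the adapter and the classification head receive gradients, each new task adds a small set of parameters rather than overwriting the shared backbone, which is what makes the approach parameter-efficient and helps it avoid catastrophic forgetting.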