Theses Doctoral

New Approaches Towards Learning Across Time and Domains

Mahdaviyeh, Yasaman

This thesis studies topics in transfer and continual learning. Both fields concern learning across multiple tasks and are motivated by modern machine learning problems in which models, usually neural networks, need a large number of samples and substantial compute resources to train. This resource-intensive training has motivated transfer learning, where the learner uses samples from a source distribution to perform well on a target distribution.

We explore a setting where learning involves choosing not only a hypothesis but also a hypothesis class (model selection). Our setup is a natural extension of the PAC framework to transfer learning. We show that, unlike in the single-hypothesis-class setting, where previous work has shown adaptivity to transfer distance, the lack of distributional knowledge limits the learner once it must also perform model selection; one possible formalization is sketched below.
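
To make this setting concrete, here is one common way a transfer problem with model selection can be formalized. The notation (source P, target Q, a nested family of classes, a penalty term) is an illustrative sketch, not necessarily the thesis's exact formulation:

```latex
% Illustrative notation only, not necessarily the thesis's exact setup:
% the learner draws samples from a source P and a target Q, picks a
% class from a nested family, and is judged by its error on Q.
\[
  S_P \sim P^{\,n_P}, \qquad S_Q \sim Q^{\,n_Q}, \qquad
  \mathcal{H}_1 \subseteq \mathcal{H}_2 \subseteq \cdots
\]
\[
  \text{output } \hat{h} \in \mathcal{H}_{\hat{k}}
  \;\text{ aiming for }\;
  \mathrm{err}_Q(\hat{h}) \;\le\;
  \min_{k} \Bigl( \inf_{h \in \mathcal{H}_k} \mathrm{err}_Q(h)
    \;+\; \varepsilon_k(n_P, n_Q) \Bigr).
\]
```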

Since training models from scratch is expensive, being able to update them with new training data, potentially from new related tasks, becomes important. This is one motivation behind continual learning, where the learner visits tasks sequentially.

When trained on a new task, neural networks tend to forget previous tasks: their performance on those tasks deteriorates. Developing techniques to reduce forgetting has been a focus of empirical work in continual learning. Theoretical studies of these techniques are rarer, and we start with one of the most commonly used, sample replay. We study sample replay in continual linear regression and find that there are cases where replay can increase forgetting. This is surprising because our setting is benign: there is no label noise, and we assume the tasks share a common solution. A minimal sketch of this setting appears below.
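
The following NumPy sketch (illustrative only, not the thesis's code) shows one standard model of continual linear regression: each task is fit by projecting the current weights onto the new task's solution set, which is equivalent to running gradient descent to convergence on that task, and replay augments the task with a few stored samples from earlier tasks. The buffer scheme, problem sizes, and forgetting metric here are all assumptions for illustration; whether replay helps or hurts on a given instance depends on the task geometry, which is the phenomenon the thesis analyzes.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, n_t = 20, 3, 5          # dimension, number of tasks, samples per task
w_star = rng.normal(size=d)   # shared noiseless solution across all tasks

tasks = []
for _ in range(T):
    X = rng.normal(size=(n_t, d))
    tasks.append((X, X @ w_star))  # y = X w*, no label noise (realizable)

def fit_next(w, X, y):
    """Minimum-distance solution: project w onto {v : X v = y},
    i.e. the point gradient descent reaches when run to convergence
    from w on this (underdetermined, consistent) task."""
    delta = np.linalg.lstsq(X, y - X @ w, rcond=None)[0]
    return w + delta

def loss(w, X, y):
    return np.mean((X @ w - y) ** 2)

def run(replay_per_task=0):
    w = np.zeros(d)
    for t, (X, y) in enumerate(tasks):
        # Replay: augment the current task with a few stored samples
        # from each earlier task before refitting.
        Xs, ys = [X], [y]
        for Xp, yp in tasks[:t]:
            Xs.append(Xp[:replay_per_task])
            ys.append(yp[:replay_per_task])
        w = fit_next(w, np.vstack(Xs), np.concatenate(ys))
    # Each fit interpolates its own task (zero loss at the time), so the
    # final loss on the earlier tasks is exactly the forgetting.
    return np.mean([loss(w, X, y) for X, y in tasks[:-1]])

print("forgetting without replay:", run(0))
print("forgetting with replay   :", run(2))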

These findings are followed by empirical studies, in which we probe how far these results extend to more natural settings.

Files

This item is currently under embargo. It will be available starting 2026-09-02.

More About This Work

Academic Units
Computer Science
Thesis Advisors
Pitassi, Toniann
Degree
Ph.D., Columbia University
Published Here
October 8, 2025