Krishnan Raghavan
Abstract:
One of the critical features of an intelligent system is the ability to continually execute tasks in a real-world environment. As each new task is revealed, we seek to adapt to it efficiently (improve generalization) while, in the process, remembering the previous tasks (minimizing catastrophic forgetting). Consequently, there are two key challenges that must be modeled: catastrophic forgetting and generalization. Despite promising methodological advancements, there is a lack of a theoretical framework that enables analysis of these challenges.
In this talk, we discuss the modeling and analysis of continual learning using tools from differential equation theory. We discuss the broad applicability of this approach and survey the many applications where it is needed. We will derive methods for some of these applications from this point of view and show the effectiveness of the resulting approaches.
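To make the forgetting-versus-generalization tension concrete, here is a toy sketch, purely illustrative and not the speaker's method: training is treated as the gradient-flow ODE dθ/dt = -∇L(θ) on a quadratic loss, integrated with forward-Euler steps first on task A and then on task B. The task optima and step sizes are made-up values chosen only to show that the loss on task A rises after adapting to task B.

```python
import numpy as np

def loss(theta, target):
    # Quadratic loss L(theta) = 0.5 * ||theta - target||^2.
    return 0.5 * np.sum((theta - target) ** 2)

def grad(theta, target):
    # Gradient of the quadratic loss: theta - target.
    return theta - target

def gradient_flow(theta, target, dt=0.01, steps=1000):
    # Forward-Euler integration of d(theta)/dt = -grad L(theta).
    for _ in range(steps):
        theta = theta - dt * grad(theta, target)
    return theta

# Hypothetical task optima (assumptions, for illustration only).
task_a = np.array([1.0, 0.0])
task_b = np.array([0.0, 1.0])

theta = np.zeros(2)
theta = gradient_flow(theta, task_a)          # adapt to task A
loss_a_before = loss(theta, task_a)           # near zero after training on A
theta = gradient_flow(theta, task_b)          # then adapt to task B
loss_a_after = loss(theta, task_a)            # task-A loss grows: forgetting
```

After the second flow, `loss_a_after` exceeds `loss_a_before`: naively following the gradient flow of the new task drifts the parameters away from the old optimum, which is the catastrophic-forgetting phenomenon the differential-equation viewpoint is meant to analyze.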
Bio:
I am an assistant computational mathematician in the Mathematics and Computer Science Division at Argonne National Laboratory. I received my Ph.D. in computer engineering from Missouri University of Science and Technology in 2019 and have been at Argonne since then. My primary research agenda is to develop a mathematical characterization of machine learning (ML) models, their learning/training behavior, and the precision they achieve. Toward this end, I study the two broad facets of ML: theory, through the lens of tools from systems theory, statistics, and optimization; and applications, by building AI/ML models to solve key problems in nuclear physics, materials science, HPC, and, more recently, climate. I enjoy rock climbing, the outdoors, cycling, ramen, and many other nerdy things, including but not limited to fantasy fiction novels -- go Malazan.