20 May 2021
Colloquium by Prof. Weinan E, Princeton University and Peking University
Thursday, 20 May 2021 14:00, Zoom Meeting
The heart of modern machine learning (ML) is the approximation of high-dimensional functions. Traditional approaches, such as approximation by piecewise polynomials, wavelets, or other linear combinations of fixed basis functions, suffer from the curse of dimensionality (CoD). This does not seem to be the case for neural network-based ML models. To quantify this, we need to develop the corresponding mathematical framework. In this talk, I will report on the progress made so far and the main remaining issues within the scope of supervised learning. I will discuss three major topics: approximation theory and error analysis of modern ML models, the qualitative behavior of gradient descent algorithms, and ML from a continuous viewpoint.
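For readers unfamiliar with the curse of dimensionality mentioned in the abstract, a minimal sketch (not part of the talk itself) of why fixed-basis methods scale badly: a tensor-product grid with m points per axis requires m^d basis functions in d dimensions, so the cost of classical approximation grows exponentially with the dimension.

```python
# Illustrative only: exponential growth of a fixed-basis (tensor-product
# grid) approximation with dimension d, at a fixed resolution m per axis.

def grid_size(points_per_axis: int, dim: int) -> int:
    """Number of nodes in a uniform tensor-product grid in `dim` dimensions."""
    return points_per_axis ** dim

if __name__ == "__main__":
    m = 10  # resolution per axis (an arbitrary illustrative choice)
    for d in (1, 2, 5, 10):
        # At d = 10 this already requires 10 billion basis functions.
        print(f"d = {d:2d}: {grid_size(m, d):,} basis functions")
```

Neural networks are of interest precisely because, for suitable function classes, their approximation error rates do not degrade exponentially in d in this way.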
Please contact the DASHH office for the Zoom access information; see the link below.
Further information about the speaker and the lecture can also be found here: