Date/time: June 11th from 15:30 to 18:00 (coffee break from 16:30 to 17:00), and June 12th from 14:00 to 16:30 (coffee break from 15:00 to 15:30).
Location: Centre de Recerca Matemàtica, main auditorium [note: different location from COLT!]
Title: Analyzing Deep Neural Networks for Learning
Abstract: There is a big buzz around deep neural networks, with impressive numerical results but nearly no mathematical backing to explain their performance. Available mathematical approaches to circumventing the curse of dimensionality typically do not apply to the classification of complex high-dimensional data such as signals or images, or to the regression of non-local physical functionals. Beyond applications, deep networks may thus be an opportunity to develop new mathematics for high-dimensional problems.
The two lectures will introduce some mathematical tools to analyze these networks and describe numerical results, while encouraging discussion of open questions. The following topics will be covered:
- Curse of dimensionality and approximation theory
- One-hidden-layer neural networks
- Dimension reduction and contractions
- Multiscale wavelets and stable invariants over Lie groups
- Deep convolutional networks and scattering operators
- Regression of physical functionals and N-body problems
- Image and audio classification
- Unsupervised and supervised deep learning
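To make two of the listed topics concrete, here is a minimal sketch of a one-hidden-layer network and of the exponential sample growth behind the curse of dimensionality. All dimensions, weight initializations, and names below are illustrative choices, not taken from the lectures:

```python
import numpy as np

def one_hidden_layer(x, W1, b1, W2, b2):
    """f(x) = W2 @ relu(W1 @ x + b1) + b2 -- a one-hidden-layer network."""
    h = np.maximum(W1 @ x + b1, 0.0)  # hidden activations (ReLU)
    return W2 @ h + b2

rng = np.random.default_rng(0)
d, m = 10, 64  # input dimension and number of hidden units (arbitrary)
W1 = rng.standard_normal((m, d))
b1 = rng.standard_normal(m)
W2 = rng.standard_normal((1, m))
b2 = rng.standard_normal(1)

x = rng.standard_normal(d)
y = one_hidden_layer(x, W1, b1, W2, b2)  # scalar output, shape (1,)

# Curse of dimensionality: a uniform grid on [0,1]^d with n points
# per axis needs n**d samples, exponential in the dimension d.
n = 10
print(n ** d)  # 10**10 grid points already for d = 10
```

Networks of this form are universal approximators in the limit of large m, yet that classical result does not explain why deep architectures succeed on high-dimensional data, which is exactly the tension the lectures address.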