Anders C. Hansen: Trends in mathematics of information -- Deep learning, artificial intelligence and compressed sensing, part I

This is the first in a series of three lectures by Anders C. Hansen (Cambridge Univ. and UiO) on this topic. Vegard Antun (UiO) will also contribute to the lectures.

Added June 6: Slides from the lectures are now available here.

In the last decade, two major breakthroughs in the mathematics of information and data science stand out: the introduction of compressed sensing in the mid-2000s and the documented success of deep learning from 2012 onwards. In this series of talks we will give an overview of both techniques, provide mathematical background, and present plenty of practical examples.

Deep learning is now the state-of-the-art method for classification and recognition. It outperforms competing methods by a considerable margin, and its performance is now referred to as super-human. This opens up endless applications where automated recognition and classification are important, such as driverless cars, surveillance, and image and speech recognition. Given that deep learning outperforms humans on many tasks, one faces the question: have we reached artificial intelligence? I will consider this question in view of Smale's 18th problem and Turing's paper from 1950, in which he introduces the imitation game. When considering this question, a rather fascinating issue is revealed: deep learning turns out to be completely unstable. This phenomenon has both philosophical and practical consequences.

Compressed sensing and sparse regularisation have in many ways changed the way one approaches medical imaging and inverse problems. Interestingly, however, deep learning can also be used for these problems. The use of deep learning in inverse problems is a rather new concept, and the community has only just started investigating the many possibilities. We will discuss several of the deep learning approaches and compare the results with compressed sensing. In doing so, one discovers an intriguing phenomenon: deep learning may produce completely unstable reconstruction methods for inverse problems.

The talks provide a sneak peek of material from the book “Structured Compressed Sensing, Imaging and Learning” (Cambridge University Press, forthcoming 2019).
 

Anders C. Hansen is head of the Applied Functional and Harmonic Analysis group within the Cambridge Centre for Analysis at DAMTP. He is also Professor II at the Department of Mathematics, UiO.
 
The second lecture in this mini-course will be held on May 15, 10.15-12 in Aud. 4, V. Bjerknes' house, while the third will be on May 16, 12.15-14 in Aud. 2, V. Bjerknes' house.
 