iTWIST’16 Keynote Speakers: Florent Krzakala
by ThomasA
Note: You can still register for iTWIST’16 until Monday the 1st of August!
Our next speaker at iTWIST’16 is Florent Krzakala. Much like Phil Schniter – the previous speaker presented here – Florent Krzakala has made important and enlightening contributions to the Approximate Message Passing family of algorithms.
Florent Krzakala is Professor of Physics at École Normale Supérieure in Paris, France. Professor Krzakala came to ENS in 2013 from ESPCI, Paris (Laboratoire de Physico-Chimie Théorique), where he had been Maître de conférences since 2004. Maître de conférences is a particular French academic designation that I am afraid I am going to have to ask my French colleagues to explain to me 😉
Where Phil Schniter seems to have approached the (G)AMP algorithms, which have become quite popular for compressed sensing, from an estimation-algorithms-in-digital-communications background, Florent Krzakala has approached the topic from a statistical physics background, which seems to have brought a lot of interesting new insight to the table. For example, together with Marc Mézard, François Sausset, Yifan Sun, and Lenka Zdeborová, he has shown how AMP algorithms can perform impressively well compared to the classic l1-minimization approach when used with a special kind of so-called “seeded” measurement matrices, in “Probabilistic reconstruction in compressed sensing: algorithms, phase diagrams, and threshold achieving matrices“.
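To give a rough idea of what an AMP iteration looks like in the simplest compressed sensing setting, here is a minimal sketch using a soft-thresholding denoiser and a plain Gaussian measurement matrix (not the seeded matrices from the paper). All function names and parameter choices below are my own toy illustration, not the authors' implementation:

```python
import numpy as np

def soft_threshold(v, theta):
    """Elementwise soft thresholding (the denoiser used in AMP for sparse signals)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_sparse_recovery(y, A, n_iter=50, alpha=1.5):
    """Minimal AMP sketch for y = A x with a sparse x.

    alpha scales the threshold with the estimated effective noise level;
    this is only a toy illustration, not a tuned implementation.
    """
    m, n = A.shape
    delta = m / n
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        theta = alpha * np.linalg.norm(z) / np.sqrt(m)   # effective noise level from the residual
        pseudo_data = x + A.T @ z
        x_new = soft_threshold(pseudo_data, theta)
        # Onsager correction: residual scaled by the average derivative of the denoiser
        onsager = (z / delta) * np.mean(np.abs(x_new) > 0)
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Toy usage: random Gaussian A and a sparse x
rng = np.random.default_rng(0)
n, m, k = 500, 250, 25
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = amp_sparse_recovery(y, A)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```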
At this year’s iTWIST workshop in a few weeks, Professor Krzakala is going to speak about matrix factorisation problems and the approximate message passing framework. Specifically, we are going to hear about:
Approximate Message Passing and Low Rank Matrix Factorization Problems
A large number of interesting problems in machine learning and statistics can be expressed as low rank structured matrix factorization problems, such as sparse PCA, planted clique, sub-matrix localization, clustering of mixtures of Gaussians, or community detection in a graph.
I will discuss how recent ideas from statistical physics and information theory have led, on the one hand, to new mathematical insights into these problems, including a characterization of the optimal achievable performance, and, on the other hand, to the development of powerful new algorithms, called approximate message passing, which turn out to be optimal for a large set of problems and parameters.
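As a side note for readers less familiar with this class of problems: a classic toy instance of low rank matrix factorization is the rank-one “spiked” matrix model, where a hidden vector must be recovered from a noisy rank-one matrix (closely related to community detection with two groups). The small sketch below is entirely my own illustration, not something from the talk; it generates such an instance and recovers the hidden vector with a simple spectral estimate:

```python
import numpy as np

# Rank-one "spiked" matrix model: a toy instance of low rank matrix
# factorization (this particular construction is my own illustration).
rng = np.random.default_rng(1)
n = 200      # matrix size
snr = 3.0    # strength of the rank-one spike relative to the noise

x = rng.choice([-1.0, 1.0], size=n)          # hidden vector (e.g. two communities)
noise = rng.standard_normal((n, n))
W = (noise + noise.T) / np.sqrt(2 * n)       # symmetric Gaussian noise matrix
Y = (snr / n) * np.outer(x, x) + W           # observation: rank-one signal plus noise

# A simple spectral estimate of x: the sign of the leading eigenvector of Y
eigvals, eigvecs = np.linalg.eigh(Y)
x_hat = np.sign(eigvecs[:, -1])
overlap = abs(x_hat @ x) / n
print("overlap with the hidden vector:", overlap)
```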