I am a CNRS researcher in the IOP team of the Institut de Mathématiques de Bordeaux.

**Contact address** : yann [dot] traonmilin [at] u-bordeaux [dot] fr

**Latest Publications/News**

• I was interviewed in « CNRS Le journal » (in French). I talk mostly about my professional path and how I became a mathematician at CNRS. Check it out!

• Our preprint on the non-convex spikes super-resolution problem is out: « The basins of attraction of the global minimizers of the non-convex sparse spikes estimation problem », Yann Traonmilin and Jean-François Aujol.

**Abstract** : « The sparse spike estimation problem consists in estimating a number of off-the-grid impulsive sources from under-determined linear measurements. Information-theoretic results ensure that the minimization of a non-convex functional can recover the spikes for adequately chosen measurements (deterministic or random). To solve this problem, methods inspired by finite-dimensional sparse estimation, where a convex program is used, have been proposed. Greedy heuristics have also shown good practical results. However, little is known about the ideal non-convex minimization to perform. In this article, we study the shape of the global minimum of this non-convex functional: we give an explicit basin of attraction of the global minimum, which shows that the non-convex problem becomes easier as the number of measurements grows. This has important consequences for methods involving descent algorithms (such as greedy heuristics) and gives insights for potential improvements of such descent methods. »
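The basin-of-attraction phenomenon can be illustrated numerically. The toy sketch below (my own illustration, not the paper's algorithm; all parameters are arbitrary choices) runs plain gradient descent on the non-convex least-squares functional for a single off-the-grid spike measured with random Fourier frequencies, starting close to the true spike:

```python
import numpy as np

# Toy illustration: recover one off-the-grid spike a0 * delta_{t0} from
# random Fourier measurements y_j = a0 * exp(-1j * w_j * t0) by gradient
# descent on the non-convex functional g(a, t) = ||a * exp(-1j*w*t) - y||^2.
rng = np.random.default_rng(0)
m = 50                            # number of measurements
w = rng.normal(0, 2, m)           # random Fourier frequencies
a0, t0 = 2.0, 0.3                 # ground-truth amplitude and position
y = a0 * np.exp(-1j * w * t0)

def grad(a, t):
    e = np.exp(-1j * w * t)
    r = a * e - y                                        # residual
    da = np.real(np.sum(np.conj(e) * r))                 # dg/da (up to a factor 2)
    dt = np.real(np.sum(np.conj(-1j * w * a * e) * r))   # dg/dt (up to a factor 2)
    return da, dt

a, t = 1.5, 0.25                  # initialization inside the basin of attraction
step = 1e-3
for _ in range(3000):
    da, dt = grad(a, t)
    a, t = a - step * da, t - step * dt

print(a, t)                       # close to (a0, t0)
```

Starting further from (a0, t0), the same iteration can stall at a spurious critical point; this is why the size of the basin of attraction, and its growth with the number of measurements, is the quantity of interest.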

• I will be presenting a follow-up of our work on optimal regularization at ITWIST 2018 (November 21st-23rd) (abstract): « Is the 1-norm the best convex sparse regularization? », Yann Traonmilin, Samuel Vaiter and Rémi Gribonval. Stay tuned for the full manuscript to come!

**Abstract** : « The 1-norm is a good convex regularization for the recovery of sparse vectors from under-determined linear measurements. No other convex regularization seems to surpass its sparse recovery performance. How can this be explained? To answer this question, we define several notions of “best” (convex) regularization in the context of general low-dimensional recovery and show that the 1-norm is indeed an optimal convex sparse regularization within this framework. »

• I will be presenting the following work about the concept of optimal regularization at NCMIP 2018 (May 25th) and at the Institut de Mathématiques de Toulouse’s « séminaire MIP » (May 22nd):

« Optimality of 1-norm regularization among weighted 1-norms for sparse recovery: a case study on how to find optimal regularizations », Yann Traonmilin and Samuel Vaiter.

**Abstract** : « The 1-norm was proven to be a good convex regularizer for the recovery of sparse vectors from under-determined linear measurements. It has been shown that, with an appropriate measurement operator, a number of measurements of the order of the sparsity of the signal (up to log factors) is sufficient for stable and robust recovery. More recently, it has been shown that such recovery results can be generalized to more general low-dimensional model sets and (convex) regularizers. These results lead to the following question: to recover a given low-dimensional model set from linear measurements, what is the “best” convex regularizer? To approach this problem, we propose a general framework to define several notions of “best regularizer” with respect to a low-dimensional model. We show in the minimal case of sparse recovery in dimension 3 that the 1-norm is optimal for these notions. However, generalization of such results to the n-dimensional case seems out of reach. To tackle this problem, we propose looser notions of best regularizer and show that the 1-norm is optimal among weighted 1-norms for sparse recovery within this framework. »
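As a concrete reminder of the baseline these notions are measured against, here is a small basis pursuit experiment (illustrative only; the dimensions and measurement operator are arbitrary, not taken from the paper): the 1-norm minimizer, computed as a linear program, recovers a sparse vector from under-determined Gaussian measurements.

```python
import numpy as np
from scipy.optimize import linprog

# Toy basis pursuit: min ||x||_1 subject to M x = y, written as a linear
# program via the splitting x = u - v with u >= 0, v >= 0.
rng = np.random.default_rng(1)
n, m, s = 20, 12, 2                      # ambient dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.normal(0, 1, s)
M = rng.normal(0, 1, (m, n)) / np.sqrt(m)
y = M @ x_true

c = np.ones(2 * n)                       # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([M, -M])                # encodes M (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
print(np.max(np.abs(x_hat - x_true)))    # recovery error
```

At these dimensions (12 Gaussian measurements of a 2-sparse vector in dimension 20), exact recovery by the 1-norm is the typical behavior.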

• The final version of our article « One RIP to rule them all » is free to access for a limited time on the ACHA website: follow this link!

• With the support of the GDR MIA, we are organizing a thematic workshop on « Sparsity and applications » (journée parcimonie et application) in Bordeaux, May 3rd 2018. Check the website for our call for contributions and organizational details.

• I will be presenting my latest work on spikes super-resolution at the IOP seminar at IMB Bordeaux on November 9th, 2017, and at the SPOC seminar of the Institut de Mathématiques de Bourgogne in Dijon on December 13th, 2017.

• Our preprint on compressive statistical learning (including results about spikes super-resolution) is finally out!

Compressive Statistical Learning with Random Feature Moments, R. Gribonval, G. Blanchard, N. Keriven and Y. Traonmilin.

**Past News**

• My latest work on statistical learning, super-resolution and phase unmixing will be presented at the SPARS 2017 workshop. We have three communications:

Spikes super-resolution with random Fourier sampling, Y. Traonmilin, N. Keriven, R. Gribonval and G. Blanchard.

Random Moments for Sketched Mixture Learning, N. Keriven, R. Gribonval, G. Blanchard and Y. Traonmilin.

Signal Separation with Magnitude Constraints: a Phase Unmixing Problem, A. Deleforge and Y. Traonmilin.

• I will be presenting my latest work on compressive statistical learning and super-resolution (a follow-up of Compressive K-means) at the SMAI 2017 congress in June.

Super-résolution d’impulsions de Diracs par échantillonnage de Fourier aléatoire (Super-resolution of Dirac impulses by random Fourier sampling). Y. Traonmilin, N. Keriven, R. Gribonval and G. Blanchard. Abstract in French.

• A preprint of a digest of our work on compressed sensing in Hilbert spaces (with Gilles Puy, Rémi Gribonval and Mike E. Davies) is available:

Compressed sensing in Hilbert spaces. Y. Traonmilin, G. Puy, R. Gribonval and M. E. Davies

• Our work with Nicolas Keriven, Nicolas Tremblay and Rémi Gribonval has been accepted at ICASSP 2017. It showcases how a database can be compressed in order to perform the K-means clustering task on huge volumes of data:

Compressive K-means. N. Keriven, N. Tremblay, Y. Traonmilin and R. Gribonval

Abstract: « The Lloyd-Max algorithm is a classical approach to perform K-means clustering. Unfortunately, its cost becomes prohibitive as the training dataset grows large. We propose a compressive version of K-means (CKM), that estimates cluster centers from a sketch, i.e. from a drastically compressed representation of the training dataset. We demonstrate empirically that CKM performs similarly to Lloyd-Max, for a sketch size proportional to the number of centroids times the ambient dimension, and independent of the size of the original dataset. Given the sketch, the computational complexity of CKM is also independent of the size of the dataset. Unlike Lloyd-Max which requires several replicates, we further demonstrate that CKM is almost insensitive to initialization. For a large dataset of 10^7 data points, we show that CKM can run two orders of magnitude faster than five replicates of Lloyd-Max, with similar clustering performance on artificial data. Finally, CKM achieves lower classification errors on handwritten digits classification. »
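The sketching idea behind CKM can be illustrated in a few lines (a schematic of random Fourier moments, not the authors' code; the cluster geometry and sketch size are arbitrary): the whole dataset is collapsed into a mean of complex exponentials whose size depends only on the number of frequencies, and for tight clusters this sketch is essentially the sketch of the weighted centroids.

```python
import numpy as np

# Schematic sketching step: z = (1/N) * sum_i exp(1j * W @ x_i).
rng = np.random.default_rng(2)
d, K, N, m = 2, 3, 20_000, 60            # dimension, clusters, points, sketch size
centers = rng.normal(0, 5, (K, d))
labels = rng.integers(0, K, N)
X = centers[labels] + 0.05 * rng.normal(0, 1, (N, d))   # tight clusters

W = rng.normal(0, 1, (m, d))             # random frequency vectors
sketch = np.exp(1j * X @ W.T).mean(axis=0)   # m numbers, independent of N

# For tight clusters, the sketch is close to the sketch of the centroids
# weighted by the cluster proportions -- the information CKM decodes.
props = np.bincount(labels, minlength=K) / N
sketch_centers = props @ np.exp(1j * centers @ W.T)
print(np.max(np.abs(sketch - sketch_centers)))   # small
```

Everything downstream (the decoding of centroids from the sketch) then works on these m numbers, which is why the cost after sketching no longer depends on N.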

• Our work with Antoine Deleforge has been accepted at ICASSP 2017. This work opens an interesting new line of research in the domain of source separation.

Phase Unmixing: Multichannel Source Separation with Magnitude Constraints. A. Deleforge and Y. Traonmilin.

Abstract : « We consider the problem of estimating the phases of K mixed complex signals from a multichannel observation, when the mixing matrix and signal magnitudes are known. This problem can be cast as a non-convex quadratically constrained quadratic program which is known to be NP-hard in general. We propose three approaches to tackle it: a heuristic method, an alternate minimization method, and a convex relaxation into a semi-definite program. These approaches are shown to outperform the oracle multichannel Wiener filter in under-determined informed source separation tasks, using simulated and speech signals. The convex relaxation approach yields best results, including the potential for exact source separation in under-determined settings. »
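To make the problem concrete, here is a minimal projected-gradient baseline for this QCQP (my own illustration on a synthetic instance, not necessarily one of the paper's three methods): gradient steps on the data-fit term, each followed by reprojection onto the known magnitudes.

```python
import numpy as np

# Phase unmixing toy instance: observe b = A s with known mixing matrix A
# and known magnitudes r = |s|; estimate the phases of s.
rng = np.random.default_rng(3)
M, K = 8, 2
A = rng.normal(0, 1, (M, K)) + 1j * rng.normal(0, 1, (M, K))
r = np.array([1.0, 2.0])                      # known source magnitudes
phi = np.array([0.7, -1.2])                   # unknown phases to recover
b = A @ (r * np.exp(1j * phi))                # multichannel observation

def project(s):                               # enforce |s_k| = r_k
    return r * np.exp(1j * np.angle(s))

s = project(A.conj().T @ b)                   # matched-filter initialization
tau = 1.0 / np.linalg.norm(A, 2) ** 2         # step small enough for monotone descent
res0 = np.linalg.norm(A @ s - b)
for _ in range(200):
    s = project(s - tau * A.conj().T @ (A @ s - b))

print(res0, np.linalg.norm(A @ s - b))        # residual does not increase
```

The non-convexity comes entirely from the magnitude constraints: such a heuristic is only guaranteed not to increase the residual and can stop at a local minimum, which is what makes the semi-definite relaxation, with its potential for exact separation, the interesting contribution.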

• Our article Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all has been accepted for publication in Applied and Computational Harmonic Analysis (follow the link to access the « In press » ACHA version).

I gave a talk on September 13th at the IEEE Information Theory Workshop 2016 in Cambridge, UK.

Our latest preprint is available:

Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all

Yann Traonmilin and Rémi Gribonval. https://hal.archives-ouvertes.fr/hal-01207987

I presented my work on March 10th at the mathematics for image processing seminar of Descartes University and Télécom ParisTech in Paris.

I presented my work on January 29th, 2016, at the Rennes Statistics Seminar.

I presented my work with R. Gribonval at the GDR ISIS day in Marseille on October 8th: http://www.gdr-isis.fr/index.php?page=reunion&idreunion=280

Journée Science et Musique 2015, organised by the PANAMA team in Rennes, is on September 26th! http://jsm.irisa.fr/

I presented my work with R. Gribonval at the missDATA2015 conference in Rennes: http://missdata2015.agrocampus-ouest.fr/infoglueDeliverLive/