We live in a world where vast amounts of data are generated by a growing variety of sources in challenging real-world domains, and it is increasingly necessary to employ methods from computational statistics and machine learning to infer structure and extract patterns from data.
Often the assumptions underlying this structure have a simple and intuitive interpretation, but developing efficient algorithms is challenging and involves trade-offs between statistical guarantees and the computational constraints imposed by large datasets.
Machine learning is also a driving force behind recent efforts to build general artificial intelligence systems, where deep learning methods are achieving state-of-the-art results in a number of data-driven disciplines (computer vision, speech recognition, etc.). This poses new challenges to theoreticians, with many important questions yet to be answered.
Our research focuses on machine learning theory and algorithms, with emphasis on kernel methods, multitask and transfer learning, online learning, sparsity regularization, and statistical learning theory. Our work draws inspiration from different disciplines of mathematics and statistics, including approximation theory, empirical processes, and numerical optimization. Our research has a broad multidisciplinary component, with applications spanning computer vision, bioinformatics, and user modelling, among others.
- North Carolina State University (P.L. Combettes)
- INSEAD (T. Evgeniou)
- Università di Firenze (P. Frasconi)
- University of Oxford (R. Hauser)
- University College London (N. Bianchi-Berthouze, M. Herbster, J. Mourao-Miranda)
- University of Cambridge (C. Mascolo)
- SUNY Albany (C.A. Micchelli)
- ENSAE Paris Tech (P. Alquier and A. Tsybakov)