Kilian Fatras

PhD Student
IRISA-INRIA Rennes & Obelix


Welcome to my personal website!

I am a final-year PhD candidate under the supervision of Prof. Nicolas Courty and Prof. Rémi Flamary, in the Panama and Obelix teams at IRISA-INRIA. My research focuses on optimal transport, machine learning, and optimization, with applications to large-scale settings and noisy labels.

I graduated from both Ecole Polytechnique and ENSTA ParisTech in applied mathematics and machine learning. I was also an exchange student at UC Berkeley during the fall of 2018. For my final master's internship, I worked at the University of British Columbia under the supervision of Prof. Mark Schmidt.

You can find my resume here.


Research interests


My work focuses on optimization for machine learning and on the interplay between optimal transport and machine learning.


Papers


Optimal Transport

POT: Python Optimal Transport
Rémi Flamary, Nicolas Courty, Alexandre Gramfort, Mokhtar Z. Alaya, Aurélie Boisbunon, Stanislas Chambon, Laetitia Chapel, Adrien Corenflos, Kilian Fatras, Nemo Fournier, Léo Gautheron, Nathalie T.H. Gayraud, Hicham Janati, Alain Rakotomamonjy, Ievgen Redko, Antoine Rolet, Antony Schutz, Vivien Seguy, Danica J. Sutherland, Romain Tavenard, Alexander Tong and Titouan Vayer
Journal of Machine Learning Research (JMLR) - Open Source Software, 2021

Paper Website Code Bibtex
  @article{JMLR:v22:20-451,
    author  = {R\'emi Flamary and Nicolas Courty and Alexandre Gramfort and Mokhtar Z. Alaya and Aur\'elie Boisbunon and Stanislas Chambon and Laetitia Chapel and Adrien Corenflos and Kilian Fatras and Nemo Fournier and L\'eo Gautheron and Nathalie T.H. Gayraud and Hicham Janati and Alain Rakotomamonjy and Ievgen Redko and Antoine Rolet and Antony Schutz and Vivien Seguy and Danica J. Sutherland and Romain Tavenard and Alexander Tong and Titouan Vayer},
    title   = {POT: Python Optimal Transport},
    journal = {Journal of Machine Learning Research},
    year    = {2021},
    volume  = {22},
    number  = {78},
    pages   = {1-8},
    url     = {http://jmlr.org/papers/v22/20-451.html}
  }
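
As a quick taste of the library, here is a minimal usage sketch (plain POT calls such as ot.dist, ot.emd2 and ot.sinkhorn2; the point clouds and parameters are illustrative, not from the paper):

import numpy as np
import ot  # POT: Python Optimal Transport

rng = np.random.RandomState(0)
n = 50
xs = rng.randn(n, 2)            # source samples
xt = rng.randn(n, 2) + 1.0      # target samples, shifted

a = np.full(n, 1.0 / n)         # uniform source weights
b = np.full(n, 1.0 / n)         # uniform target weights
M = ot.dist(xs, xt)             # squared Euclidean cost matrix

exact = ot.emd2(a, b, M)                    # exact OT cost (network simplex)
entropic = ot.sinkhorn2(a, b, M, reg=0.1)   # entropic-regularized cost
print(exact, entropic)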

Unbalanced minibatch Optimal Transport; applications to Domain Adaptation
Kilian Fatras, Thibault Séjourné, Nicolas Courty and Rémi Flamary
International Conference on Machine Learning (ICML), 2021
Keywords: Unbalanced Optimal Transport, Minibatch, Concentration Bounds, (Partial) Domain Adaptation

Paper ArXiv Code Bibtex
@InProceedings{pmlr-v139-fatras21a,
title = 	 {Unbalanced minibatch Optimal Transport; applications to Domain Adaptation},
author =       {Fatras, Kilian and Sejourne, Thibault and Flamary, R{\'e}mi and Courty, Nicolas},
booktitle = 	 {Proceedings of the 38th International Conference on Machine Learning},
pages = 	 {3186--3197},
year = 	 {2021},
editor = 	 {Meila, Marina and Zhang, Tong},
volume = 	 {139},
series = 	 {Proceedings of Machine Learning Research},
month = 	 {18--24 Jul},
publisher =    {PMLR},
pdf = 	 {http://proceedings.mlr.press/v139/fatras21a/fatras21a.pdf},
url = 	 {http://proceedings.mlr.press/v139/fatras21a.html},
abstract = 	 {Optimal transport distances have found many applications in machine learning for their capacity to compare non-parametric probability distributions. Yet their algorithmic complexity generally prevents their direct use on large scale datasets. Among the possible strategies to alleviate this issue, practitioners can rely on computing estimates of these distances over subsets of data, i.e. minibatches. While computationally appealing, we highlight in this paper some limits of this strategy, arguing it can lead to undesirable smoothing effects. As an alternative, we suggest that the same minibatch strategy coupled with unbalanced optimal transport can yield more robust behaviors. We discuss the associated theoretical properties, such as unbiased estimators, existence of gradients and concentration bounds. Our experimental study shows that in challenging problems associated to domain adaptation, the use of unbalanced optimal transport leads to significantly better results, competing with or surpassing recent baselines.}
}  
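
The computational core of the minibatch strategy is simple to sketch with POT. The helper below is a hypothetical illustration, not the paper's training loss: it averages the entropic unbalanced OT cost (ot.sinkhorn_unbalanced2) over random minibatch pairs; swapping in ot.emd2 recovers the balanced minibatch estimator analyzed in the papers below.

import numpy as np
import ot

def minibatch_uot(xs, xt, m=32, k=10, reg=0.05, reg_m=1.0, seed=0):
    # Hypothetical helper: average the entropic unbalanced OT cost
    # over k pairs of size-m minibatches drawn from xs and xt.
    rng = np.random.RandomState(seed)
    w = np.full(m, 1.0 / m)  # uniform weights within a minibatch
    total = 0.0
    for _ in range(k):
        i = rng.choice(len(xs), m, replace=False)
        j = rng.choice(len(xt), m, replace=False)
        M = ot.dist(xs[i], xt[j])  # squared Euclidean cost matrix
        total += float(ot.sinkhorn_unbalanced2(w, w, M, reg, reg_m))
    return total / k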

Minibatch Optimal Transport distances; analysis and applications
Kilian Fatras, Younes Zine, Szymon Majewski, Rémi Flamary, Rémi Gribonval and Nicolas Courty
Preprint, 2021
Keywords: Optimal Transport, Minibatch, Concentration Bounds, GANs, Sub-Gaussian data

ArXiv Code Bibtex
  @misc{fatras2021minibatch,
    title={Minibatch optimal transport distances; analysis and applications}, 
    author={Kilian Fatras and Younes Zine and Szymon Majewski and Rémi Flamary and Rémi Gribonval and Nicolas Courty},
    year={2021},
    eprint={2101.01792},
    archivePrefix={arXiv},
    primaryClass={stat.ML}
}

Generating natural adversarial Remote Sensing Images
Jean-Christophe Burnel, Kilian Fatras, Rémi Flamary and Nicolas Courty
IEEE Transactions on Geoscience and Remote Sensing (TGRS), 2021
Keywords: Optimal Transport, GANs, Adversarial Examples, Remote Sensing

ArXiv Code Bibtex
@ARTICLE{burnel2021,
  author={Burnel, Jean-Christophe and Fatras, Kilian and Flamary, R{\'e}mi and Courty, Nicolas},
  journal={IEEE Transactions on Geoscience and Remote Sensing}, 
  title={Generating natural adversarial Remote Sensing Images}, 
  year={(to appear) 2021}}

Learning with minibatch Wasserstein: asymptotic and gradient properties
Kilian Fatras, Younes Zine, Rémi Flamary, Rémi Gribonval and Nicolas Courty
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), 2020
Keywords: Optimal Transport, Minibatch, Concentration Bounds, Large Scale Color Transfer

Paper ArXiv Code Slides Poster Blog Bibtex
@InProceedings{pmlr-v108-fatras20a,
  title = 	 {Learning with minibatch Wasserstein  : asymptotic and gradient properties},
  author = 	 {Fatras, Kilian and Zine, Younes and Flamary, R\'emi and Gribonval, Remi and Courty, Nicolas},
  booktitle = 	 {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages = 	 {2131--2141},
  year = 	 {2020},
  editor = 	 {Chiappa, Silvia and Calandra, Roberto},
  volume = 	 {108},
  series = 	 {Proceedings of Machine Learning Research},
  address = 	 {Online},
  month = 	 {26--28 Aug},
  publisher = 	 {PMLR},
  pdf = 	 {http://proceedings.mlr.press/v108/fatras20a/fatras20a.pdf},
  url = 	 {http://proceedings.mlr.press/v108/fatras20a.html},
  abstract = 	 {Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these distances on minibatches i.e., they average the outcome of several smaller optimal transport problems. We propose in this paper an analysis of this practice, which effects are not well understood so far. We notably argue that it is equivalent to an implicit regularization of the original problem, with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with defects such as loss of distance property. Along with this theoretical analysis, we also conduct empirical experiments on gradient flows, GANs or color transfer that highlight the practical interest of this strategy.}
}
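
The smoothing effect mentioned in the abstract is easy to observe numerically. In the illustrative snippet below (standard POT calls; sizes are arbitrary), the minibatch estimator deviates from the full OT cost between the same two empirical distributions; since the OT cost is jointly convex in its marginals, Jensen's inequality makes the minibatch value an overestimate in expectation.

import numpy as np
import ot

rng = np.random.RandomState(0)
n, m, k = 200, 20, 50
xs = rng.randn(n, 2)
xt = rng.randn(n, 2) + 0.5
w_n, w_m = np.full(n, 1.0 / n), np.full(m, 1.0 / m)

full = ot.emd2(w_n, w_n, ot.dist(xs, xt))  # OT cost on all samples
mb = np.mean([
    ot.emd2(w_m, w_m, ot.dist(xs[rng.choice(n, m, replace=False)],
                              xt[rng.choice(n, m, replace=False)]))
    for _ in range(k)
])
print(full, mb)  # mb typically exceeds full for small m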

Wasserstein Adversarial Regularization for learning with label noise
Kilian Fatras*, Bharath Damodaran*, Sylvain Lobry, Rémi Flamary, Devis Tuia and Nicolas Courty
* equal contribution
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2021
Keywords: Optimal Transport, Adversarial Training, label noise, Remote Sensing

Paper ArXiv Code Bibtex
@ARTICLE{Fatras2021WAR,
author={Fatras, Kilian and Damodaran, Bharath Bhushan and Lobry, Sylvain and Flamary, Remi and Tuia, Devis and Courty, Nicolas},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
title={Wasserstein Adversarial Regularization for learning with label noise}, 
year={2021},
doi={10.1109/TPAMI.2021.3094662}}

Optimization

Proximal Splitting Meets Variance Reduction
Fabian Pedregosa, Kilian Fatras and Mattia Casotto
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019
Keywords: Proximal Splitting, Variance Reduction, Sparse Update

Paper ArXiv Code Bibtex
@InProceedings{pmlr-v89-pedregosa19a,
title = 	 {Proximal Splitting Meets Variance Reduction},
author =       {Pedregosa, Fabian and Fatras, Kilian and Casotto, Mattia},
booktitle = 	 {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
pages = 	 {1--10},
year = 	 {2019},
editor = 	 {Chaudhuri, Kamalika and Sugiyama, Masashi},
volume = 	 {89},
series = 	 {Proceedings of Machine Learning Research},
month = 	 {16--18 Apr},
publisher =    {PMLR},
pdf = 	 {http://proceedings.mlr.press/v89/pedregosa19a/pedregosa19a.pdf},
url = 	 {http://proceedings.mlr.press/v89/pedregosa19a.html},
abstract = 	 {Despite the raise to fame of stochastic variance reduced methods like SAGA and ProxSVRG, their use in non-smooth optimization is still limited to a few simple cases. Existing methods require to compute the proximal operator of the non-smooth term at each iteration, which, for complex penalties like the total variation, overlapping group lasso or trend filtering, is an iterative process that becomes unfeasible for moderately large problems. In this work we propose and analyze VRTOS, a variance-reduced method to solve problems with an arbitrary number of non-smooth terms. Like other variance reduced methods, it only requires to evaluate one gradient per iteration and converges with a constant step size, and so is ideally suited for large scale applications. Unlike existing variance reduced methods, it admits multiple non-smooth terms whose proximal operator only needs to be evaluated once per iteration. We provide a convergence rate analysis for the proposed methods that achieves the same asymptotic rate as their full gradient variants and illustrate its computational advantage on 4 different large scale datasets.}
}
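
To make the "variance reduction meets proximal splitting" pattern concrete, here is a minimal SAGA-style proximal method for the lasso. This is only a sketch of the general recipe (a variance-reduced gradient followed by one proximal step), not the VRTOS algorithm of the paper, which handles several non-smooth terms at once:

import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_saga(A, b, lam, step, n_epochs=50, seed=0):
    # SAGA-style variance-reduced proximal method for the lasso:
    #   min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1
    rng = np.random.RandomState(seed)
    n, d = A.shape
    x = np.zeros(d)
    grads = np.zeros((n, d))      # memory of past per-sample gradients
    avg = grads.mean(axis=0)      # running average of the memory
    for _ in range(n_epochs * n):
        i = rng.randint(n)
        g_new = (A[i] @ x - b[i]) * A[i]       # gradient of sample i
        v = g_new - grads[i] + avg             # variance-reduced direction
        avg += (g_new - grads[i]) / n          # keep the average in sync
        grads[i] = g_new
        x = prox_l1(x - step * v, step * lam)  # proximal (shrinkage) step
    return x

A step size on the order of 1 / (3 L), with L the largest squared row norm of A, is a safe default for methods of this kind.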


Workshops


This section is empty for the moment!

Talks


01/09/21 - CMAP, Ecole Polytechnique: Unbalanced minibatch Optimal Transport; applications to Domain Adaptation
28/04/21 - Montréal Machine Learning and Optimization (MTL MLOpt): Unbalanced minibatch Optimal Transport; applications to Domain Adaptation
09/07/19 - GDR-ISIS: Optimal transport in statistical learning and signal processing

Projects and Volunteering


Here is a list of my volunteering activities and the different projects I contribute to:

  1. Python Optimal Transport (POT) is an open-source library for optimal transport in Python.
  2. Reviewer for JMLR, ICML, ECML, JOTA, IEEE TGRS.

Contacts


I am present on several networks; do not hesitate to reach out to me!