Kilian Fatras

PhD Student
IRISA-INRIA Rennes & Obelix


Welcome to my personal website!

I am a second-year PhD candidate under the supervision of Prof. Nicolas Courty and Prof. Rémi Flamary at IRISA-INRIA Panama and Obelix. My research focuses on optimal transport, machine learning and optimization, with applications to large-scale settings and learning with noisy labels.

I graduated from both École Polytechnique and ENSTA ParisTech in applied mathematics and machine learning. I was also an exchange student at UC Berkeley during the fall of 2018. I completed my final master's internship at the University of British Columbia under the supervision of Prof. Mark Schmidt.

You can find my resume here.


News!


  1. Our submission Learning with minibatch Wasserstein: asymptotic and gradient properties has been accepted to AISTATS 2020!

Research interests


My work focuses on optimization for machine learning and the interaction between optimal transport and machine learning.


Papers


Optimal Transport

Generating natural adversarial Remote Sensing Images
Jean-Christophe Burnel*, Kilian Fatras*, Rémi Flamary and Nicolas Courty
* equal contribution
Preprint

Paper bibtex
@unpublished{burnelARGAN,
  TITLE = {{Generating natural adversarial Remote Sensing Images}},
  AUTHOR = {Burnel, Jean-Christophe and Fatras, Kilian and Flamary, R{\'e}mi and Courty, Nicolas},
  URL = {https://hal.archives-ouvertes.fr/hal-02558542},
  NOTE = {working paper or preprint},
  YEAR = {2020},
  MONTH = Apr,
  KEYWORDS = {Deep Learning ; Remote sensing ; Generative models ; Adversarial Examples},
  PDF = {https://hal.archives-ouvertes.fr/hal-02558542/file/ARGAN_TGRS_hal.pdf},
  HAL_ID = {hal-02558542},
}

Learning with minibatch Wasserstein: asymptotic and gradient properties
Kilian Fatras, Younes Zine, Rémi Flamary, Rémi Gribonval and Nicolas Courty
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), 2020

Paper Code Slides Poster Blog bibtex

@InProceedings{pmlr-v108-fatras20a,
  title     = {Learning with minibatch Wasserstein: asymptotic and gradient properties},
  author    = {Fatras, Kilian and Zine, Younes and Flamary, R{\'e}mi and Gribonval, R{\'e}mi and Courty, Nicolas},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2131--2141},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  address   = {Online},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/fatras20a/fatras20a.pdf},
  url       = {http://proceedings.mlr.press/v108/fatras20a.html},
  abstract  = {Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these distances on minibatches i.e., they average the outcome of several smaller optimal transport problems. We propose in this paper an analysis of this practice, which effects are not well understood so far. We notably argue that it is equivalent to an implicit regularization of the original problem, with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with defects such as loss of distance property. Along with this theoretical analysis, we also conduct empirical experiments on gradient flows, GANs or color transfer that highlight the practical interest of this strategy.}
}
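The minibatch strategy analyzed in this paper can be illustrated with a small NumPy-only sketch. It uses the closed form of the 1D Wasserstein-1 distance (sorted-sample matching) so it runs without any OT solver; the function names and parameter values are mine, not from the paper.

```python
import numpy as np

def wasserstein_1d(x, y):
    # Exact 1D Wasserstein-1 distance between two equal-size empirical
    # distributions: mean absolute difference of the sorted samples.
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def minibatch_wasserstein(x, y, batch_size=64, n_batches=100, seed=0):
    # Minibatch estimator: average the transport cost over several small
    # subsampled OT problems instead of solving the full-size one.
    rng = np.random.default_rng(seed)
    costs = []
    for _ in range(n_batches):
        xb = rng.choice(x, size=batch_size, replace=False)
        yb = rng.choice(y, size=batch_size, replace=False)
        costs.append(wasserstein_1d(xb, yb))
    return float(np.mean(costs))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)
y = rng.normal(2.0, 1.0, size=5000)
full = wasserstein_1d(x, y)       # exact cost on all samples, close to 2
mb = minibatch_wasserstein(x, y)  # cheap averaged minibatch estimate
```

As the paper shows, this estimator concentrates around its expectation but does not define a distance in general; the sketch only conveys the averaging-over-minibatches idea.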

Wasserstein Adversarial Regularization (WAR) on label noise
Bharath Damodaran*, Kilian Fatras*, Sylvain Lobry, Rémi Flamary, Devis Tuia and Nicolas Courty
* equal contribution
Preprint

Paper bibtex
@article{damodaran2019war,
  author = {Bhushan Damodaran, Bharath and Fatras, Kilian and Lobry, Sylvain and Flamary, R{\'e}mi and Tuia, Devis and Courty, Nicolas},
  title  = {Wasserstein Adversarial Regularization (WAR) on label noise},
  year   = {2019 (Submitted)}
}

Optimization

Proximal Splitting Meets Variance Reduction
Fabian Pedregosa, Kilian Fatras and Mattia Casotto.
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019

Paper bibtex
@InProceedings{Pedregosa2019PSVR,
  author    = {Pedregosa, Fabian and Fatras, Kilian and Casotto, Mattia},
  title     = {Proximal Splitting Meets Variance Reduction},
  booktitle = {AISTATS},
  year      = {2019}
}


Workshops


This section is empty for the moment!

Projects and Volunteering


Here is a list of my volunteering activities and the projects I contribute to:

  1. Python Optimal Transport (POT) is an open-source Python library for optimal transport.
  2. Reviewer for JMLR.

Contacts


I am active on several networks, so do not hesitate to reach out!