Kilian Fatras

PhD Student
IRISA-INRIA Rennes & Obelix


Welcome to my personal website!

I am a PhD candidate under the supervision of Prof. Nicolas Courty and Prof. Rémi Flamary at IRISA-INRIA, in the Panama and Obelix teams. My research focuses on optimal transport, machine learning, and optimization.

I graduated from both École Polytechnique and ENSTA ParisTech in applied mathematics and machine learning. I was also an exchange student at UC Berkeley during the fall of 2018. For my final master's internship, I worked at the University of British Columbia under the supervision of Prof. Mark Schmidt.

You can find my resume here.


Research interests


My work focuses on optimization for machine learning and the interaction between optimal transport and machine learning.


Papers


Optimal Transport

[NEW!] Pushing the right boundaries matters! Wasserstein Adversarial Training for Label Noise
Bharath Damodaran*, Kilian Fatras*, Sylvain Lobry, Remi Flamary, Devis Tuia and Nicolas Courty
Preprint
* equal contribution

Abstract: Noisy labels often occur in vision datasets, especially when they come from crowdsourcing or Web scraping. In this paper, we propose a new regularization method which enables one to learn robust classifiers in the presence of noisy data. To achieve this goal, we augment the virtual adversarial loss with a Wasserstein distance. This distance allows us to take into account specific relations between classes by leveraging the geometric properties of this optimal transport distance. Notably, we encode the class similarities in the ground cost that is used to compute the Wasserstein distance. As a consequence, we can promote smoothness between classes that are very dissimilar, while keeping the classification decision function sufficiently complex for similar classes. While designing this ground cost can be left as a problem-specific modeling task, we show in this paper that using the semantic relations between class names already leads to good results. Our proposed Wasserstein Adversarial Training (WAT) outperforms the state of the art on four datasets corrupted with noisy labels: three classical benchmarks and one real case in remote sensing image semantic segmentation.
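To give a flavor of the idea above (this is a toy illustration, not the paper's implementation), one can compare the cost of moving probability mass between similar versus dissimilar classes under a hand-crafted ground cost. The sketch below uses plain NumPy Sinkhorn iterations for entropic-regularized optimal transport; the cost matrix and the distributions are made-up examples.

```python
import numpy as np

def sinkhorn(a, b, M, reg=0.1, n_iters=200):
    """Entropic-regularized OT cost between histograms a and b
    with ground cost M, via Sinkhorn matrix scaling."""
    K = np.exp(-M / reg)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)            # scale to match column marginals
        u = a / (K @ v)              # scale to match row marginals
    P = u[:, None] * K * v[None, :]  # transport plan
    return float(np.sum(P * M))      # transport cost

# Made-up ground cost between 3 classes: classes 0 and 1 are
# semantically close (cheap to confuse), class 2 is far from both.
M = np.array([[0.0, 0.2, 1.0],
              [0.2, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

p = np.array([0.8, 0.1, 0.1])        # predicted class distribution
q_near = np.array([0.1, 0.8, 0.1])   # mass shifted to a similar class
q_far = np.array([0.1, 0.1, 0.8])    # mass shifted to a dissimilar class

# Shifting mass between similar classes is much cheaper:
print(sinkhorn(p, q_near, M), "<", sinkhorn(p, q_far, M))
```

Encoding class similarity in the ground cost is what lets the Wasserstein distance penalize confusions between dissimilar classes more heavily than confusions between similar ones.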

@article{damodaran2019wat,
author = {Bhushan Damodaran, Bharath and Fatras, Kilian and Lobry, Sylvain and Flamary, Rémi and Tuia, Devis and Courty, Nicolas},
title = {Pushing the right boundaries matters! Wasserstein Adversarial Training for Label Noise},
year = {2019},
note = {Submitted}
}

Optimization

Proximal Splitting Meets Variance Reduction
Fabian Pedregosa, Kilian Fatras and Mattia Casotto.
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019

Abstract: Despite the rise to fame of incremental variance-reduced methods in recent years, their use in nonsmooth optimization is still limited to a few simple cases. This is due to the fact that existing methods require evaluating the proximity operator for the nonsmooth terms, which can be a costly operation for complex penalties. In this work we introduce two variance-reduced incremental methods based on SAGA and SVRG that can efficiently take into account complex penalties which can be expressed as a sum of proximal terms. This includes penalties such as total variation, group lasso with overlap and trend filtering, to name a few. Furthermore, we also develop sparse variants of the proposed algorithms which can take advantage of sparsity in the input data. Like other incremental methods, they only require evaluating the gradient of a single sample per iteration, and so are ideally suited for large-scale applications. We provide a convergence rate analysis for the proposed methods and show that they converge with a fixed step size, achieving in some cases the same asymptotic rate as their full-gradient variants. Empirical benchmarks on three different datasets illustrate the practical advantages of the proposed methods.
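As a minimal sketch of the kind of method the paper builds on (this is the classical SAGA update with a single soft-thresholding prox on a toy lasso problem, not the paper's algorithm, which handles sums of several proximal terms), the combination of a per-sample variance-reduced gradient with a proximal step looks as follows. All problem sizes and step sizes here are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_saga(A, b, lam, step, n_epochs=100, seed=0):
    """SAGA with a proximal step for min_x (1/2n)||Ax - b||^2 + lam*||x||_1.
    Keeps a table of one stored gradient per sample (variance reduction)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grads = A * (A @ x - b)[:, None]   # stored per-sample gradients
    avg = grads.mean(axis=0)
    for _ in range(n_epochs * n):
        i = rng.integers(n)
        g_new = A[i] * (A[i] @ x - b[i])
        v = g_new - grads[i] + avg      # variance-reduced gradient estimate
        x = soft_threshold(x - step * v, step * lam)  # gradient + prox step
        avg += (g_new - grads[i]) / n   # keep the table average up to date
        grads[i] = g_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.0, 0.5]           # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = prox_saga(A, b, lam=0.05, step=0.01)
```

Only one sample gradient is evaluated per iteration, yet the estimate `v` is unbiased with shrinking variance, which is what allows a fixed step size; the paper's contribution is making this work when the prox itself decomposes into several costly terms.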

@InProceedings{Pedregosa2019PSVR,
  author    = {Pedregosa, Fabian and Fatras, Kilian and Casotto, Mattia},
  title     = {Proximal Splitting Meets Variance Reduction},
  booktitle = {AISTATS},
  year      = {2019},
  note      = {(to appear)}
}


Workshops


This section is empty for the moment!

Projects and Volunteering


Here is a list of my volunteering activities and the projects I have contributed to:

  1. Python for Optimal Transport (POT) is an open source library for optimal transport in Python.
  2. Reviewer for JMLR.

Contacts


I am active on several networks; do not hesitate to reach out to me!