Ph.D. in Applied Mathematics

(Artificial Intelligence)


Learning Representations using Neural Networks and Optimal Transport

September 2016 - October 2020


Ph.D. Defense (in French) - 8th October 2020, 2 PM at the MAP5 lab in Paris
Webcast and video production: Jean Defontaine - Pierre Chosson // O·H·N·K

Chapters in the video:

Discussion with members of the jury:

I spent four wonderful years diverting neural networks (through so-called deep learning techniques) from their usual applications:

In layman's terms, these projects were respectively about:

These problems share a common scientific question: how should we represent data? To answer it, we revisit a mathematical concept called Optimal Transport with a widely known algorithmic tool called Neural Networks (an approach nicknamed “Deep Learning” since roughly 2010).
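To give a concrete flavour of Optimal Transport, here is a minimal, illustrative sketch (not the thesis code): the entropy-regularized transport plan between two small discrete distributions, computed with the classic Sinkhorn iterations. The point clouds, weights, and regularization strength below are arbitrary toy choices.

```python
import numpy as np

def sinkhorn(a, b, cost, eps=0.1, n_iter=200):
    """Entropy-regularized OT plan between histograms a and b (Sinkhorn)."""
    K = np.exp(-cost / eps)            # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)              # rescale columns toward marginal b
        u = a / (K @ v)                # rescale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Toy example: two point clouds on the real line with uniform weights
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.5, 1.5])
a = np.full(3, 1 / 3)                  # source weights
b = np.full(2, 1 / 2)                  # target weights
cost = (x[:, None] - y[None, :]) ** 2  # squared-distance cost matrix

plan = sinkhorn(a, b, cost)
print(plan)                            # 3x2 transport plan, entries sum to 1
```

The plan's row sums recover the source weights and its column sums recover the target weights, which is exactly the coupling constraint that Optimal Transport imposes.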

Great people like Pr. Charles Bouveyron (my academic thesis supervisor), Dr. Stéphane Raux (my corporate thesis supervisor), Dr. Pierre-Alexandre Mattei and Pr. Andrés Almansa did me the honour of helping me accomplish this work, in the warmth of the MAP5 lab and with the tenacity of the Oscaro company.

Ph.D. elements:

For ease of reading, the chapters are separated here:

Some unpublished works began during this thesis but do not appear in the manuscript (they are currently in progress):

Warith Harchaoui