I am a Research Scientist at DeepMind. Before that, I was the Laplace Postdoctoral Chair in Data Science at École normale supérieure in Paris, where I worked with Emmanuel Dupoux and the CoML team. Before that, I was a PhD student at Carnegie Mellon University in Graham Neubig's NeuLab. In the past, I have also interned at Facebook and DeepMind. You can find my full CV here and the list of my publications there or on Google Scholar.
My overall research goal is to develop machine learning models that can understand and interact with natural language. I am specifically interested in topics such as distribution shift, emergent communication, and understanding deep learning models for NLP.
I also used to be an active contributor to DyNet, a toolkit for dynamic neural networks. Check it out!
Other than that I like reading sci-fi, sleeping, eating and playing video games. Recently I've picked up miniature painting and watercolor painting.
Before my PhD, I was an "élève ingénieur" (engineering student) at École polytechnique in France.
My email is pmichel31415[at]gmail.com. You can also find me on Twitter, where I mostly tweet about my own work.
- I just joined DeepMind in London as a Research Scientist.
- Two papers accepted at ICLR 2022! My follow-up work on parametric DRO, "Distributionally Robust Models with Parametric Likelihood Ratios", and Lucio Dery's paper "Should We Be Pre-training? An Argument for End-task Aware Training as an Alternative".
- I successfully defended my PhD thesis! Many thanks to my thesis committee members, Graham Neubig, Tatsunori Hashimoto, Zachary Lipton, and Zico Kolter. The full thesis document is now on arXiv.
Some projects I've been doing on the side (not so) recently:
- RIP, a bot composing iambic pentameters from Reddit comments
- Math-oriented tutorials for DyNet (featuring drawing a fractal or denoising an image)