Felix Petersen
Mail | Phone | Twitter | Google Scholar | LinkedIn | GitHub | YouTube | Schedule an Appointment

My primary research interest is machine learning with differentiable algorithms. For example, I have developed a general framework for making algorithms differentiable, and have also focused on differentiable logic gate networks, differentiable sorting, and differentiable rendering. Beyond differentiable algorithms, my work on differentiability extends to various domains, including stochastic gradient estimation, analytical distribution propagation, second-order optimization, uncertainty quantification, domain adaptation, individual fairness, and efficient neural architectures.

I am a postdoctoral researcher at Stanford University in Stefano Ermon's group, collaborating with Christian Borgelt, Hilde Kuehne, Mikhail Yurochkin, Yuekai Sun, and Oliver Deussen, among others.

I have previously worked, among other places, at the University of Konstanz, TAU, DESY, PSI, and CERN.


News

Mar 2024 Our paper "Grounding Everything: Emerging Localization Properties in Vision-Language Transformers" was accepted to CVPR 2024! Congrats to Walid!

Jan 2024 Our paper "Uncertainty Quantification via Stable Distribution Propagation" was accepted to ICLR 2024!

Jul 2023 Our paper "Learning by Sorting: Self-supervised Learning with Group Ordering Constraints" was accepted to ICCV 2023!

May 2023 We released the Call for Papers for our ICML 2023 Workshop. Consider submitting a 4-page paper and joining us in Hawaii on July 28: differentiable.xyz

Apr 2023 Our paper "Neural Machine Translation for Mathematical Formulae" was accepted to ACL 2023!

Apr 2023 Our workshop "Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators" has been accepted for ICML 2023!

Feb 2023 Our paper "ISAAC Newton: Input-based Approximate Curvature for Newton's Method" was accepted to ICLR 2023!

Oct 2022 Our papers "Deep Differentiable Logic Gate Networks" and "Domain Adaptation meets Individual Fairness. And they get along" were accepted to NeurIPS 2022!

Jun 2022 I submitted my thesis on "Learning with Differentiable Algorithms"!

Jun 2022 Our paper "Differentiable Top-k Classification Learning" was accepted to ICML!

Mar 2022 Our paper "GenDR: A Generalized Differentiable Renderer" was accepted to CVPR!

Feb 2022 Our paper "Monotonic Differentiable Sorting Networks" was accepted to ICLR!

Oct 2021 Our papers "Learning with Algorithmic Supervision via Continuous Relaxations" and "Post-processing for Individual Fairness" were accepted to NeurIPS!

Research

The focus of my research is differentiability and the study of making non-differentiable operations differentiable. Differentiable relaxations enable a plethora of optimization tasks: from optimizing logic gate networks [1] and optimizing through the 3D rendering pipeline [2, 3, 4] to differentiating sorting and ranking [5, 6] for supervised [7] and self-supervised [8] learning. Beyond differentiable algorithms, this branches out into various domains including stochastic gradient estimation [9], analytical distribution propagation [10], second-order optimization [9, 11], uncertainty quantification [10], fairness [12, 13], and efficient neural architectures [1, 14].
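To illustrate the core idea of a differentiable relaxation (a toy sketch of relaxed ranking via pairwise sigmoid comparisons, not the construction from any particular paper above; the function name `soft_rank` and the temperature parameter `tau` are my own):

```python
import numpy as np

def soft_rank(x, tau=0.1):
    """Toy differentiable relaxation of ranking.

    The hard rank of x[i] is 1 + the number of elements smaller
    than x[i]. Replacing the hard comparison with a sigmoid makes
    the rank a smooth function of the input, so gradients exist.
    """
    x = np.asarray(x, dtype=float)
    # pairwise differences: diff[i, j] = x[j] - x[i]
    diff = x[None, :] - x[:, None]
    # smooth indicator of "x[j] < x[i]"
    smaller = 1.0 / (1.0 + np.exp(diff / tau))
    # rank = 1 + soft count of smaller elements;
    # subtract 0.5 to cancel the self-comparison (sigmoid(0) = 0.5)
    return 1.0 + smaller.sum(axis=1) - 0.5
```

As tau approaches 0, the sigmoids approach hard comparators and the exact integer ranks are recovered; a larger tau trades ranking accuracy for smoother, better-behaved gradients.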


Learning by Sorting: Self-supervised Learning with Group Ordering Constraints
Nina Shvetsova, Felix Petersen, Anna Kukleva, Bernt Schiele, Hilde Kuehne
in Proceedings of the International Conference on Computer Vision (ICCV 2023)


Neural Machine Translation for Mathematical Formulae
Felix Petersen, Moritz Schubotz, Andre Greiner-Petter, Bela Gipp
in Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023)

YouTube


ISAAC Newton: Input-based Approximate Curvature for Newton's Method
Felix Petersen, Tobias Sutter, Christian Borgelt, Dongsung Huh, Hilde Kuehne, Yuekai Sun, Oliver Deussen
in Proceedings of the International Conference on Learning Representations (ICLR 2023)

YouTube      Code


Deep Differentiable Logic Gate Networks
Felix Petersen, Christian Borgelt, Hilde Kuehne, Oliver Deussen
in Proceedings of the 36th International Conference on Neural Information Processing Systems (NeurIPS 2022)

Code


Domain Adaptation meets Individual Fairness. And they get along.
Debarghya Mukherjee*, Felix Petersen*, Mikhail Yurochkin, Yuekai Sun
in Proceedings of the 36th International Conference on Neural Information Processing Systems (NeurIPS 2022)


Learning with Differentiable Algorithms
Felix Petersen
PhD thesis (summa cum laude), University of Konstanz


Differentiable Top-k Classification Learning
Felix Petersen, Hilde Kuehne, Christian Borgelt, Oliver Deussen
in Proceedings of the 39th International Conference on Machine Learning (ICML 2022)

YouTube      Code


GenDR: A Generalized Differentiable Renderer
Felix Petersen, Christian Borgelt, Bastian Goldluecke, Oliver Deussen
in Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR 2022)

YouTube      Code


Monotonic Differentiable Sorting Networks
Felix Petersen, Christian Borgelt, Hilde Kuehne, Oliver Deussen
in Proceedings of the International Conference on Learning Representations (ICLR 2022)

YouTube      Code / diffsort library

Style Agnostic 3D Reconstruction via Adversarial Style Transfer
Felix Petersen, Hilde Kuehne, Bastian Goldluecke, Oliver Deussen
in Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV 2022)

YouTube



Learning with Algorithmic Supervision via Continuous Relaxations
Felix Petersen, Christian Borgelt, Hilde Kuehne, Oliver Deussen
in Proceedings of the 35th International Conference on Neural Information Processing Systems (NeurIPS 2021)

YouTube      Code / AlgoVision library



Post-processing for Individual Fairness
Felix Petersen*, Debarghya Mukherjee*, Yuekai Sun, Mikhail Yurochkin
in Proceedings of the 35th International Conference on Neural Information Processing Systems (NeurIPS 2021)

YouTube      Code


Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision
Felix Petersen, Christian Borgelt, Hilde Kuehne, Oliver Deussen
in Proceedings of the 38th International Conference on Machine Learning (ICML 2021)

YouTube



AlgoNet: C∞ Smooth Algorithmic Neural Networks
Felix Petersen, Christian Borgelt, Oliver Deussen


Pix2Vex: Image-to-Geometry Reconstruction using a Smooth Differentiable Renderer
Felix Petersen, Amit H. Bermano, Oliver Deussen, Daniel Cohen-Or


Towards Formula Translation using Recursive Neural Networks
Felix Petersen, Moritz Schubotz, Bela Gipp
in Proceedings of the 11th Conference on Intelligent Computer Mathematics (CICM), 2018


LaTeXEqChecker - A framework for checking mathematical semantics in LaTeX documents
Felix Petersen
Presented in the Special Session of the 11th Conference on Intelligent Computer Mathematics (CICM), 2018

Slides

Media Coverage

campus.kn article about my Schülerstudium (studying while in school):
English: “At university, knowledge counts more than age” – “Im Gespräch” interviews Felix Petersen, who completed a Schülerstudium
German: „An der Uni ist die Expertise wichtiger als das Alter“ – Im Gespräch mit Schülerstudium-Absolvent Felix Petersen



Südkurier article about me as a 19-year-old PhD student: "Dieser junge Mann ist 19 Jahre alt – und schreibt gerade seine Doktorarbeit. Über ein hochbegabtes Talent der Konstanzer Uni" (This young man is 19 years old and is already writing his doctoral thesis. On a highly gifted talent at the University of Konstanz)



German TV: Short interview about my participation in Jugend forscht: „Jugend forscht": Hamburger Schüler stellen Forschungsprojekte vor ("Jugend forscht": Hamburg students present their research projects)



Südkurier article about my work at DESY: "Streng geheimes Forschungsprojekt: 17-jähriger Informatik-Student entwickelt neuartigen Röntgenlaser" (Top-secret research project: 17-year-old computer science student develops a novel X-ray laser)


Teaching

Individualized Maths Introduction Course

WS 2019 – SS 2022



Tutor: Analysis and Linear Algebra

SS 2019 – WS 2020



Seminar: Current Trends in Computer Graphics (+ Neural Networks, and Mathematical Language Processing)

WS 2019



Tutor: Discrete Mathematics

SS 2017 and SS 2018



Tutor: Programming Course 1 (Java)

WS 2016 and WS 2017


