# Lecture by Christoph Hertrich (Johann-Wolfgang-Goethe-Universität Frankfurt): (Old and New) Facets of Neural Network Complexity

How can discrete mathematics and theoretical computer science be used to understand neural networks? Guided by this question, I will focus on neural networks with rectified linear unit (ReLU) activations, a standard model and important building block in modern machine learning pipelines. The functions represented by such networks are continuous and piecewise linear. But how does the set of representable functions depend on the architecture? And how difficult is it to train such networks to optimality? In my talk I will answer fundamental questions like these using methods from polyhedral geometry, combinatorial optimization, and complexity theory. This stream of research was started during my doctorate within "*Facets of Complexity*" and has since been carried much further.
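To illustrate the claim that ReLU networks compute continuous piecewise linear functions, here is a minimal sketch (not part of the talk; all names and parameters are illustrative): a one-hidden-layer ReLU network in one variable, whose output is a sum of shifted and scaled ReLU "kinks" and hence linear between breakpoints.

```python
def relu(z):
    # Rectified linear unit: max(0, z)
    return max(0.0, z)

def relu_net(x, weights, biases, out_weights, out_bias):
    """One hidden ReLU layer: f(x) = sum_i v_i * relu(w_i * x + b_i) + c.
    Each hidden unit contributes a kink; f is continuous piecewise linear."""
    hidden = [relu(w * x + b) for w, b in zip(weights, biases)]
    return sum(v * h for v, h in zip(out_weights, hidden)) + out_bias

# Hypothetical example: f(x) = relu(x) - 2*relu(x - 1),
# linear on (-inf, 0], [0, 1], [1, inf) with slopes 0, 1, -1.
w, b = [1.0, 1.0], [0.0, -1.0]
v, c = [1.0, -2.0], 0.0

print(relu_net(-1.0, w, b, v, c))  # 0.0 (flat piece)
print(relu_net(0.5, w, b, v, c))  # 0.5 (slope-1 piece)
print(relu_net(2.0, w, b, v, c))  # 0.0 (slope -1 brings it back down)
```

The example shows in miniature the kind of object the talk studies: which piecewise linear functions a given architecture can represent, and how hard it is to find optimal parameters.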

### Time & Location

Jan 29, 2024 | 02:15 PM

Freie Universität Berlin

Institut für Informatik

Takustr. 9

14195 Berlin

Seminar room 053 (ground floor)