Theoretical advances in deep learning (IN2107, IN4409)

Pre-course Meeting

The pre-course meeting was held on 11.07.2023 at 18:00

You can find the slides here 

Please fill out this form to help us match you with a paper.


Neural networks, particularly deep networks, have achieved unprecedented popularity over the past decade. While their empirical success has reached new heights, one of the major achievements in recent years has been a series of new theoretical studies on the statistical performance of neural networks.
This seminar will look at the following important topics on neural networks from a mathematical perspective:

  • Generalization error for neural networks and related concepts from learning theory
  • Optimization and convergence rates for neural networks
  • Sample complexity and hardness results
  • Connections of deep learning to other learning approaches (e.g., kernel methods)
  • Robustness of neural networks

Several recent papers from top machine learning conferences will be discussed during the seminar.

Previous Knowledge Expected

  • Machine learning (IN2064)
  • Introduction to deep learning (IN2346)


Upon completion of this module, the students will:

  • have acquired knowledge of current trends in deep learning theory
  • be familiar with recent theoretical work from top machine learning conferences
  • be able to apply mathematical tools to analyze the performance of neural networks


Each student will be allotted a research paper. The student will have to submit a 3-4 page report/review on the paper (the submission deadline is in the middle of the semester). Additionally, the presentations will be held together as a block seminar. The slides have to be submitted before the presentation.
The final grade will depend on the presentation (60%) and the written report (40%).