Seminar: Efficient Inference and Large-Scale Machine Learning

News

  • The session on Monday 26.06. will start at 12:00. All other sessions will take place according to the regular schedule.
  • Slides with organizational updates can be found here

Description

As both the amount and complexity of available data grow, machine learning practitioners are interested in finding ever more sophisticated patterns and interactions in it. The framework of Bayesian statistics provides a powerful tool for modeling and learning these dependencies. Bayesian inference has found numerous applications in domains such as computer vision, natural language processing, and data science. Nevertheless, new tools are constantly needed to perform inference in models of ever-growing complexity. Moreover, scalable algorithms are becoming essential for handling datasets of massive scale.

The main purpose of the seminar is to acquaint students with recent advances in probabilistic machine learning research. The two core topics addressed are efficient (approximate) inference for Bayesian modeling and large-scale optimization.

Background

To be better prepared for the seminar contents, it is a good idea to brush up your knowledge of the following topics:

Topic | Reading | Lectures
Bayesian Inference | Murphy*: 2.2, 3, 5 (optional); any statistics textbook, chapter on Bayes' theorem | Lecture 1, Lecture 2, Lecture 3
Probabilistic Graphical Models | Bishop*: 8.1-8.3; Murphy*: 10.1-10.3, 19.1-19.4 | Lecture 1 (start at 34:30), Lecture 2, Lecture 3


Topics

Date | Topic | Student | Supervisor | References | Reviewer 1 | Reviewer 2
24.04 | Tensor Factorization | Stephan | Oleksandr | Tensor Decompositions and Applications; Tensor Decomposition for Signal Processing and Machine Learning | Yu | Haris
08.05 | Variational Inference: Foundations | Ivan | Oleksandr | Variational Inference: A Review for Statisticians; Stochastic Variational Inference Lecture Notes | Ukrit | Can
15.05 | Variational Inference: Scaling Up | Jakob | Aleksandar | Black Box Variational Inference; Doubly Stochastic Variational Bayes for non-Conjugate Inference; Tutorial on Variational Autoencoders | Ukrit | Yu
22.05 | Variational Inference: Beyond Mean Field | Jan | Oleksandr | Normalizing flows; Hierarchical Variational Models | Jakob | Can
29.05 | Message Passing and Expectation Propagation | Christoph | Oleksandr | Murphy*: 20; Bishop*: 8.4, 10.7; Expectation Propagation for approximate Bayesian inference | Deniz | Raymond
12.06 | Sampling: Foundations | Mesut | Oleksandr | Bishop*: 11.2-11.4; Murphy*: 24.1-24.3; Slice sampling | Christoph | Jan
19.06 | Sampling: Advanced Techniques | Raymond | Aleksandar | Bishop*: 11.5; The No-U-Turn Sampler | Deniz | Mesut
26.06 | Particle Filters | Can | Aleksandar | Murphy*: 23.5; A Tutorial on Particle Filtering and Smoothing | Stephan | Christoph
03.07 | Natural Gradients (cancelled) | Haris | Aleksandar | New insights and perspectives on the natural gradient method; Revisiting natural gradient for deep networks | ------ | ------
10.07 | Bayesian Optimization | Yu | Aleksandar | A Tutorial on Bayesian Optimization; Practical Bayesian Optimization of ML Algorithms | Stephan | Jan, Ivan
17.07 | Probabilistic Numerics | Deniz | Oleksandr | Probabilistic numerics and uncertainty in computations; Fast Probabilistic Optimization from Noisy Gradients | Jakob | Raymond
24.07 | Large-Scale Learning Systems | Ukrit | Amir | MXNet; TensorFlow | Ivan | Mesut

*Bishop  = Pattern Recognition and Machine Learning

*Murphy = Machine Learning: A Probabilistic Perspective

Both books are available in the TUM library.

Organizational Details

  • 12 Participants
  • 5 ECTS
  • Language: English
  • Weekly meetings every Monday 12:30-14:00, room 00.08.055
  • Mandatory attendance of the weekly sessions
  • Please send your questions regarding the seminar to kdd-seminar-inference@in.tum.de.

Prerequisites

  • The seminar is designed for Master students of the Computer Science department.
  • This seminar deals with advanced, cutting-edge topics in machine learning and data mining research. Students are therefore expected to have a solid background in these areas (e.g., having attended at least one related lecture, such as "Mining Massive Datasets" or "Machine Learning").

Requirements

  • Extended abstract: 1 page (article document class) covering motivation, key concepts, and results.
  • Paper: 5-8 pages in ACM format.
  • Presentation: 30-minute talk + 15-minute discussion. (Optional: Beamer template)
  • Peer-review process.

Dates

  • 27.01.2017 16:00: Pre-course meeting in Interims Hörsaal 2. Slides can be found here.
  • 03.02.17 - 08.02.17: Application and registration in the matching system of the department.
  • After 15.02.17: Notification of participants.
  • 01.03.2017 11:30: Kick-off meeting in room 02.09.014. Slides can be found here.
  • Starting 24.04: Weekly meetings every Monday 12:30-14:00, room 00.08.055.

Deadlines

  • 1 week before the talk: submission of the extended abstract and slides
  • 1 day before the talk: submission of a preliminary paper for review
  • 1 week after the talk: comments from the reviewers are returned
  • 2 weeks after the talk: submission of the final paper