Advanced Topics in Deep Learning (Winter Semester 2020/2021)

Lecturers: Prof. Dr. Christoph Lippert (Digital Health - Machine Learning), Jana Fehr (Digital Health - Machine Learning)

General Information

  • Weekly hours (SWS): 2
  • ECTS: 3
  • Graded: Yes
  • Enrollment period: 01.10.2020 - 20.11.2020
  • Course format: Seminar / Project
  • Module type: Compulsory elective module
  • Language of instruction: German
  • Maximum number of participants: 10

Degree Programs, Module Groups & Modules

IT-Systems Engineering MA
  • OSIS: Operating Systems & Information Systems Technology
    • HPI-OSIS-K Konzepte und Methoden
    • HPI-OSIS-T Techniken und Werkzeuge
Data Engineering MA
Digital Health MA

Description

Machine learning algorithms learn correlation patterns from observed data and use those patterns to make predictions. But how can we know how certain those predictions are? In this course, we will investigate this question, focusing on the topic of probabilistic machine learning. The key idea behind probabilistic machine learning is to make assumptions based on prior knowledge and encode them into a model that explains the observed data. Such models can then be used to make predictions together with an estimate of their uncertainty.
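In symbols (a standard formulation, added here for reference rather than taken from the course materials): writing w for the model parameters and D for the observed data, prior knowledge p(w) and the likelihood p(D | w) combine via Bayes' rule, and predictions for a new input average over the resulting posterior:

    p(w \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid w)\, p(w)}{p(\mathcal{D})},
    \qquad
    p(y^\ast \mid x^\ast, \mathcal{D}) = \int p(y^\ast \mid x^\ast, w)\, p(w \mid \mathcal{D})\, \mathrm{d}w

For deep networks, both the normalizer p(D) and the predictive integral are intractable; the methods on the reading list below (Laplace's method, MCMC, variational inference) are different strategies for approximating them.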

In this seminar, we will work through a list of fundamental scientific publications on probabilistic machine learning, covering different approaches to Bayesian inference in deep learning, including Markov chain Monte Carlo and variational approximations. We will also look at application examples. Every week, we will discuss one paper. Each course participant will present one paper and lead the subsequent discussion. The presentation should give a brief overview of the paper (background, aims, methods, results), summarize the take-home messages, and lead into the discussion with open questions. To foster a productive discussion in the group, we expect every participant to have read the paper in advance.

The seminar will be fully online using Zoom.

Please register by filling out the following Google form.
Please note that seminar spots are limited and that filling out the form is not a final confirmation of your participation.
We will send you a separate e-mail informing you whether you can attend the seminar.

Learning goals:

You will learn how to read scientific literature and how to extract, present and discuss the essential information.

Schedule:

The course runs for 12 weeks (from November 9th, 2020 until February 8th, 2021), with one 1.5-hour session per week on Mondays from 1:30-3:00 pm.
In the first week, we will have short presentations to recap probability theory and Bayesian inference.
We will then discuss one paper per week for the following 10 weeks. In the last course slot, we will wrap up and summarize our key learnings from the course.

Prerequisites

  • Successful attendance of the ‘Introduction to Deep Learning’ or ‘Mathematics for Deep Learning’ course by Prof. Lippert, or an equivalent university course
  • Basic knowledge of probability theory
  • Attendance at all sessions (one absence can be excused with prior notification)
  • Presenting one paper and leading the subsequent discussion
  • A stable internet connection, with the web camera switched on

Literature

Advanced Topics in Deep Learning, WS 20/21 - reading schedule (week, topic, session goal, resources):

Week 1: Probability Theory, Bayesian Inference, Bayesian Neural Networks tutorial
  Goal: Get an introduction to probabilistic learning and generative modeling; understand the problem we are trying to solve in this course.
  Resource: mackay/itila/, ch. 20-22

Week 2: MacKay: ML, Inference, Laplace's Method (book)
  Goal: Understand Laplace's method as a simple way to approximate the posterior distribution.
  Resource: mackay/itila/, ch. 23-24 and 27

Week 3: MCMC: Metropolis-Hastings (2 people)
  Goal: General-purpose sampling methods; Metropolis-Hastings as a special case of MCMC. A minimal sampler sketch follows this schedule.
  Resource: mackay/itila/, ch. 29

Week 4: Efficient MCMC: Hamiltonian Monte Carlo
  Goal: Can we speed up MCMC using gradients?
  Resource: mackay/itila/, ch. 30

Week 5: Variational Methods (MacKay, p. 422)
  Goal: Instead of sampling, can we find an analytic posterior approximation with a simpler distribution (though not as simple as in Laplace's method)?
  Resource: mackay/itila/, ch. 33 (not 33.3)

Week 6: Variational Methods: Weight Uncertainty in Neural Networks
  Goal: Applying the previous methods to deep neural networks.
  Resource: arxiv.org/pdf/1505.05424

Week 7: Variational Methods: A Simple Baseline for Bayesian Uncertainty in Deep Learning
  Goal: How can we use Laplace's approximation combined with SGD to approximate the posterior? Bringing together Metropolis-Hastings/Hamiltonian and variational inference.
  Resource: arxiv.org/abs/1902.02476

Week 8: Bayesian DL: Bayes SGD (MCMC): Langevin Dynamics
  Goal: An MCMC method that combines Gaussian noise with SGD to approximate the posterior.
  Resource: stoclangevin_v6.pdf
  Further resources: arxiv.org/abs/2002.02405, arxiv.org/abs/1902.03932, arxiv.org/abs/1506.04696

Week 9: A Simple Baseline for Bayesian Uncertainty in Deep Learning; On Last-Layer Algorithms for Classification: Decoupling Representation from Uncertainty Estimation
  Goal: How to add uncertainty to the predictions; is it sufficient to use uncertainties only on the last layers instead of the whole network?
  Resources: arxiv.org/abs/1902.02476, arxiv.org/abs/2005.10419

Week 10: Variational Autoencoder (VAE)
  Goal: Can we use a neural network to parameterize a probability distribution to generate arbitrarily complex data (instead of learning a posterior over a neural network)?
  Resource: VAE

Week 11: Generative Adversarial Nets (GAN)
  Goal: Can we find a similar generative model without using probabilities?
  Resource: GAN

Week 12: Summary of the different methods, lessons learnt, future research directions
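As a companion to the week 3 reading, here is a minimal sketch of random-walk Metropolis-Hastings in Python (illustrative only; the function and variable names are our own, not from the course materials). Given an un-normalized log-density, it produces a chain of samples whose distribution approaches the target:

import numpy as np

def metropolis_hastings(log_prob, x0, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    log_prob: un-normalized log-density of the target distribution
    x0:       starting point (1-D array)
    step:     standard deviation of the Gaussian proposal
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_prob(x)
    samples = []
    for _ in range(n_samples):
        # Propose a symmetric Gaussian random-walk move.
        proposal = x + step * rng.standard_normal(x.shape)
        lp_prop = log_prob(proposal)
        # Accept with probability min(1, p(proposal)/p(x)); the symmetric
        # proposal density cancels from the acceptance ratio.
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        samples.append(x.copy())
    return np.array(samples)

# Example: sample a 2-D standard Gaussian from its un-normalized log-density.
draws = metropolis_hastings(lambda x: -0.5 * np.sum(x ** 2), x0=np.zeros(2))
print(draws.mean(axis=0), draws.std(axis=0))  # close to [0, 0] and [1, 1], respectively

In practice, step is tuned so that a moderate fraction of proposals is accepted; the gradient-based samplers of weeks 4 and 8 (Hamiltonian Monte Carlo, Langevin dynamics) replace this blind random walk with gradient-informed moves that scale better to high-dimensional posteriors.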

Assessment

The grade for this seminar is based 80% on the presentation and discussion leading for your expert topic and
20% on active participation in discussions throughout the seminar.

Dates

Mondays, 13:30-15:00 (Zoom meetings)

For the link to the first Zoom meeting, please send an e-mail to Jasmin.
Please register by filling out the following Google form. We will confirm your enrollment by e-mail.
