Hasso-Plattner-Institut – 25 Years of HPI

Introduction to Probabilistic Machine Learning (Summer Semester 2023)

Lecturers: Prof. Dr. Ralf Herbrich (Artificial Intelligence and Sustainability), Jakob Walter (Artificial Intelligence and Sustainability)

General Information

  • Weekly hours: 4
  • ECTS credits: 6
  • Graded: Yes
  • Enrollment period: 01.04.2023 – 07.05.2023
  • Teaching format: Lecture / Exercise
  • Module type: Compulsory elective module
  • Language of instruction: English

Degree Programs, Module Groups & Modules

IT-Systems Engineering BA

Description

Probabilistic machine learning has gained substantial practical relevance over the past 15 years: it is highly data-efficient, allows practitioners to easily incorporate domain expertise and, thanks to recent advances in efficient approximate inference, scales well. Moreover, it is closely related to causal inference, one of the key methods for measuring cause-effect relationships in machine learning models and for explainable artificial intelligence. This course introduces recent developments in probabilistic modeling and inference, covering both the theoretical as well as the practical and computational aspects of probabilistic machine learning. In the course, we will implement the inference techniques ourselves and apply them to real-world problems.
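
To give a small taste of the kind of inference the course covers, here is a minimal sketch (not official course material) of a conjugate Bayesian update for a coin: a Beta prior encoding domain expertise is combined with Bernoulli observations to yield a Beta posterior in closed form, illustrating the data-efficiency mentioned above.

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Conjugate Bayesian update: Beta(alpha, beta) prior with a
    Bernoulli likelihood yields a Beta posterior.  Each observation
    is 1 (success) or 0 (failure)."""
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures

# A Beta(2, 2) prior encodes a weak belief that the coin is fair.
alpha, beta = beta_bernoulli_update(2.0, 2.0, [1, 1, 0, 1])
posterior_mean = alpha / (alpha + beta)  # (2 + 3) / (4 + 4) = 0.625
```

After only four coin flips the posterior mean has moved from 0.5 toward the empirical frequency, while the prior still tempers the estimate.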

Prerequisites

It is highly recommended that you have completed the following courses:

  1. HPI-MA3: Stochastik (BSc studies)
  2. HPI-MA2: Analysis und Lineare Algebra (BSc studies)

Literature

Learning and Teaching Formats

We will have one weekly lecture and one weekly tutorial.

  • Lecture: Monday, 9:15am - 10:45am (L-E.03)
  • Tutorial: Tuesday, 1:30pm - 3:00pm (L-E.03)

HPI Moodle Course: https://moodle.hpi.de/course/view.php?id=430

Assessment

There will be three exercises throughout the semester, together counting for 40% of the points. The remaining 60% will be awarded through a written exam at the end of the course.

Schedule

  • Week 1 (17.4. & 18.4.): Probability Theory
  • Week 2 (24.4. & 25.4.): Information Theory
  • Week 3 (2.5.): Exercises on Information Theory
  • Week 4 (8.5. & 9.5.): Inference & Decision Making
  • Week 5 (15.5. & 16.5.): Linear Basis Function Models
  • Week 6 (22.5. & 23.5.): Bayesian Linear Regression
  • Week 7 (30.5.): Exercises on Multivariate Gaussians
  • Week 8 (5.6. & 6.6.): Gaussian Processes & Bayesian Model Selection
  • Week 9 (12.6. & 13.6.): Classification Learning (Maximum Likelihood)
  • Week 10 (19.6. & 20.6.): Bayesian Classification Learning
  • Week 11 (26.6. & 27.6.): Bayesian Ranking
  • Week 12 (3.7. & 4.7.): Bayesian Clustering
  • Week 13 (10.7. & 11.7.): Bayesian Methods for Sequential Data
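
As an illustration of the Bayesian linear regression topic in the schedule, the following minimal sketch (assumed notation in the style of standard textbook treatments, not official course material) computes the closed-form posterior over weights under a zero-mean Gaussian prior with precision `alpha` and Gaussian observation noise with precision `beta`.

```python
import numpy as np

def bayesian_linear_regression(Phi, y, alpha=1.0, beta=25.0):
    """Posterior over weights w for y = Phi @ w + Gaussian noise.

    Prior: w ~ N(0, alpha^{-1} I); noise precision beta.
    Returns the posterior mean and covariance in closed form."""
    d = Phi.shape[1]
    S_inv = alpha * np.eye(d) + beta * Phi.T @ Phi  # posterior precision
    S = np.linalg.inv(S_inv)                        # posterior covariance
    m = beta * S @ Phi.T @ y                        # posterior mean
    return m, S

# Fit a line to noisy samples from y = 1 + 2x (noise std 0.2).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * x + 0.2 * rng.standard_normal(20)
Phi = np.column_stack([np.ones_like(x), x])  # basis functions: [1, x]
m, S = bayesian_linear_regression(Phi, y)
```

Unlike a maximum-likelihood fit, the posterior covariance `S` quantifies how uncertain each weight still is, which is what later topics such as Bayesian model selection build on.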
