Introduction to Probabilistic Machine Learning (Summer Semester 2024)

Lecturers: Prof. Dr. Ralf Herbrich (Artificial Intelligence and Sustainability), Dr. Rainer Schlosser (Enterprise Platform and Integration Concepts)

General Information

  • Semester hours per week (SWS): 4
  • ECTS: 6
  • Graded: Yes
  • Enrollment period: 01.04.2024 - 30.04.2024
  • Examination date per §9 (4) BAMA-O: 24.07.2024
  • Course format: Lecture / Exercise
  • Enrollment type: Compulsory elective module
  • Language of instruction: English

Degree Programs, Module Groups & Modules

IT-Systems Engineering BA

Description

Probabilistic machine learning has gained a lot of practical relevance over the past 15 years: it is highly data-efficient, it allows practitioners to easily incorporate domain expertise, and, thanks to recent advances in efficient approximate inference, it is highly scalable. Moreover, it is closely related to causal inference, which is one of the key methods for measuring cause-effect relationships of machine learning models and for explainable artificial intelligence. This course introduces the recent developments in probabilistic modeling and inference, covering the theoretical as well as the practical and computational aspects of probabilistic machine learning. In the course, we will implement the inference techniques ourselves and apply them to real-world problems.
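
To give a flavor of what probabilistic modeling and inference look like in practice, here is a minimal sketch (Python is an assumption about the tooling, and the data are hypothetical) of exact Bayesian inference for the unknown bias of a coin using a conjugate Beta prior:

  # Minimal sketch (illustrative only): exact Bayesian inference for the
  # bias theta of a coin, using a conjugate Beta prior.
  # Prior: theta ~ Beta(alpha, beta); likelihood: each flip ~ Bernoulli(theta).

  flips = [1, 0, 1, 1, 0, 1, 1, 1]    # hypothetical observations (1 = heads)
  alpha_prior, beta_prior = 1.0, 1.0  # uniform Beta(1, 1) prior

  heads = sum(flips)
  tails = len(flips) - heads

  # Conjugacy: the posterior is again a Beta distribution.
  alpha_post = alpha_prior + heads
  beta_post = beta_prior + tails

  posterior_mean = alpha_post / (alpha_post + beta_post)
  print(f"Posterior: Beta({alpha_post:g}, {beta_post:g}), "
        f"posterior mean of theta = {posterior_mean:.3f}")

In models where such a closed-form posterior is not available, approximate inference methods (for example, inference in graphical models, as covered in the lectures) take over.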

Prerequisites

It is highly recommended that you have completed the following courses:

  1. HPI-MA3: Stochastik (BSc studies)
  2. HPI-MA2: Analysis und Lineare Algebra (BSc studies)

Literature

Teaching and Learning Formats

We will have one weekly lecture and one weekly tutorial.

  • Lecture: Monday, 11:00 - 12:30 (HS 1)
  • Tutorial: Tuesday, 15:15 - 16:45 (HS 1)

HPI Moodle Course: https://moodle.hpi.de/course/view.php?id=755

Assessment

There will be three exercises throughout the semester, counting for 30% of the points. The remaining 70% will be awarded through a written exam at the end of the course.
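
For example, a hypothetical student who earns 80% of the exercise points and 60% of the exam points would receive 0.3 · 80% + 0.7 · 60% = 66% of the total points.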

Schedule

  • Week 1 (8.4 & 9.4): History & Probability Theory
  • Week 2 (15.4 & 16.4): Inference & Decision Making
  • Week 3 (22.4 & 23.4): Graphical Models: Independence
  • Week 4 (29.4 & 30.4): Graphical Models: Inference
  • Week 5 (6.5 & 7.5): Bayesian Ranking
  • Week 6 (13.5 & 14.5): Linear Basis Function Models
  • (20.5 & 21.5): no lectures
  • Week 7 (27.5 & 28.5): Bayesian Regression
  • Week 8 (3.6 & 4.6): Bayesian Classification
  • Week 9 (10.6 & 11.6): Non-Bayesian Classification Learning
  • Week 10 (17.6 & 18.6): Gaussian Processes
  • Week 11 (24.6 & 25.6): Information Theory
  • Week 12 (1.7 & 2.7): Real-World Applications of Probabilistic Machine Learning
  • Week 13 (8.7 & 9.7): Exam Preparation
