Syllabus for Advanced Probabilistic Machine Learning

Avancerad probabilistisk maskininlärning

Syllabus

  • 7.5 credits
  • Course code: 1RT003
  • Education cycle: Second cycle
  • Main field(s) of study and in-depth level: Image Analysis and Machine Learning A1F, Mathematics A1F, Computer Science A1F
  • Grading system: Fail (U), Pass (3), Pass with credit (4), Pass with distinction (5)
  • Established: 2021-03-04
  • Established by: The Faculty Board of Science and Technology
  • Applies from: Autumn 2021
  • Entry requirements: 120 credits including Probability and Statistics, Linear Algebra II, Single Variable Calculus, Statistical Machine Learning, a course in multivariable calculus and a course in introductory programming. Proficiency in English equivalent to the Swedish upper secondary course English 6.
  • Responsible department: Department of Information Technology

Learning outcomes

On completion of the course, the student shall be able to:

  • discuss and determine whether an engineering-related problem can be formulated as a supervised or unsupervised machine learning problem, and carry out this formulation.
  • discuss similarities and differences (both practical and theoretical) between probabilistic and "traditional" machine learning methods.
  • account for and motivate a probabilistic approach, and interpret and explain the outcomes of probabilistic machine learning methods.
  • analyze, implement and use the probabilistic models and methods that are included in the course.
  • analyze, implement and use methods for nonlinear dimensionality reduction.
  • critically examine and provide constructive criticism on other students' reports about machine learning.

Content

This is an advanced course in machine learning, focusing on modern probabilistic/Bayesian models: Bayesian linear regression, graphical models, Gaussian processes and variational autoencoders, as well as methods for exact and approximate inference in such models: Monte Carlo methods, moment matching and variational inference. The course also covers the probability theory needed for these models and methods.

Instruction

Lectures, problem-solving sessions (both with and without computers), laboratory work, and feedback on written assignments.

Assessment

Written exam (4.5 credits). Oral and written presentation of assignments, and written peer review of other students' written assignments (3 credits).

If there are special reasons for doing so, an examiner may make an exception from the method of assessment indicated and allow a student to be assessed by another method. An example of special reasons might be a certificate regarding special pedagogical support from the disability coordinator of the university.

Other directives

The course cannot be included in the same degree as 1RT705 Advanced probabilistic machine learning.

Reading list

Applies from: Autumn 2022

Some titles may be available electronically through the University library.

  • Bishop, Christopher M. Pattern Recognition and Machine Learning. New York, NY: Springer, 2006.

Last modified: 2022-04-26