CATALOG DESCRIPTION: Probabilistic graphical models are a powerful framework for handling uncertainty in machine learning. The course will cover how probability distributions can be represented in graphical models, how inference and learning are performed in these models, and how the models are used for machine learning in practice.
- This course fulfills the AI Depth course requirement.
Required Textbook: Koller and Friedman. Probabilistic Graphical Models: Principles and Techniques. The MIT Press.
Course Coordinator: Prof. Doug Downey
Course Goals: The primary learning objective of the course is for students to understand how graphical models can be used to represent probability distributions, and how to perform inference and learning in them. Topics include directed and undirected graphical models, exact and approximate inference methods, and supervised and unsupervised parameter and structure learning.
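As a flavor of the course's core idea, the sketch below (illustrative only, not part of the official course materials) shows how a small directed graphical model over three binary variables (Rain, Sprinkler, WetGrass) factorizes a joint distribution as P(R, S, W) = P(R) P(S | R) P(W | R, S), and how exact inference by enumeration answers a query. The variable names and probability values are made up for illustration.

```python
from itertools import product

# Conditional probability tables for a toy Bayesian network:
# Rain -> Sprinkler, and (Rain, Sprinkler) -> WetGrass.
P_R = {True: 0.2, False: 0.8}                      # P(Rain)
P_S_given_R = {True: {True: 0.01, False: 0.99},    # P(Sprinkler | Rain)
               False: {True: 0.4, False: 0.6}}
P_W_given_RS = {                                   # P(WetGrass | Rain, Sprinkler)
    (True, True): {True: 0.99, False: 0.01},
    (True, False): {True: 0.8, False: 0.2},
    (False, True): {True: 0.9, False: 0.1},
    (False, False): {True: 0.0, False: 1.0},
}

def joint(r, s, w):
    """Joint probability via the chain-rule factorization the graph encodes."""
    return P_R[r] * P_S_given_R[r][s] * P_W_given_RS[(r, s)][w]

# Exact inference by enumeration: P(Rain = True | WetGrass = True),
# summing the joint over the unobserved Sprinkler variable.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
posterior = num / den
```

The key point the course develops: the graph structure lets the joint over many variables be stored and manipulated as a product of small local tables, rather than one exponentially large table.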
Prerequisites: EECS 349 or permission of the instructor
Grades: Homework will involve both mathematical exercises and programming assignments, and will make up 75% of the grade. A final exam accounts for the remaining 25%.