In T3-2021, this course will run fully online. All lectures will be offered as topic-based videos, which you can watch in your own time. Tutorials will be held online via MS Teams in the timetabled slots.
We are following the lockdown developments in New South Wales. If conditions change and lockdown restrictions are lifted, we may offer a subset of tutorials face-to-face.
Assignments and quizzes will be done online as usual. The Final Exam will most likely be held online.
Class | Weeks | Time / Delivery | Location | Staff |
COMP9418/1UGA, COMP9418/1PGA | Weeks 1-5,7-10 | Pre-recorded lectures on YouTube and Echo360 | Online | Gustavo Batista |
COMP9418/1UGA, COMP9418/1PGA | Weeks 1-5,7-10 | Thu 12:00 - 13:00 | Online (MS Teams) | Gustavo Batista and Jeremy Gillen |
Class | Section | Weeks | Time | Location | Tutor |
T12A | tue12a | Weeks 1-5,7-10 | Tue 12:00 - 14:00 | Online (MS Teams) | Payal Bawa |
T14A | tue14a | Weeks 1-5,7-10 | Tue 14:00 - 16:00 | Online (MS Teams) | Payal Bawa |
T14B | tue14b | Weeks 1-5,7-10 | Tue 14:00 - 16:00 | Online (MS Teams) | Peng Yi |
W11A | wed11a | Weeks 1-5,7-10 | Wed 11:00 - 13:00 | Online (MS Teams) | Peng Yi |
W14A | wed14a | Weeks 1-5,7-10 | Wed 14:00 - 16:00 | Online (MS Teams) | Payal Bawa |
W14B | wed14b | Weeks 1-5,7-10 | Wed 14:00 - 16:00 | Online (MS Teams) | Peng Yi |
In this course, we will study a class of inference models known as Probabilistic Graphical Models (PGMs). PGMs are a great example of how Computer Science and Statistics can work together. PGMs use graph data structures to represent domains with large numbers of variables, together with specialised algorithms for efficient inference over these models. In this way, PGMs have pushed probability theory to the scale and speed necessary to provide automated reasoning in modern AI systems.
During this course, we will cover several graphical models, including Bayesian networks, Markov networks, Conditional Random Fields, Markov chains, hidden Markov models and Markov decision processes. We will develop a clear understanding of how these models work, as well as of their main algorithms for inference and learning. We will also cover several algorithms used to learn parameters and make inferences, such as Markov chain Monte Carlo (MCMC), Gibbs sampling, Viterbi and Baum-Welch, among others.
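As a small taste of what is ahead, the sketch below (our illustration, not course material) encodes a two-variable Bayesian network P(A)P(B|A) with plain Python dictionaries and answers queries by summing out a variable; the probabilities are invented for the example.

# Illustrative sketch only: a two-node Bayesian network P(A) P(B | A)
# over binary variables, queried by enumeration.

# P(A): prior over A
p_a = {True: 0.3, False: 0.7}

# P(B | A): conditional distribution of B given A, keyed by (a, b)
p_b_given_a = {
    (True, True): 0.9, (True, False): 0.1,    # P(B | A=True)
    (False, True): 0.2, (False, False): 0.8,  # P(B | A=False)
}

# Marginal P(B=b) = sum over a of P(A=a) * P(B=b | A=a)  (summing out A)
def p_b(b):
    return sum(p_a[a] * p_b_given_a[(a, b)] for a in (True, False))

# Posterior P(A=a | B=b) by Bayes' rule
def p_a_given_b(a, b):
    return p_a[a] * p_b_given_a[(a, b)] / p_b(b)

print(p_b(True))                # 0.3*0.9 + 0.7*0.2 = 0.41
print(p_a_given_b(True, True))  # 0.27 / 0.41, roughly 0.659

The exact algorithms covered in the course (variable elimination, jointrees, belief propagation) exist precisely to avoid this brute-force enumeration when the network has many variables.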
Course Code | COMP9418 |
Course Title | Advanced Topics in Statistical Machine Learning |
Convenor | Gustavo Batista |
Admin | Jeremy Gillen |
Consultations | Mondays 14:00-15:00, MS Teams |
Units of Credit | 6 |
Course Website | http://cse.unsw.edu.au/~cs9418/21T3/ |
Handbook Entry | http://www.handbook.unsw.edu.au/postgraduate/courses/current/COMP9418.html |
Student Reps | stureps@cse.unsw.edu.au. Email the stureps if you have any issues with the course; they will pass these on anonymously to the relevant people to get the issues resolved. |
Knowledge of machine learning at the level of COMP9417.
We will use Python in the practical part of our bring-your-own-device tutorials. Students will have to install Jupyter Notebook on their computers to execute the practical part of the tutorials. Alternatively, they can remotely access the VLAB computers using a VPN (if outside UNSW) and software such as TigerVNC. However, students unfamiliar with Python are expected to be able to get up to speed if they can construct (i.e., design, implement and test) working software in a general-purpose language such as C/C++ or Java, at least to the level of a first-year computing course (e.g., COMP1927 Computing 2 or equivalent). A good working knowledge of, and the ability to construct working software in, standard data analysis languages such as Matlab/Octave or R can also be helpful.
Since an important part of practical machine learning is "data wrangling" (i.e., pre-processing, filtering and cleaning data files), students are expected to master Unix tools such as those taught in COMP2041 Software Construction, or equivalents in Matlab/Octave or R.
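For students who prefer to do this wrangling in Python itself, a short pandas pipeline can achieve much the same; the sketch below is purely illustrative, and the file name ("data.csv") and column names ("age", "label") are hypothetical.

# Illustrative data-wrangling sketch in Python/pandas.
# The file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("data.csv")           # load the raw data
df = df.dropna()                       # drop rows with missing values
df = df[df["age"] >= 0]                # filter out invalid records
df["label"] = df["label"].str.lower()  # normalise a text column
df.to_csv("clean.csv", index=False)    # save the cleaned data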
After completing this course, students will:
This course contributes to the development of the following graduate capabilities:
Graduate Capability | Acquired in |
Scholars capable of independent and collaborative enquiry, rigorous in their analysis, critique and reflection, and able to innovate by applying their knowledge and skills to the solution of novel as well as routine problems | Lectures, tutorials, assignments and exam |
Entrepreneurial leaders capable of initiating and embracing innovation and change, as well as engaging and enabling others to contribute to change | Assignments |
Professionals capable of ethical, self-directed practice and independent lifelong learning | Lectures, tutorials and assignments |
Global citizens who are culturally adept and capable of respecting diversity and acting in a socially just and responsible way | Lectures, tutorials, assignments and student interactions |
Machine learning is at the intersection of Artificial Intelligence, Computer Science and Statistics. While the main goal of this course is to go beyond the basics of machine learning covered in COMP9417, focusing on probabilistic modelling and inference, we will adopt a similar teaching rationale, in which theory, algorithms and empirical analysis are all important components of the course. Therefore, the lectures, tutorials and assessments are designed to address these components jointly.
The course involves lectures and practical work.
The Student Code of Conduct ( Information , Policy ) sets out what the University expects from students as members of the UNSW community. As well as the learning, teaching and research environment, the University aims to provide an environment that enables students to achieve their full potential and to provide an experience consistent with the University's values and guiding principles. A condition of enrolment is that students inform themselves of the University's rules and policies affecting them, and conduct themselves accordingly.
In particular, students have the responsibility to observe standards of equity and respect in dealing with every member of the University community. This applies to all activities on UNSW premises and all external activities related to study and research. This includes behaviour in person as well as behaviour on social media, for example Facebook groups set up for the purpose of discussing UNSW courses or course work. Behaviour that is considered in breach of the Student Code Policy as discriminatory, sexually inappropriate, bullying, harassing, invading another's privacy or causing any person to fear for their personal safety is serious misconduct and can lead to severe penalties, including suspension or exclusion from UNSW.
If you have any concerns, you may raise them with your lecturer, or approach the School Ethics Officer , Grievance Officer , or one of the student representatives.
Plagiarism is defined as using the words or ideas of others and presenting them as your own. UNSW and CSE treat plagiarism as academic misconduct, which means that it carries penalties as severe as being excluded from further study at UNSW. There are several on-line sources to help you understand what plagiarism is and how it is dealt with at UNSW:
Make sure that you read and understand this. Ignorance is not accepted as an excuse for plagiarism. In particular, you are responsible for ensuring that your assignment files are not accessible by anyone but you, by setting the correct permissions in your CSE directory and in your code repository, if you use one. Note also that plagiarism includes paying or asking another person to do a piece of work for you and then submitting it as your own work.
UNSW has an ongoing commitment to fostering a culture of learning informed by academic integrity. All UNSW staff and students have a responsibility to adhere to this principle of academic integrity. Plagiarism undermines academic integrity and is not tolerated at UNSW. Plagiarism at UNSW is defined as using the words or ideas of others and passing them off as your own.
If you haven't done so yet, please take the time to read the full text of the UNSW Plagiarism Policy.
The pages below describe the policies and procedures in more detail:
You should also read the following page which describes your rights and responsibilities in the CSE context:
Please note due dates are subject to change.
Item | Topics | Due | Marks | Contributes to (learning outcomes) |
Assignment 1 | Weeks 1-2 | Week 5 | 15% | 1-2 |
Assignment 2 | Weeks 3-7 | Week 9 | 15% | 3-4 |
Quizzes | Weeks 2-5,7-10 | Weeks 2,3,4,5,7,8,9,10 | 10% | 1-5 |
Final Exam | All topics | Exam period | 60% | 1-5 |
Quizzes are multiple-choice questionnaires used to check your understanding during the course. The final quiz mark is the simple average of the eight quiz marks, each normalised to the range 0-10.
There is a hurdle on the Final Exam; very poor performance in the exam will result in a fail, even if all your other assessment marks have been satisfactory. The following formula describes precisely how the mark will be computed and how the hurdle will be enforced.
quizzes   = mark for quizzes (out of 10)
ass1      = mark for assignment 1 (out of 15)
ass2      = mark for assignment 2 (out of 15)
finalExam = mark for final exam (out of 60)
okExam    = finalExam >= 24 (i.e., at least 24 out of 60)
mark      = quizzes + ass1 + ass2 + finalExam
grade     = HD|DN|CR|PS if mark >= 50 && okExam
          = FL          if mark <  50
          = UF          if mark >= 50 && !okExam
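To make the formula concrete, here is an unofficial Python sketch of the same computation. The averaging of the eight quiz marks follows the description above; the HD/DN/CR/PS cut-offs are the standard UNSW bands and are an assumption, since the formula above does not state them.

# Unofficial sketch of the mark formula above; for illustration only.
def course_grade(quiz_marks, ass1, ass2, final_exam):
    # quiz_marks: the eight quiz marks, each out of 10
    # ass1, ass2: out of 15 each; final_exam: out of 60
    quizzes = sum(quiz_marks) / len(quiz_marks)  # average, out of 10
    mark = quizzes + ass1 + ass2 + final_exam    # out of 100
    ok_exam = final_exam >= 24                   # hurdle: at least 24/60
    if mark < 50:
        return mark, "FL"
    if not ok_exam:
        return mark, "UF"                        # hurdle failure despite mark >= 50
    # Standard UNSW grade bands (assumed; not stated in the formula above)
    if mark >= 85:
        return mark, "HD"
    if mark >= 75:
        return mark, "DN"
    if mark >= 65:
        return mark, "CR"
    return mark, "PS"

print(course_grade([8] * 8, 12, 13, 40))  # (73.0, 'CR')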
Please note this is a tentative schedule. All dates are only indicative and subject to change.
Week | Lecture | Tutorial | Assignment | Quizzes |
1 | Course overview [Ch. 1], propositional logic [Ch. 2] and probability calculus [Ch. 3] | Graph representation, traversal and common algorithms | - | - |
2 | Bayesian networks representation and semantics [Chs. 4 and 5] | Probability calculus and factor implementation | - | Quiz 1 |
3 | Exact inference [Ch. 6]. Bayesian networks as classifiers | Bayesian networks | Ass1 released | Quiz 2 |
4 | Markov chains and hidden Markov models | Variable elimination | - | Quiz 3 |
5 | MAP inference [Ch. 10]. Markov networks | Markov chains and hidden Markov models | Ass1 due | Quiz 4 |
6 | Flexibility Week | - | - | - |
7 | Gaussian Bayesian Networks | Markov networks | Ass2 released | Quiz 5 |
8 | The jointree algorithm [Chs. 7 and 9] | Gaussian Bayesian networks | - | Quiz 6 |
9 | Belief propagation [Ch. 14]. Approximate inference by sampling [Ch. 15] | Factor elimination and jointrees | Ass2 due | Quiz 7 |
10 | Learning parameters and graph structure [Ch. 17] | Belief propagation and sampling | - | Quiz 8 |
This website also has links to the auxiliary material/documentation that you will need for the course. Solutions for all tutorial questions and exercises will also be made available.
This course is evaluated using the myExperience system.
In the previous offering of this course, students suggested some changes to the content sequence and the addition of new material covering continuous distributions. In conversations with students, we also noted that the tutorial code needed to be faster to support their assessment implementations.
Based on their comments, we have moved the MAP lecture earlier in the course. We also reduced the content of this lecture to make room for a new lecture covering Gaussian Bayesian networks. We reimplemented the tutorial code, replacing an unordered dictionary with a NumPy array, to increase its efficiency, and improved the code organisation with an object-oriented implementation.
We thank all the students who provided feedback on this course through myExperience, email and conversations. These students include Martin Eftimoski, Gareth Dando, and many others.