Lectures
Lecture recordings are available on YouTube.
Tutorials are available on GitHub.
Lecture 1: Machine Learning Systems: Course Overview
tl;dr: This lecture gives a brief overview of the course, its requirements, learning goals, policies, and expectations.
Lecture 2: Machine Learning Systems: ML in Production
tl;dr: This lecture reviews challenges in building and deploying real-world ML systems in production.
Lecture 3: Machine Learning Systems: Trustworthy AI
tl;dr: This lecture discusses important aspects of AI trustworthiness, including fairness, transparency, accountability, and robustness.
Lecture 4: Machine Learning Systems: Designing ML Systems
tl;dr: This lecture discusses the process of designing and building ML systems.
Lecture 5: Perceptrons and Logistic Regression
tl;dr: This lecture is about simple models and algorithms for supervised learning of binary and multi-class classifiers.
Lecture 6: Optimization and Neural Networks
tl;dr: This lecture is about optimization techniques to train models and about building deep neural networks from multi-class logistic regression models.
Lecture 7: Machine Learning System Stack
tl;dr: This lecture reviews the full stack of machine learning system development.
Lecture 8: Backpropagation and Automatic Differentiation
tl;dr: This lecture reviews backprop and automatic differentiation.
Lecture 9: Convolutional Neural Networks (CNNs, ConvNets)
tl;dr: This lecture reviews the foundations, mathematical operations, and architecture of CNNs, and contrasts them with fully connected (dense) networks.
Lecture 10: Hardware for Machine Learning Systems
tl;dr: This lecture reviews the hardware layer of machine learning (deep learning) systems.
Lecture 11: Model Compression, Pruning, and Quantization
tl;dr: This lecture discusses the key ideas behind DNN model compression techniques.
Lecture 12: Scalable & Distributed Machine Learning
tl;dr: This lecture introduces variations of gradient descent and ideas for scaling it up using parallel computing.
2021 Lecture 5: Naïve Bayes
tl;dr: This lecture is about simple classifiers based on Bayes Nets with (naive) independence assumptions between features.
2021 Lecture 6: Machine Learning System Stack
tl;dr: This lecture reviews the full stack of machine learning system development.
2021 Lecture 7: Convolutional Neural Networks (CNNs/ConvNets)
tl;dr: This lecture reviews the foundations, mathematical operations, and architecture of CNNs, and contrasts them with fully connected (dense) networks.
2021 Lecture 8: Hardware for Machine Learning Systems
tl;dr: This lecture reviews the hardware layer of machine learning (deep learning) systems.
2021 Lecture 9: Model Compression: Pruning and Quantization
tl;dr: This lecture discusses the key ideas behind DNN model compression techniques.
2021 Lecture 10: Performance Tradeoff in Machine Learning Systems
tl;dr: This lecture discusses several research directions (multi-objective optimization, transfer learning, causal inference) in ML systems.
2020 Introduction to Machine Learning Systems (Uber Case Study)
tl;dr: This lecture reviews challenges of building a real-world ML system that scales.
2020 Machine Learning Systems: Challenges and Solutions
tl;dr: This lecture reviews reactive strategies for incorporating ML-based components into a larger system.
2020 Machine Learning Systems: Learning Theory
tl;dr: This lecture reviews basic concepts related to statistical learning theory (e.g., hypothesis space).
2020 Machine Learning Systems: Backpropagation and Automatic Differentiation
tl;dr: This lecture reviews backprop and automatic differentiation.
2020 Machine Learning Systems: Optimization and Performance Understanding of ML Systems
tl;dr: This lecture discusses performance optimization of machine learning systems.
2020 Machine Learning Systems: Machine Learning Platforms
tl;dr: This lecture reviews a platform that facilitates building an ML pipeline in production at scale.
2020 Machine Learning Systems: Scalable Machine Learning
tl;dr: This lecture introduces variations of gradient descent and ideas for scaling it up using parallel computing.
2020 Machine Learning Systems: Distributed Machine Learning
tl;dr: This lecture introduces how to scale up deployment (over multiple nodes) to speed up training and inference.
2020 Machine Learning Systems: Recurrent Neural Networks
tl;dr: This lecture studies RNN and LSTM architectures for predicting rare events.
2020 Machine Learning Systems: Intrinsic Dimension
tl;dr: This lecture introduces the concept of intrinsic dimension and its implications for model compression.