This class introduces the concepts and practices of deep learning. The course consists of three parts. In the first part, we give a quick introduction to classical machine learning and review some key concepts required to understand deep learning. In the second part, we discuss how deep learning differs from classical machine learning and explain why it is effective in dealing with complex problems such as image and natural language processing. Various CNN and RNN models will be covered. In the third part, we introduce deep reinforcement learning and its applications.
This course also includes coding labs. We will use Python 3 as the main programming language throughout the course. Popular machine learning libraries such as Scikit-learn and TensorFlow 2.0 will be used and explained in detail.
Yung-Cheng Chen
陳永承*
Yen-Ting Wang
王彥婷
Jou-Hsuan Yang
楊柔暄
Sheng-Wei Cheng
鄭勝偉
Wei-Hung Chang
張維紘
This course is intended for senior undergraduate and junior graduate students who have a proper understanding of calculus, linear algebra, probability, and basic Python programming; these prerequisites are reviewed in the first lectures and labs.
If you have any feedback, feel free to contact: shwu [AT] cs.nthu.edu.tw
What's ML? | What's Deep Learning? | About This Course | FAQ
This lab guides you through the setup of a scientific Python environment and provides useful references for self-study.
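If you want to check your setup before the lab, a minimal sanity check like the following (assuming NumPy, Pandas, and Matplotlib have been installed, e.g., via pip or conda) confirms that the core packages import correctly:

```python
# Minimal sanity check for a scientific Python environment.
import sys
import numpy as np
import pandas as pd
import matplotlib

print("Python:", sys.version.split()[0])
print("NumPy:", np.__version__)
print("Pandas:", pd.__version__)
print("Matplotlib:", matplotlib.__version__)
```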
Span & Linear Dependence | Norms | Eigendecomposition | Singular Value Decomposition | Traces & Determinants
This lab guides you through the process of Exploratory Data Analysis (EDA) and discusses how you can leverage Principal Component Analysis (PCA) to visualize and understand high-dimensional data.
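As a preview of what the lab covers, here is a minimal PCA sketch using Scikit-learn, with the built-in Iris data standing in for the lab's dataset:

```python
# Project 4-D Iris data onto its first two principal components for visualization.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)   # PCA is sensitive to feature scales

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_std)
print("Explained variance ratio:", pca.explained_variance_ratio_)

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.show()
```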
Random Variables & Probability Distributions | Multivariate & Derived Random Variables | Bayes' Rule & Statistics | Principal Component Analysis | Technical Details of Random Variables | Common Probability Distributions | Common Parametrizing Functions | Information Theory | Decision Trees & Random Forest
In this lab, we will apply the Decision Tree and Random Forest algorithms to classification and dimension reduction problems using the Wine dataset.
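A minimal sketch of the idea, using Scikit-learn's bundled copy of the Wine dataset (the lab walks through a fuller analysis):

```python
# Fit a Decision Tree and a Random Forest on the Wine dataset and compare accuracy.
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("Tree accuracy:  ", tree.score(X_test, y_test))
print("Forest accuracy:", forest.score(X_test, y_test))
# Feature importances can also drive (supervised) dimension reduction.
print("Top forest feature importances:", forest.feature_importances_.round(2))
```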
Numerical Computation | Optimization Problems | Gradient Descent | Newton's Method | Stochastic Gradient Descent | Perceptron | Adaline | Constrained Optimization | Linear & Polynomial Regression | Generalizability & Regularization | Duality
In this lab, we will guide you through the implementation of Perceptron and Adaline, two of the earliest machine learning algorithms for classification. We will also discuss how to train these models using optimization techniques.
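For a taste of what the implementation looks like, here is a minimal NumPy sketch of the perceptron learning rule on hypothetical toy data (the lab develops a fuller, class-based version):

```python
# Perceptron learning rule on linearly separable toy data, in plain NumPy.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # labels in {-1, +1}

w = np.zeros(2)
b = 0.0
eta = 0.1                                    # learning rate

for epoch in range(10):
    errors = 0
    for xi, target in zip(X, y):
        pred = np.where(w @ xi + b >= 0.0, 1, -1)
        update = eta * (target - pred)       # zero when the prediction is correct
        w += update * xi
        b += update
        errors += int(update != 0.0)
    if errors == 0:                          # converged: all samples classified correctly
        break

print("Learned weights:", w, "bias:", b)
```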
This lab guides you through linear and polynomial regression using the Housing dataset. We will also extend the Decision Tree and Random Forest classifiers to solve regression problems.
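A minimal sketch of polynomial regression with a Scikit-learn pipeline, shown here on synthetic data rather than the Housing dataset:

```python
# Fit a degree-2 polynomial to noisy synthetic data with a Scikit-learn pipeline.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(100, 1)), axis=0)
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + rng.normal(scale=0.5, size=100)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```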
Learning Theory | Point Estimation | Bias & Variance | Consistency | Decomposing Generalization Error | Weight Decay | Validation
In this lab, we will guide you through some common regularization techniques, such as weight decay, sparse weights, and validation.
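In its simplest form, weight decay is L2 regularization; here is a minimal Scikit-learn sketch contrasting it with unregularized regression on synthetic data:

```python
# Compare ordinary least squares with L2-regularized (ridge) regression.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))                # few samples, many features -> overfitting risk
y = X[:, 0] + rng.normal(scale=0.1, size=50)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)          # alpha controls the decay strength

print("OLS weight norm:  ", np.linalg.norm(ols.coef_))
print("Ridge weight norm:", np.linalg.norm(ridge.coef_))  # shrunk toward zero
```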
In this competition, you are provided with a supervised dataset consisting of the raw content and binary popularity of news articles. Your task is to learn a function that predicts the popularity of an unseen news article.
Probabilistic Models | Maximum Likelihood Estimation | Linear Regression | Logistic Regression | Maximum A Posteriori Estimation | Bayesian Estimation and Inference | Gaussian Process
In this lab, we will guide you through the practice of Logistic Regression. We will also introduce some common evaluation metrics other than the "accuracy" we have used so far.
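A minimal sketch of logistic regression together with metrics beyond accuracy, using Scikit-learn's bundled breast-cancer data as a stand-in for the lab's dataset:

```python
# Logistic regression on the breast-cancer dataset, evaluated beyond plain accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)
y_pred = clf.predict(X_test)
y_prob = clf.predict_proba(X_test)[:, 1]

print(classification_report(y_test, y_pred))  # precision, recall, F1 per class
print("ROC AUC:", roc_auc_score(y_test, y_prob))
```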
In this tutorial, you will learn the fundamentals of building neural networks without the help of deep learning frameworks, using only NumPy.
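A condensed sketch of the idea: a two-layer network with hand-coded backpropagation on toy data (the tutorial builds this up step by step):

```python
# A two-layer network trained by manually coded backpropagation, NumPy only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)   # XOR-like target

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))              # sigmoid output
    # Backward pass (gradients of the mean cross-entropy loss)
    dz2 = (p - y) / len(X)
    dW2 = h.T @ dz2;  db2 = dz2.sum(axis=0)
    dh = dz2 @ W2.T * (1 - h ** 2)                    # tanh derivative
    dW1 = X.T @ dh;   db1 = dh.sum(axis=0)
    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("Training accuracy:", ((p > 0.5) == y).mean())
```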
We are going to use TensorFlow as our framework in the following lectures. In this lab, you will learn how to install TensorFlow and gain a better understanding of it by implementing a classical deep learning algorithm.
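After installing TensorFlow (e.g., with `pip install tensorflow`), a snippet like the following verifies the setup by fitting a minimal Keras model on random data:

```python
# Verify the TensorFlow installation and fit a minimal Keras model on random data.
import numpy as np
import tensorflow as tf

print("TensorFlow:", tf.__version__)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

X = np.random.rand(128, 4).astype("float32")
y = (X.sum(axis=1) > 2).astype("float32")
model.fit(X, y, epochs=3, verbose=1)
```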
KNNs | Parzen Windows | Local Models | Support Vector Classification (SVC) | Slacks | Nonlinear SVC | Dual Problem | Kernel Trick
In this lab, we will classify nonlinearly separable data using the KNN and SVM classifiers. We will show how to pack multiple data preprocessing steps into a single Pipeline in Scikit-learn to simplify the training workflow.
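A minimal sketch of such a Pipeline, using synthetic two-moons data as a stand-in for the lab's dataset:

```python
# Pack preprocessing and a kernel SVM into one Scikit-learn Pipeline.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)  # nonlinearly separable
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
pipe.fit(X_train, y_train)           # one call runs scaling, then SVC training
print("Test accuracy:", pipe.score(X_test, y_test))
```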
Cross Validation | How Many Folds? | Voting | Bagging | Boosting | Why Does AdaBoost Work?
In this lab, we will guide you through the cross-validation technique for hyperparameter selection. We will also practice and compare some ensemble learning techniques.
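A minimal sketch of hyperparameter selection by k-fold cross validation, using Scikit-learn's GridSearchCV on the Wine dataset:

```python
# Select SVM hyperparameters by 5-fold cross validation with GridSearchCV.
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
pipe = make_pipeline(StandardScaler(), SVC())
param_grid = {"svc__C": [0.1, 1, 10], "svc__gamma": [0.01, 0.1, 1]}

search = GridSearchCV(pipe, param_grid, cv=5)   # 5-fold cross validation
search.fit(X, y)
print("Best params:", search.best_params_)
print("Best CV accuracy:", search.best_score_)
```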
NN Basics | Learning the XOR | Back Propagation | Cost Function & Output Neurons | Hidden Neurons | Architecture Design & Tuning
Momentum & Nesterov Momentum | AdaGrad & RMSProp | Batch Normalization | Continuation Methods & Curriculum Learning | NTK-based Initialization | Cyclic Learning Rates | Weight Decay | Data Augmentation | Dropout | Manifold Regularization | Domain-Specific Model Design
In this lab, we will introduce a neural network called word2vec that embeds words into a dense vector space where semantically similar words are mapped to nearby points.
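For a quick feel of the result, here is a toy sketch using the gensim library as a stand-in for the lab's own implementation (the tiny corpus below is purely illustrative):

```python
# Train a toy word2vec model with gensim and query nearest neighbors.
from gensim.models import Word2Vec

sentences = [["deep", "learning", "is", "fun"],
             ["machine", "learning", "is", "useful"],
             ["deep", "neural", "networks", "learn", "features"]]

model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, sg=1, epochs=50)
print(model.wv.most_similar("learning", topn=2))   # semantically nearby words
```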
Convolution Layers | Pooling Layers | Variants & Case Studies | Visualizing Activations | Visualizing Filters/Kernels | Visualizing Gradients | Dreaming and Style Transfer | Segmentation and Localization | Object Detection | More Applications
In this lab, we will introduce two datasets, MNIST and CIFAR-10, and show how to implement CNN models for them using TensorFlow. We will also offer a guide to the typical input pipeline of TensorFlow 2.0.
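A condensed sketch of what the lab builds: a small CNN on MNIST fed by a typical tf.data pipeline:

```python
# A small CNN on MNIST with a typical tf.data input pipeline (TensorFlow 2.x).
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = (x_train / 255.0)[..., None].astype("float32")   # add channel dimension
x_test = (x_test / 255.0)[..., None].astype("float32")

train_ds = (tf.data.Dataset.from_tensor_slices((x_train, y_train))
            .shuffle(10000).batch(64).prefetch(tf.data.AUTOTUNE))

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=1)
print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```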
This lab guides you through loading and using a pretrained VGG19 model and visualizing what a CNN has learned in selected layers. It also introduces an interesting technique called "Style Transfer" and displays galleries of its creative outputs. Last but not least, we will demonstrate how to save and load models during training and briefly explain the TensorFlow family.
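A minimal sketch of the first step: loading a pretrained VGG19 from Keras and exposing one layer's activations for visualization (the layer choice here is illustrative):

```python
# Load a pretrained VGG19 and expose an intermediate layer's activations,
# as used for visualization and style transfer.
import tensorflow as tf

vgg = tf.keras.applications.VGG19(weights="imagenet", include_top=False)
vgg.trainable = False

# Feature extractor for one early convolutional layer.
feat = tf.keras.Model(inputs=vgg.input,
                      outputs=vgg.get_layer("block1_conv1").output)
x = tf.random.uniform((1, 224, 224, 3))   # stand-in for a preprocessed image
print("Activation shape:", feat(x).shape)
```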
In this competition, you should design a model to detect multiple objects in an image. Object detection is a multi-task learning problem, which means the model has to localize and classify several objects simultaneously.
Vanilla RNNs | Design Alternatives | Backprop through Time (BPTT) | Optimization Techniques | Optimization-Friendly Models & LSTM | Parallelism & Teacher Forcing | Attention | Explicit Memory | Adaptive Computation Time (ACT) | Visualization | Memory Networks | Google Neural Machine Translation | Transformers | Subword Tokenization
This lab guides you through using recurrent neural networks to model sequential data such as natural language, and shows how to apply them not only to article comprehension but also to word generation.
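A skeleton of such a model in Keras, with hypothetical vocabulary and sequence sizes and dummy data in place of a real corpus:

```python
# A character-level LSTM language-model skeleton in Keras (TensorFlow 2.x).
import tensorflow as tf

vocab_size, seq_len = 80, 40   # assumed sizes, for illustration only

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),  # next-token distribution
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy batch: in practice, each target sequence is the input shifted by one step.
x = tf.random.uniform((32, seq_len), maxval=vocab_size, dtype=tf.int32)
y = tf.random.uniform((32, seq_len), maxval=vocab_size, dtype=tf.int32)
model.fit(x, y, epochs=1, verbose=0)
print(model(x).shape)   # (32, 40, 80): one distribution per time step
```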
In this lab, we introduce how to design a model that, given an image, generates a suitable caption describing it. To accomplish this, you'll use an attention-based model, which enables us to see which parts of the image the model focuses on as it generates the caption.
Text Models & Image Models | Clustering | Factorization | Dimension Reduction | ChatGPT | Autoencoders & Manifold Learning | Variational Autoencoders (VAE) | Flow-based Models | Diffusion Models | Generative Adversarial Networks (GANs)
In this lab, we are going to introduce Diffusion Models.
In this competition, given a set of texts, your task is to generate suitable images to illustrate each of them. We will guide you through using GANs to complete this competition.
Markov Decision Process (MDP) | Model-Free RL using Monte Carlo Estimation | Temporal-Difference Estimation and SARSA | Exploration Strategies | Q-Learning
In this lab, we will introduce temporal-difference learning and then use Q-learning to train an agent to play the "Flappy Bird" game.
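A minimal, self-contained sketch of tabular Q-learning on a toy "corridor" environment (not the lab's Flappy Bird setup):

```python
# Tabular Q-learning on a tiny corridor: move left/right, reach the rightmost
# cell for reward. A minimal sketch, not the lab's code.
import numpy as np

n_states, n_actions = 8, 2            # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.99, 0.1    # step size, discount, exploration rate
rng = np.random.default_rng(0)

for episode in range(500):
    s = 0
    while True:
        # Epsilon-greedy action selection
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
        done = s_next == n_states - 1
        r = 1.0 if done else 0.0
        # Q-learning update: bootstrap from the greedy value of the next state
        target = r + (0.0 if done else gamma * Q[s_next].max())
        Q[s, a] += alpha * (target - Q[s, a])
        s = s_next
        if done:
            break

print("Greedy policy (0=left, 1=right):", Q.argmax(axis=1))
```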
Introduction | Deep Q-Network | Double DQN | Prioritized Replay | Dueling Network | NoisyNet and Scalable Implementations | Policy Gradient Methods & DDPG | Episodic Policy Gradient & REINFORCE | Reducing Variance | Baseline Subtraction | Function Approximation, Actor-Critic, and A3C
In this lab, we will introduce PPO with Generalized Advantage Estimation (PPOxGAE) and use it to train frame-based and state-based agents to play the "Flappy Bird" game.
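For reference, a minimal NumPy sketch of the GAE computation on a single trajectory (the rewards and value estimates below are hypothetical rollout outputs, not the lab's code):

```python
# Generalized Advantage Estimation (GAE) for one trajectory, in NumPy.
import numpy as np

def compute_gae(rewards, values, gamma=0.99, lam=0.95):
    """`values` has one extra entry: the value estimate of the final state."""
    advantages = np.zeros(len(rewards))
    gae = 0.0
    for t in reversed(range(len(rewards))):
        delta = rewards[t] + gamma * values[t + 1] - values[t]   # TD error
        gae = delta + gamma * lam * gae   # exponentially weighted sum of TD errors
        advantages[t] = gae
    return advantages

rewards = np.array([0.0, 0.0, 1.0])
values = np.array([0.5, 0.6, 0.7, 0.0])   # V(s_0..s_2) plus bootstrap V(s_3)
print(compute_gae(rewards, values))
```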
The following provides links to some useful online resources. If this course starts your ML journey, don't stop here. Enroll in the advanced courses shown below to learn more.
For more course materials (such as assignments, score sheets, etc.) and the online forum, please refer to the eeclass system.
Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, MIT Press, 2016, ISBN: 0262035618
Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition, Springer, 2009, ISBN: 0387848576
Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006, ISBN: 0387310738
Sebastian Raschka, Python Machine Learning, Packt Publishing, 2015, ISBN: 1783555130