Quick survey of recent deep learning
- Lecture: S3-deepNNSurvey
- Version: current
- Please read: DNN Cheatsheets
- Recorded Videos: M1 + M2 + M3 + M4
Note: the following markdown text was generated automatically from the corresponding PowerPoint lecture file, so errors and misformatting do occur (a lot!).
This lecture covers 10 deep learning trends that go beyond classic machine learning (popular CNN, RNN, and Transformer models are not covered much here):
1. DNN on graphs / trees / sets
2. Neural Turing Machines (NTM) for program induction
3. Deep generative models / DeepFake
4. Deep reinforcement learning
5. Few-shot learning / Meta learning / AGI?
6. Pretraining workflow / Autoencoder / self-supervised training
7. Generative Adversarial Networks (GAN) workflow
8. AutoML workflow / Learning to optimize / to search architectures
9. Validate / Evade / Test / Verify / Understand DNNs
10. Model compression / EfficientNet
Disclaimer: it is quite hard to fit the important topics of deep learning into a single session. We aim to make the content reasonably digestible at an introductory level. We take a modular view, introducing key variables that break deep learning down into chunks covering data, model architecture, tasks, training workflows, and model characteristics. We believe this teaching style gives students context for those design choices and helps them build a much deeper understanding.