Flamethrower Core: Neural Networks

Course Introduction
  Welcome to the Course
  Why Build Your Own Deep Learning Library?
  Tools and Workflows for Deep Learning
  Components of a Deep Learning Library
  Why Do We Care So Much About Neural Networks?
  The Universal Approximation Theorem for Neural Networks
  The Curse of Dimensionality (22:54)

Neural Network Module
  The History of Neural Networks
  The Module Base Class (19:54)
  Practicalities: Using Logging
  Parameter Initialization Strategies - Xavier Initialization (19:13)
  Parameter Initialization Strategies - Glorot Uniform Initialization (6:10)
  Parameter Initialization Strategies - The Implementation (11:24)
  The Linear Layer (11:41)
  Regression and Classification (10:35)
  Activation Functions (17:35)
  Statistical Estimators, Underfitting, and Overfitting (23:28)
  Regularization for Better Generalization (25:48)
  Regularization - Label Smoothing (29:16)
  Regularization - The Elastic Net (8:20)
  Implementing Regularization (7:04)
  A Drop of Dropout - Part 1 (4:04)
  A Drop of Dropout - Part 2 (22:50)
  Implementing Dropout (8:38)
  A Batch of Batch Norm (18:08)
  Implementing Batch Norm (6:28)

Resources, References, and Course Credits
  Resources and References
  Course Credits