In this course, you will learn how to build your own neural network optimizers so that you can train your neural networks on data. This course is Part 3 in the Flamethrower Core series. You'll learn about the core concepts of neural network optimization, such as loss functions, gradient descent, learning rates, and hyperparameter search. You'll also learn about the mathematical and theoretical underpinnings of these strategies, such as Maximum Likelihood Estimation and Maximum A Posteriori estimation. As each concept is introduced, you'll implement it in your library and test it on real-world data. View the entire course syllabus below, along with preview lessons. Be sure to click the drop-down arrow to see the syllabus in its entirety.
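To give a taste of the kind of optimizer you'll build, here is a minimal sketch of gradient descent on a one-dimensional quadratic loss. The function name, learning rate, and step count are illustrative choices, not code from the course or the Flamethrower library:

```python
# A minimal sketch of gradient descent on the loss L(w) = (w - 3)^2,
# whose gradient is 2 * (w - 3). The minimum sits at w = 3.

def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to minimize a loss."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move opposite the gradient direction
    return w

w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_star, 4))  # converges toward the minimum at w = 3
```

The learning rate `lr` controls the step size: too small and convergence is slow, too large and the iterates can overshoot or diverge, which is why the course treats it as a hyperparameter worth searching over.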

Course Curriculum

  Course Introduction
  Why Do We Care So Much About Neural Networks?
  The Optimization Module
  Resources, References, and Course Credits

Select a pricing plan and sign up

$99 for unlimited access to this course