10.18: Neural Networks: Backpropagation Part 5 – The Nature of Code

In this video, I implement the formulas for “gradient descent” and adjust the bias in the train() function of my “toy” JavaScript neural network library. I also test the library with a simple XOR dataset.
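The gist of this step can be sketched in plain JavaScript. This is a minimal, hedged sketch of a one-hidden-layer network trained on XOR with gradient descent, using plain arrays rather than the library's Matrix class; the names (`ToyNN`, `train`, `feedforward`, `learningRate`) are illustrative and not necessarily the toy library's actual API. The bias update in `train()` is the piece this video adds: the bias gradient is just the delta, with no input term.

```javascript
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

class ToyNN {
  constructor(nIn, nHidden, nOut) {
    const rand = () => Math.random() * 2 - 1;
    // Weights input->hidden and hidden->output, plus one bias per node.
    this.wIH = Array.from({ length: nHidden }, () => Array.from({ length: nIn }, rand));
    this.wHO = Array.from({ length: nOut }, () => Array.from({ length: nHidden }, rand));
    this.bH = Array.from({ length: nHidden }, rand);
    this.bO = Array.from({ length: nOut }, rand);
    this.learningRate = 0.1;
  }

  feedforward(input) {
    const hidden = this.bH.map((b, j) =>
      sigmoid(b + this.wIH[j].reduce((s, w, i) => s + w * input[i], 0)));
    const output = this.bO.map((b, k) =>
      sigmoid(b + this.wHO[k].reduce((s, w, j) => s + w * hidden[j], 0)));
    return { hidden, output };
  }

  train(input, target) {
    const lr = this.learningRate;
    const { hidden, output } = this.feedforward(input);
    // Output deltas: (target - output) * sigmoid'(output).
    const dO = output.map((o, k) => (target[k] - o) * o * (1 - o));
    // Hidden deltas: errors back-propagated through the output weights.
    const dH = hidden.map((h, j) =>
      this.wHO.reduce((s, row, k) => s + row[j] * dO[k], 0) * h * (1 - h));
    // Adjust weights by delta * activation, and biases by delta alone.
    dO.forEach((d, k) => {
      this.wHO[k] = this.wHO[k].map((w, j) => w + lr * d * hidden[j]);
      this.bO[k] += lr * d;
    });
    dH.forEach((d, j) => {
      this.wIH[j] = this.wIH[j].map((w, i) => w + lr * d * input[i]);
      this.bH[j] += lr * d;
    });
  }
}

// Stochastic gradient descent on the XOR dataset: pick one random
// example per step, as discussed in the video.
const data = [
  { input: [0, 0], target: [0] },
  { input: [0, 1], target: [1] },
  { input: [1, 0], target: [1] },
  { input: [1, 1], target: [0] },
];
const nn = new ToyNN(2, 4, 1);
for (let i = 0; i < 50000; i++) {
  const d = data[Math.floor(Math.random() * data.length)];
  nn.train(d.input, d.target);
}
data.forEach(d =>
  console.log(d.input, '->', nn.feedforward(d.input).output[0].toFixed(2)));
```

Run with Node.js; after training, the four XOR outputs should sit near 0 or 1. Because training starts from random weights and samples randomly, exact numbers vary between runs.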

This video is part of Chapter 10 of The Nature of Code.

This video is also part of session 4 of my Spring 2017 ITP "Intelligence and Learning" course.

Support this channel on Patreon:
To buy Coding Train merchandise:
To donate to the Processing Foundation:

Send me your questions and coding challenges!:

Contact:
Twitter:
The Coding Train website:

Links discussed in this video:
The Coding Train on Amazon:
Deeplearn.js:
Stochastic Gradient Descent on Wikipedia:

Videos mentioned in this video:
My Neural Networks series:
3Blue1Brown Neural Networks playlist:

Source Code for all the Video Lessons:

p5.js:
Processing:

The Nature of Code playlist:
For More Coding Challenges:
For More Intelligence and Learning:

