In this video, I move beyond the Simple Perceptron and discuss what happens when you connect multiple layers of perceptrons into a "fully connected network" for machine learning.
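As a minimal sketch of the idea (this is my own illustrative code, not the library from the video): a single perceptron cannot compute XOR, but a network with one hidden layer of two perceptrons can. Here the weights and biases are hand-picked rather than learned, just to show how stacking layers adds expressive power.

```javascript
// Step activation: fires 1 if the weighted sum is non-negative.
function step(x) {
  return x >= 0 ? 1 : 0;
}

// A tiny 2-2-1 fully connected network computing XOR.
// Hidden perceptron h1 acts as OR, h2 acts as AND;
// the output perceptron computes (h1 AND NOT h2) = XOR.
function xorNetwork(x1, x2) {
  const h1 = step(1 * x1 + 1 * x2 - 0.5); // OR gate
  const h2 = step(1 * x1 + 1 * x2 - 1.5); // AND gate
  return step(1 * h1 - 1 * h2 - 0.5);     // h1 AND NOT h2
}

// Print the full truth table.
for (const [a, b] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
  console.log(`${a} XOR ${b} = ${xorNetwork(a, b)}`);
}
```

In practice the weights would be found by training (e.g. backpropagation) rather than written by hand; the point here is only that the hidden layer lets the network represent a function no single perceptron can.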
Next video:
This video is part of Chapter 10 of The Nature of Code (
This video is also part of session 4 of my Spring 2017 ITP “Intelligence and Learning” course (
Support this channel on Patreon:
To buy Coding Train merchandise:
To donate to the Processing Foundation:
Send me your questions and coding challenges!:
Contact:
Twitter:
The Coding Train website:
Links discussed in this video:
The Nature of Code:
Session 4 of Intelligence and Learning:
Perceptron on Wikipedia:
My Simple Artificial Neural Network JavaScript Library:
My video on AND and OR:
My video on Perceptrons:
kwichmann’s Learning XOR with a neural net:
Books discussed in this video:
Tariq Rashid’s Make Your Own Neural Network:
Marvin Minsky’s Perceptrons:
Source Code for all the Video Lessons:
p5.js:
Processing:
The Nature of Code playlist:
For More Coding Challenges:
For More Intelligence and Learning:
Help us caption & translate this video!