9.2: Genetic Algorithm: How it works – The Nature of Code

In part 2 of this genetic algorithm series, I explain how the concepts behind Darwinian natural selection are applied to a computational evolutionary algorithm.


23 responses to “9.2: Genetic Algorithm: How it works – The Nature of Code”

  1. Ahmad Haydar Avatar

    you're wasting my time in an educational way… and this satisfies both the lazy and the nerd in me.
    Nice shirt, btw.

  2. CrimsonYeti Avatar

    Why can you not be my teacher for literally everything?

  3. kevnar Avatar

    I once made a project with a certain population of entities in a closed space, and they all had a random value for "beauty", on a scale of one to ten. As they wandered around and bumped into each other, they would decide if they wanted to mate. But each entity only said "Yes!" if the potential partner was their own beauty level or better. The child they had was given a value for beauty based on one of the parents for heredity.

    Eventually, the entities grew old and died off. I also kept a running tally of the average beauty of the population overall.

    After running the simulation for a while, the average beauty slowly went up until everybody was absolutely gorgeous. 10 / 10. And there wasn't even a single 9 left anywhere. It made me wonder why this hasn't happened in real life human populations. And then it occurred to me, "Oh yeah. Alcohol."
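The simulation described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the original project's code: all names are illustrative, and the encounter and death rules are simplified to whole-generation turnover. Note that if *both* partners apply the "your level or better" rule, only equal-beauty pairs ever mate, so this sketch mainly illustrates the assortative-mating mechanic rather than reproducing the 10/10 outcome.

```python
import random

def wants_to_mate(own_beauty, partner_beauty):
    # An entity says "Yes!" only if the partner is at its own beauty level or better.
    return partner_beauty >= own_beauty

def run_beauty_sim(pop_size=100, generations=200, seed=42):
    rng = random.Random(seed)
    # Each entity is just a beauty score on a scale of 1 to 10.
    population = [rng.randint(1, 10) for _ in range(pop_size)]
    averages = []
    for _ in range(generations):
        next_gen = []
        # Entities bump into each other at random until a new generation is full.
        while len(next_gen) < pop_size:
            a, b = rng.sample(population, 2)
            if wants_to_mate(a, b) and wants_to_mate(b, a):
                # Heredity: the child takes its beauty value from one parent.
                next_gen.append(rng.choice([a, b]))
        population = next_gen  # the old generation dies off
        averages.append(sum(population) / pop_size)
    return population, averages
```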

  4. Sarthak Avatar

    i love this channel

  5. Bobo Yarnfeild Avatar

    Unicorns? Rainbows? Cats????! Has the world gone mad?

  6. davidghug Avatar

    Nice videos. In reality, evolution is more complex: for example, millions of cars driving on a road wear it down, so cars that were fit initially may no longer be fit after a while. The fitness function should therefore change gradually. Likewise, people who were fit in the ice age may not be the fittest after a significant period of global warming. Everything is interconnected, and the tread of the car wheels may gradually change too.

  7. Fedegas Avatar

    Damn, you're a great teacher

  8. Chad Pace Avatar

    I can see how some elements of human cognition would use a kind of genetic algorithm.

  9. Steven Avatar

    Great to see someone teaching coding with such passion, love your work!

  10. Sunny Beta Avatar

    I really wanted to know what the car did at 2:03 🙁

  11. Bence Sárosi Avatar

    Not as if it made any difference, but at around 12:04 the fitness for "pancake" is actually either 0 or 2 depending on whether you take character position into account or not. If you do, then "n" is in the wrong position, if you don't, then "c" is a match too. Great material, though.
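The two fitness variants this comment distinguishes can be sketched as follows (function names are illustrative, and no particular target phrase from the video is assumed):

```python
def fitness_positional(guess, target):
    # Count characters that match the target at the same index.
    return sum(g == t for g, t in zip(guess, target))

def fitness_anywhere(guess, target):
    # Count guess characters found anywhere in the target,
    # consuming each target character at most once.
    remaining = list(target)
    score = 0
    for ch in guess:
        if ch in remaining:
            remaining.remove(ch)
            score += 1
    return score
```

The two variants can score the same phrase differently, which is exactly the discrepancy the comment points out.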

  12. Mclean Cabaneros Avatar

    You're so funny and at the same time informative. Kudos!

  13. Gene Avatar

    Nice video. Keep it up

  14. Mark Littlewood Avatar

    Nice presentation, but you have not covered here how the new child is placed back into the population. I think you should have mentioned it here, presuming it gets covered in the upcoming code section.

  15. Olf Mombach Avatar

    After watching this I decided to try to code the algorithm without "spoilers" from Dan's actual code video. I made it in Python using pygame for the interface, and surprisingly the program ran successfully (!) without a single error on the first try. I decided to implement various crossover methods, and what I observed is that using the following "half-half" crossover:

    import math

    def crossover_half(parentA, parentB):
        mid = math.ceil(len(parentA) / 2)  # split point: first half from A, rest from B
        return parentA[:mid] + parentB[mid:]

    I got the result for "to be or not to be" in about 7 to 12 seconds. However, using the following "random" crossover:

    import random

    def crossover_random(parentA, parentB):
        res = ""
        for i in range(len(parentA)):
            # pick each character from either parent at random
            res += (parentA[i], parentB[i])[random.randrange(0, 2)]
        return res

    I achieved results always in less than 2 seconds. Also a very unexpected result was that using a "checker" kind of crossover:

    def crossover_checker(parentA, parentB):
        res = ""
        for i in range(len(parentA)):
            # alternate: even indices from parentA, odd indices from parentB
            if not i % 2:
                res += parentA[i]
            else:
                res += parentB[i]
        return res

    I *never achieved a result*. The population was almost uniform with bad solutions. The mutations had no major effect.

    All measurements were made using a population size of 400 and a mutation rate of 1%.

  16. Aniket Dembi Avatar

    ik im a bit l8 but Thiss issss AAAAMAAAAZINGGGGG

  17. Aashish Kumar Avatar

    Good series, you have explained it well. Keep up the good work.

  18. Smriti Sharma Avatar

    This person is amazing! Thanks for the video 🙂

  19. Francisco Hanna Avatar

    Hi Daniel! Great series of videos; they are helping me a lot. However, I think there's a mistake in the crossover explanation. In the 'single-point' crossover that you explain, I thought it produced two children as a result, in order to have N individuals again in the next generation. Am I correct?
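Single-point crossover that returns both complementary children, as this comment suggests, can be sketched like this (a hypothetical snippet, not the video's code):

```python
import random

def single_point_crossover(parentA, parentB, rng=random):
    # Pick one crossover point, then swap tails to get two complementary children.
    point = rng.randrange(1, len(parentA))  # keep at least one gene from each parent
    child1 = parentA[:point] + parentB[point:]
    child2 = parentB[:point] + parentA[point:]
    return child1, child2
```

Producing both children lets every mating pair be replaced one-for-one, keeping the population size constant; whether a given implementation keeps one child or both is a design choice.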

  20. fauxscot1 Avatar

    Excessively animated presentation gets in the way of clarity, for me anyway. Maybe throttle back a bit? Good info, nice effort, a tad tedious.

  21. Ehsanul Haque Avatar

    Can I make a timetable generator with a genetic algorithm?
