A machine learning demonstration using Chrome's iconic "No Internet" dinosaur game. Watch 100 AI-controlled dinosaurs learn to avoid obstacles through genetic algorithms and neural networks—no manual programming required.
Each bar represents one generation. Colors show the proportional contribution of each rank tier to the total population score, sorted from highest (bottom) to lowest (top). Watch how the gold dinos come to dominate as evolution progresses.
| Rank | Color | Alive | Avg Score | Best Score |
|---|---|---|---|---|
| *No data yet. Start the simulation.* | | | | |
This interactive demonstration recreates Google Chrome's offline dinosaur game as a testbed for neuroevolution, which combines neural networks with genetic algorithms to evolve game-playing AI. Rather than training through backpropagation, the population improves through natural selection over generations.
Each dinosaur has its own neural network "brain" that controls its decisions to jump or duck based on what it sees.
Each dino's brain is a simple feedforward neural network with:
The network starts with random weights, so generation 1 performs terribly. But through evolution, successful patterns emerge.
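A brain like the one described above can be sketched as a small feedforward network with sigmoid activations and uniform random initial weights. The layer sizes, input features, and decision threshold below are illustrative assumptions, not the demo's actual values:

```typescript
// Minimal feedforward "brain": inputs -> hidden layer -> 2 outputs (jump, duck).
// Layer sizes, input features, and the 0.5 threshold are illustrative assumptions.
class DinoBrain {
  constructor(
    public w1: number[][], // hidden x inputs weight matrix
    public w2: number[][], // outputs x hidden weight matrix
  ) {}

  // Uniform random weights in [-1, 1), as used for generation 1.
  static random(inputs: number, hidden: number, outputs: number): DinoBrain {
    const rand = (rows: number, cols: number) =>
      Array.from({ length: rows }, () =>
        Array.from({ length: cols }, () => Math.random() * 2 - 1));
    return new DinoBrain(rand(hidden, inputs), rand(outputs, hidden));
  }

  private static sigmoid(x: number): number {
    return 1 / (1 + Math.exp(-x));
  }

  // One dense layer: weighted sum of inputs, then sigmoid activation.
  private static layer(w: number[][], input: number[]): number[] {
    return w.map(row =>
      DinoBrain.sigmoid(row.reduce((sum, wi, i) => sum + wi * input[i], 0)));
  }

  // Pick the action whose output neuron fires strongest above the threshold.
  decide(inputs: number[]): "jump" | "duck" | "run" {
    const hidden = DinoBrain.layer(this.w1, inputs);
    const [jump, duck] = DinoBrain.layer(this.w2, hidden);
    if (jump > 0.5 && jump >= duck) return "jump";
    if (duck > 0.5) return "duck";
    return "run";
  }
}
```

Inputs would be whatever the dino "sees" each frame (for example, distance to the next obstacle and current game speed); the network is evaluated once per frame to choose an action.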
After all 100 dinos die, a new generation is created through natural selection:
This process mimics biological evolution: successful traits propagate, unsuccessful ones die out, and mutations introduce variety.
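The generation step can be sketched roughly as follows, using elitism plus Gaussian mutation as described later in this page. The elite fraction, mutation rate, and noise scale are illustrative assumptions:

```typescript
// One generation step: rank by fitness, keep elites unchanged, and refill the
// population with mutated copies of elites. Fractions and noise scale are
// illustrative assumptions, not the demo's actual parameters.
interface Dino {
  weights: number[]; // flattened neural network weights
  fitness: number;   // survival time in frames
}

// Box-Muller transform: one standard normal sample for Gaussian mutation noise.
function gaussian(): number {
  const u = 1 - Math.random();
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Each weight mutates with probability `rate`, shifted by Gaussian noise.
function mutate(weights: number[], rate: number, scale: number): number[] {
  return weights.map(w => (Math.random() < rate ? w + gaussian() * scale : w));
}

function nextGeneration(pop: Dino[], eliteFrac = 0.1): Dino[] {
  const ranked = [...pop].sort((a, b) => b.fitness - a.fitness);
  const elites = ranked.slice(0, Math.max(1, Math.floor(pop.length * eliteFrac)));
  // Elites survive unchanged (elitism preserves the best strategies).
  const next: Dino[] = elites.map(d => ({ weights: [...d.weights], fitness: 0 }));
  while (next.length < pop.length) {
    const parent = elites[Math.floor(Math.random() * elites.length)];
    next.push({ weights: mutate(parent.weights, 0.1, 0.5), fitness: 0 });
  }
  return next;
}
```

Because elites are copied verbatim, the best score can never regress between generations, while the mutated offspring keep exploring nearby strategies.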
Several patterns typically emerge:
Watch the dinos improve over generations. Early generations fail quickly as random neural weights produce poor decisions. As evolution progresses, successful jump timing emerges. Eventually, top performers can last thousands of frames. The gold-ranked dinos represent the best evolved strategies, while lower ranks show the population's diversity. Game speed increases over time, creating selection pressure that favors increasingly sophisticated obstacle avoidance behaviors.
The neural network architecture uses standard feedforward propagation with sigmoid activation functions. Weight initialization follows uniform random distribution, and mutations apply Gaussian noise to existing weights. The genetic algorithm implements elitism-based selection to preserve top performers while maintaining population diversity through stratified mutation rates. Fitness is measured purely by survival time, creating a clear optimization objective. This demonstrates core machine learning concepts: function approximation, gradient-free optimization, and the exploration-exploitation tradeoff.
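One concrete reading of "stratified mutation rates" is that lower-ranked survivors mutate more aggressively than elites, trading exploitation for exploration. The tier boundaries and rates below are assumptions for illustration:

```typescript
// Illustrative stratified mutation schedule: mutation strength grows with rank
// tier, so top performers change little while lower tiers explore more.
// Tier boundaries and rates are assumptions, not the demo's actual values.
function mutationRateForRank(rank: number, popSize: number): number {
  const tier = rank / popSize; // 0 = best performer, approaching 1 = worst
  if (tier < 0.1) return 0.02; // elites: tiny perturbations
  if (tier < 0.5) return 0.1;  // middle tiers: moderate exploration
  return 0.3;                  // bottom tiers: aggressive mutation
}
```

This keeps the population diverse without risking the evolved behavior of the gold-ranked dinos.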