Oh, this is a neat demo. I took Geoff Hinton's neural networks course in university 15 years ago and he did spend a couple of lectures explaining Boltzmann machines.
> A Restricted Boltzmann Machine is a special case where the visible and hidden neurons are not connected to each other.
This wording is wrong; it implies that visible neurons are not connected to hidden neurons.
The correct wording is: visible neurons are not connected to each other and hidden neurons are not connected to each other.
Alternatively: visible and hidden neurons do not have internal connections within their own type.
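The restriction is easiest to see in the energy function: the only interaction term couples visible units with hidden units, so the connection graph is bipartite. A minimal sketch (function names and the tiny example values are my own, not from any particular library):

```python
import numpy as np

def rbm_energy(v, h, W, b_v, b_h):
    """Energy of an RBM configuration.

    The single v @ W @ h term couples visible to hidden; there is
    no v-v or h-h term, i.e. no connections within a layer.
    """
    return -(v @ W @ h) - (b_v @ v) - (b_h @ h)

# Toy configuration: 2 visible units, 1 hidden unit.
v = np.array([1.0, 0.0])
h = np.array([1.0])
W = np.array([[2.0], [0.5]])          # visible-to-hidden weights only
b_v = np.zeros(2)
b_h = np.zeros(1)

print(rbm_energy(v, h, W, b_v, b_h))  # -2.0: only the active v-h pair contributes
```

An unrestricted Boltzmann machine would add quadratic terms within each layer; dropping them is what makes the conditional distributions factorize and inference tractable.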
I'm a bit unclear on how that isn't just an MLP. What's different about a Boltzmann machine?
Edit: never mind, I didn't realize I needed to scroll up to get to the introductory overview.
What 0xTJ's [flagged][dead] comment says about it being undesirable to hijack or otherwise attempt to reinvent scrolling is spot on.
> I'm a bit unclear on how that isn't just a multi-layer perceptron. What's different about a Boltzmann machine?
In a Boltzmann machine, you alternate back and forth between using the visible units to stochastically activate the hidden units and using the hidden units to stochastically activate the visible units, rather than doing a single feed-forward pass.
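That alternation is just Gibbs sampling over the bipartite graph, and because each layer is conditionally independent given the other, a whole layer can be sampled at once. A rough NumPy sketch (the sizes and function names here are illustrative, not from the demo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny RBM: 6 visible units, 3 hidden units.
n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # visible-hidden weights only
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    # Hidden units are conditionally independent given the visible layer,
    # so the whole layer is sampled in one vectorized step.
    p = sigmoid(v @ W + b_h)
    return (rng.random(p.shape) < p).astype(float)

def sample_visible(h):
    # Same story in the other direction, using the transposed weights.
    p = sigmoid(h @ W.T + b_v)
    return (rng.random(p.shape) < p).astype(float)

# One round trip of the alternation: visible -> hidden -> visible.
v = rng.integers(0, 2, size=n_visible).astype(float)
h = sample_hidden(v)
v_reconstructed = sample_visible(h)
```

In an MLP the information flows one way; here the same weight matrix is used in both directions, and repeating this round trip is how the machine settles toward (and samples from) its learned distribution.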
> What 0xTJ's [flagged][dead] comment says about it being undesirable to hijack or otherwise attempt to reinvent scrolling is spot on.
The page should be considered a slideshow that is paged discretely and not scrollable continuously. And there should definitely be no scrolling inertia.