https://www.uni-math.gwdg.de/aufzeichnungen/klein-scans/klein/
I hope you enjoy it, folks. It would be nice to see an English translation somewhere; same for his encyclopedia.
Hi guys,
I have been learning machine learning and found something a bit confusing.
When I studied physics, I learned the method of least squares for finding the best parameters for given data: we assume we know the equation and just minimize the error. So if it is a straight-line model...
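For concreteness, this is the kind of numerical example I mean: a least-squares fit of a straight line y = a·x + b, with made-up data (the numbers below are invented for illustration):

```python
import numpy as np

# Made-up data roughly following y = 2x + 1 with a little noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix for the model y = a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Solve the least-squares problem min ||A p - y|| for p = (a, b)
a, b = np.linalg.lstsq(A, y, rcond=None)[0]

print(a, b)  # slope and intercept, close to 2 and 1
```

The same answer falls out of the normal equations (AᵀA)p = Aᵀy by hand, which is exactly the "minimize the error" step from physics class.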
I am not saying that theory is bad or unnecessary. What I am looking for is a numerical example.
The Schrödinger equation is fine, but once you compute the orbitals of the hydrogen atom you get a much better understanding.
I don't understand why it is bad to ask for numerical examples and numbers...
The problem is that this looks like a magic thing, and I don't know why it is "hidden" behind the jargon: "deep learning", "encoder", "decoder", "tokenized input embedding", "multi-head self-attention", "layer normalization", "feed-forward network", "residual connection"... and all that.
https://en.wikipedia.org/wiki/Crookes_radiometer
Would it rotate faster if I used vanes made of different materials, where one of them exhibits the photoelectric effect?
I have read that transformers are the key behind the recent success in artificial intelligence, but the problem is that they are quite opaque.
I wonder if anybody knows how to build and train one from scratch or if there is any book, video, or website explaining it.
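To demystify at least one of those terms, here is a single scaled dot-product self-attention head (the core of "multi-head self-attention") in plain NumPy. This is only a sketch with made-up sizes and random weights, not a trained transformer:

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, d_model = 4, 8                  # made-up sizes: 4 tokens, 8-dim embeddings
X = rng.normal(size=(seq_len, d_model))  # stand-in for the token embeddings

# Projection matrices (random here; learned during training in a real model)
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
scores = Q @ K.T / np.sqrt(d_model)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
out = weights @ V  # each token's output is a weighted mix of all value vectors

print(out.shape)  # (4, 8): one output vector per input token
```

Everything else in the architecture (layer normalization, feed-forward blocks, residual connections) wraps around this operation, so having the numbers in front of you for this one step already removes a lot of the "magic".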
Thanks
Enter here:
https://www.17centurymaths.com/
And look for this:
A translation of Euler's Methodus Inveniendi Lineas Curvas Maximi Minimive Gaudentes… is now complete, i.e. the Foundations of the Calculus of Variations, and includes E296 & E297, which explain rather fully the changed view...