|
I didn't read any of the posts in this thread, but I did want to chime in and say that I never think about anything.
|
# ? Sep 18, 2016 19:25 |
|
emoji posted: It's just minimizing error functions using straightforward curve-fitting math. The high-level concepts were published in CS papers decades ago and basically outline strategies for applying gradient descent a bunch of times really fast in various combinations. The difference now is that computers are faster and we have user-friendly libraries and better input data. The internal states of the trained model AKA the 'thinking' parts are meaningless gibberish, just tables of coefficients for reducing data in highly specific input contexts, and they just happen to work better than most people expected for certain types of problems but don't fundamentally bring us closer to true AI, OP.
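For what it's worth, the "minimizing error functions with gradient descent" part of that post can be sketched in a few lines. This is just an illustrative toy (all the names like `fit_line`, `lr`, and `steps` are made up for this sketch, not from any real library): fit y = w*x + b to data by repeatedly nudging the coefficients against the gradient of the squared error.

```python
def fit_line(xs, ys, lr=0.01, steps=5000):
    """Toy gradient descent: minimize mean squared error of y = w*x + b."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # Step downhill on the error surface.
        w -= lr * dw
        b -= lr * db
    return w, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # exactly y = 2x + 1
w, b = fit_line(xs, ys)
```

Neural nets are basically this with millions of coefficients and chained nonlinear functions instead of one line, which is why the trained weights read as gibberish: they're just the numbers the descent happened to land on.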
|
# ? Sep 18, 2016 21:06 |