|
for those of you who haven’t worked with machine learning, here’s how it works:

* you take a bunch of pictures of shoes or horses or whatever the gently caress you want to recognize
* then reduce those images to some ridiculously low-resolution thing
* then you cram those images through some matrices until you get some matrix that happens to tell you horse or not horse
* nobody knows why this matrix is horse, but it seems to work, so you’re like, cool, I made a horse model

now you take your horse model and set out to detect some horses. you take some pics and run them through your model and it’s all here a horse, there a horse, no horse there. but then there’s one picture of a horse and your model says no, not horse. and you have no idea why it’s not horse. the matrix is just a bunch of 8-bit values, it doesn’t even look like a horse.

the answer, which you will never discover, is that the picture has a reflection of the sun in the water under the horse, which is not where the sun goes, and your model couldn’t deal with that. and then you realize that your model doesn’t understand poo poo, it hasn’t learned anything. it is incapable of learning, it’s a loving filter.
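the steps above, as a toy numpy sketch. the weight matrices here are random (a real model would get them from training), and the 16x16 resolution and the horse threshold are made up for illustration, but this is structurally the whole "model":

```python
import numpy as np

rng = np.random.default_rng(0)

# a "picture of a horse": pretend we already reduced it to 16x16 grayscale
image = rng.random((16, 16))

# the model is literally just a couple of matrices (random here;
# training would nudge these until the output happens to say horse)
x = image.reshape(-1)               # flatten to a 256-vector
w1 = rng.standard_normal((256, 32))
w2 = rng.standard_normal((32, 1))

# cram the image through the matrices
hidden = np.maximum(x @ w1, 0)      # matrix multiply, clamp negatives to zero
score = (hidden @ w2).item()        # one number out

# threshold it: horse or not horse. nobody can tell you *why*.
print("horse" if score > 0 else "not horse")
```

there is no step where anything "understands" horse; it’s arithmetic on the pixels all the way down, which is why the sun-in-the-water failure is undebuggable from the matrices alone.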
|
# ¿ May 30, 2019 06:12 |
|
btw, the ML processors that are supposedly the next magical computing wave are just parallel arithmetic engines that operate on matrices. many of them only support 8-bit arithmetic. the point is: whenever you read some article hyperventilating about TPUs and ML chips, you can mentally replace them with floating point units or vector instructions and see if the article still makes sense.
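to be concrete about "parallel arithmetic engines that operate on matrices": the core op those chips accelerate is basically this, an 8-bit matrix multiply accumulating into wider integers. numpy sketch, not any particular chip’s actual instruction set:

```python
import numpy as np

rng = np.random.default_rng(1)

# two matrices of 8-bit values, like a quantized model's weights and inputs
a = rng.integers(-128, 128, size=(4, 8), dtype=np.int8)
b = rng.integers(-128, 128, size=(8, 3), dtype=np.int8)

# the whole magical accelerator op: multiply-accumulate, with products
# summed in a wider type so the 8-bit values don't overflow
c = a.astype(np.int32) @ b.astype(np.int32)

print(c.shape)  # (4, 3)
```

that’s it. the hardware just does a lot of these multiply-accumulates at once, which is also what a vector unit does.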
|
# ¿ May 30, 2019 15:39 |
|
Cybernetic Vermin posted:

> google along with a lot of other actors have realized that a lot of successful ml simply involves ridiculous amounts of semi-skilled man-hours tweaking parameters and topologies randomly until the thing seems to do the thing, so it is worth it to try to capture more hands early on.

I’m not an academic type, but if I’m reading this correctly you’re saying that ML is horseshit, and I can confirm this from practical experience.
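the "tweaking parameters and topologies randomly until the thing seems to do the thing" part is, unironically, a real technique (random search). a sketch of what those semi-skilled man-hours amount to; `try_model` here is a made-up stand-in for an actual training run, which in real life costs hours of GPU time per call:

```python
import random

random.seed(0)

def try_model(layers, width, lr):
    # stand-in for "train the thing and see if it does the thing"
    return random.random() - 0.01 * layers + 0.001 * width - abs(lr - 0.01)

best = None
for _ in range(50):
    config = {
        "layers": random.randint(1, 8),         # topology, picked randomly
        "width": random.choice([32, 64, 128]),  # more topology, also random
        "lr": 10 ** random.uniform(-4, -1),     # a parameter, random again
    }
    score = try_model(**config)
    if best is None or score > best[0]:
        best = (score, config)

print("seems to do the thing:", best[1])
```

nobody in the loop needs to understand why the winning config wins, which is rather the point.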
|
# ¿ Jun 1, 2019 06:21 |