|
Rush Limbo posted:It's also the case in, say, video games and other AI simulations that they handle even fairly basic things like stairs incredibly badly, to the point where various workarounds are created, including just plain ignoring such changes in elevation from a purely AI standpoint and letting the animation system take over.

Video game "AI" is its own can of worms and has no relation to academic or industrial AI. For one thing, the goal in video game AI isn't to win. Creating a bot that beats the poo poo out of humans is trivial; the hard part is creating one that "acts" human without kicking the humans' rear ends.

The thing to keep in mind about AI is that we keep shifting the goalposts. By the standards of early computer scientists, Google, Wolfram Alpha, Alexa, and half a dozen other things are AI. I expect this will continue indefinitely: we will always view "AI" as something to come, never what we have now.
|
# ¿ Nov 28, 2016 23:42 |
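The "perfect bot is trivial, human-acting bot is hard" point can be sketched in a toy example: take a bot with exact aim and deliberately degrade it with reaction noise so it feels human. This is only an illustration; the function names, the `skill` parameter, and the tuning constant are all made up here, not from any real game engine.

```python
import math
import random

def perfect_aim(bot_pos, target_pos):
    """The 'trivially strong' bot: returns the exact angle to the target."""
    dx = target_pos[0] - bot_pos[0]
    dy = target_pos[1] - bot_pos[1]
    return math.atan2(dy, dx)

def humanized_aim(bot_pos, target_pos, skill=0.7, rng=random):
    """Degrade perfect aim with Gaussian error so the bot 'acts' human.

    Lower skill widens the error cone; skill=1.0 collapses back to the
    unbeatable aimbot. The 0.5 radian scale is an arbitrary tuning knob.
    """
    error_std = (1.0 - skill) * 0.5
    return perfect_aim(bot_pos, target_pos) + rng.gauss(0.0, error_std)
```

In practice the hard work is in making the degradation *plausible* (reaction delays, overshoot, target fixation), not in the math itself.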
|
BabyFur Denny posted:There are still many developments in this area that are very promising and where we're just at the beginning. Deep Learning has just been open sourced, running your algorithms on GPUs is happening, cloud computing too, hey, even MapReduce as a concept is barely a decade old and already outdated. As soon as you can parallelise a problem, it's basically solved.

MR was old tech when Google rebranded DSNA as MapReduce. You may have added 10x capacity over the last 5 years, but CPU hardware is only seeing 10-20% gains a year right now. Moore's law is dead, and counting on ever-faster hardware is foolish.
|
# ¿ Dec 2, 2016 20:18 |
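For readers who haven't met the MapReduce concept being argued about: it boils down to a map phase emitting key-value pairs, a shuffle grouping them by key, and a reduce phase aggregating each group. A minimal word-count sketch (the classic example, with hypothetical function names, not Google's actual API):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Mapper: emit a (word, 1) pair for every word in one document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate pairs by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reducer: sum the counts emitted for one word.
    return key, sum(values)

docs = ["the cat sat", "the dog sat"]
intermediate = chain.from_iterable(map_phase(d) for d in docs)
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
# counts == {"the": 2, "cat": 1, "sat": 2, "dog": 1}
```

The appeal, per the quoted post, is that map and reduce calls are independent, so the framework can fan them out across machines; the rebuttal is that grouping-then-aggregating is a much older idea than the branding.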
|
Cingulate posted:Neural nets/deep learning happens on the GPU.

Everyone is looking at NVIDIA, and so far they're delivering.
|
# ¿ Dec 3, 2016 01:13 |