  • Locked thread
Xae
Jan 19, 2005

Rush Limbo posted:

Its also the case in, say, video games and other AI simulations that they handle even fairly basic things like stairs incredibly badly, to the point where various workarounds are created including them just plain ignoring such changes in elevation from a purely AI standpoint and just moving them while having animation take over.

Video game "AI" is its own can of worms and has no relation to academic or industrial AI. For one thing, the goal in video game AI isn't to win. Creating a bot that beats the poo poo out of humans is trivial. The hard part is creating one that "acts" human without kicking the human's rear end.



The thing to keep in mind about AI is that we keep shifting the goalposts on it.

By the standards of early computer scientists, Google, Wolfram Alpha, Alexa, or a half dozen other things are AI.

I expect this will continue indefinitely. We will always view "AI" as something to come, never what we have now.


Xae
Jan 19, 2005

BabyFur Denny posted:

There are still many promising developments in this area where we're just at the beginning. Deep Learning has just been open sourced, running your algorithm on GPUs is happening, cloud computing, hey, even MapReduce as a concept is barely a decade old and already outdated. As soon as you can parallelise a problem, it's basically solved.
We increased the computing resources of our cluster by 10x over the past two years, could easily do another 10x (for a total 100-fold increase in performance) by just throwing a lot of money at it, and only if we went for another 10x after that would we have to invest some effort into making it run.

And yeah, it's still inefficient and expensive, but every new technology is at the beginning. It's all just a matter of time, and probably a lot less time than a lot of people are expecting.
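For anyone unfamiliar with the pattern the quote is referring to, here's a minimal single-machine sketch of MapReduce as a word count. The function names are my own for illustration; real frameworks shard the map and reduce work across machines, which is where the "parallelise it and it's basically solved" part comes from.

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs; each document can be mapped independently,
    # so this step parallelises trivially across workers.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    # Group by key and sum; each distinct key can be reduced
    # independently, so this step parallelises too.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cat", "the dog"]
print(reduce_phase(map_phase(docs)))  # {'the': 2, 'cat': 1, 'dog': 1}
```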

MR was old tech when Google rebranded DSNA as MapReduce.

You may have added 10x capacity in 5 years, but CPU hardware is seeing 10-20% gains a year right now. Moore's law is dead, and counting on ever-faster hardware is foolish.
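To put rough numbers on that: compounding the 10-20% annual figures from the post (those rates are the post's claim, not benchmarks) shows how far single-CPU gains fall short of another 10x.

```python
def cumulative_gain(rate, years):
    # Compound a fixed per-year improvement rate over several years.
    return (1 + rate) ** years

print(round(cumulative_gain(0.10, 5), 2))  # 1.61 -> ~1.6x over 5 years at 10%/yr
print(round(cumulative_gain(0.20, 5), 2))  # 2.49 -> ~2.5x over 5 years at 20%/yr
# Even at 20%/yr it takes about 13 years to reach 10x: 1.2**13 ~= 10.7
```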

Xae
Jan 19, 2005

Cingulate posted:

Neural nets/deep learning happen on the GPU. Everyone is looking at NVIDIA, and so far they're delivering.

They are still running into the same barriers Intel is. They just started further back, so they are hitting them later.
