itstime4lunch
Nov 3, 2005


clockin' around
the clock
I've just recently begun reading up on and getting interested in this topic, but one thing that stands out is that, at a more abstract level, the fears the well-known "insiders" (Hawking, Gates, Musk, etc.) are expressing about out-of-control AI boil down to fears of an uncontrollable feedback loop that, in one way or another, catches us in its vortex (not unlike a black hole, I suppose).

Although dangerous AI is a concept that may actually be on the not-too-distant horizon, the fear of this kind of loop is not new. For example, some global warming experts have argued that "runaway" warming is not an impossible scenario: trapped methane in the Arctic gets released, causing faster warming, which releases more methane, and so on. The global arms race of the last century is another kind of example, in some ways. I wonder what this underlying fear of self-reinforcing processes that eventually spiral out of control says about us, and how rational a fear it is to hold.

The rapid technological development of the last 100+ years very much fits this paradigm, with new technology leading to a better understanding of how to build newer technology, and so forth. Are we, as a global society, so traumatized by the abrupt transformation of our world and lifestyles that we see ghosts around every corner, or are these very real threats to the continued existence of worthwhile human life on Earth?

Sometimes, exponential growth processes seem to reach a sort of equilibrium state rather than continuing to spiral into oblivion. I guess global population growth would be a good example. What features of a self-reinforcing loop are necessary to bring about this equilibrium? Are any of those present in self-improving AI, or is it too early to be able to make a guess? (The decline of population growth might actually have more to do with us finally running up against the physical limits of our environment...)
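For what it's worth, the usual textbook answer seems to be a damping term. Here's a toy sketch (the growth rate and carrying capacity are completely made up, just for illustration) comparing pure exponential growth with logistic growth, where the feedback weakens as you approach a limit:

code:

# Pure exponential growth: the loop feeds on itself with no limit.
def exponential_step(x, r=0.1):
    return x + r * x

# Logistic growth: same feedback, damped as x approaches the carrying capacity k.
def logistic_step(x, r=0.1, k=1000.0):
    return x + r * x * (1 - x / k)

x_exp = x_log = 10.0
for step in range(100):
    x_exp = exponential_step(x_exp)
    x_log = logistic_step(x_log)

print(f"exponential after 100 steps: {x_exp:,.0f}")  # keeps exploding
print(f"logistic after 100 steps:    {x_log:,.0f}")  # levels off near k

The only structural difference is the (1 - x/k) factor, so maybe the real question is whether a self-improving AI has any analogous built-in brake, or whether we would have to supply one.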

itstime4lunch fucked around with this message at 17:04 on May 13, 2015


itstime4lunch
Nov 3, 2005


clockin' around
the clock

Doctor Malaver posted:

Isn't population growth these days pretty much inversely proportional to the amount of resources a society has at its disposal?

Good point.
