|
Guavanaut posted:Isn't the singularity just when computers become good enough that they can design better computers faster than humans, and so AI and systems development accelerates and then maybe some computers are smarter than people? Pretty much. It's been pretty hilarious/depressing recently watching a bunch of my childhood influences like Hawking and Sinclair pull a reverse Dorkins by utterly misunderstanding evolution in order to claim that such an event would be the downfall of humanity. (Machines are faster, so they'd evolve faster and outcompete us for resources. Do not ask me why they would have the same desires, needs, or evolutionary criteria as us. My opinion on this is valid because I have a robot voice.)
|
# ? Jan 31, 2015 09:44 |
|
Of course you'd want to downplay the Terminator-esque future that lies in wait for us, Renaissance Robot.
|
# ? Jan 31, 2015 09:49 |
|
Larry_Mullet posted:Why? Do you think we're gonna stop making computers more powerful or something? My thinking is that either the environment or nuclear war sets us back. That, or it turns out increasing the processing speed and memory of an intelligence doesn't actually make it superintelligent; either way it'll almost certainly take a hell of a lot longer for us to do it than all these "The singularity is near! iPhones!" people think. If you're solely talking about the singularity as an event where AI outstrips human intelligence, rather than humans amplifying their own intelligence a thousandfold through technology and becoming digital gods, then yeah, I can see us being stupid enough to build a machine that makes us useless. I just think we're a long-ass way from creating a god-machine and there's a hell of a lot that can happen in that time.
|
# ? Jan 31, 2015 09:50 |
|
e/ ^^ can I take a moment to point out that we still don't even have a model or measure of intelligence that's adequate for comparing humans to other humans? Let alone humans to, say, crows, which at least are running moderately similar hardware. Right now we couldn't tell if a machine were "more intelligent" because we're still figuring out what we want that to mean. e2/ machines that make us useless? We're already useless, unless you're talking from a Darwinian perspective. Larry_Mullet posted:Why? Do you think we're gonna stop making computers more powerful or something? It certainly won't go down as advertised, because no amount of computing power is going to change universal constants such as the speed of light, the difficulty of getting off the planet, or the fact that most of us are dicks to each other. The fedoras like to imagine that it will though, somehow, and then we'll get Star Trek or something. Here, RationalWiki has some good quotes: Mitch Kapor on Ray Kurzweil posted:It's intelligent design for the IQ 140 people. This proposition that we're heading to this point at which everything is going to be just unimaginably different - it's fundamentally, in my view, driven by a religious impulse. And all of the frantic arm-waving can't obscure that fact for me, no matter what numbers he marshals in favor of it. He's very good at having a lot of curves that point up to the right. Renaissance Robot fucked around with this message at 10:04 on Jan 31, 2015 |
# ? Jan 31, 2015 09:56 |
|
Renaissance Robot posted:machines that make us useless? We're already useless, unless you're talking from a Darwinian perspective.
I just meant that our pretty much only fancy characteristic is being pretty smart. On a crazier tangent, we could have a use: we could basically act as guardians of the only viable environment for life we are aware of, and maybe terraform other planets, because, while there isn't a "point" to it in the fullness of time, life is pretty neat I think. Hell, you could even look after your fellow man and other sentients too. We won't ever do this, in fact we're more likely to go Permian on the place, but it's a possible practical use for humanity. e: grammar Communist Thoughts fucked around with this message at 10:31 on Jan 31, 2015 |
# ? Jan 31, 2015 10:27 |
|
Anyone who has an interest in AI might enjoy this article: http://rationalwiki.org/wiki/Roko%27s_basilisk I *think* it may have come up in the thread before (not sure how I came across it), but the LessWrong community is ripe for a Weekend Web, especially with poo poo like that.
|
# ? Jan 31, 2015 10:31 |
|
I remember there being a LessWrong mock thread on the forums somewhere, not sure if it's still around. edit: oh yeah here it is
|
# ? Jan 31, 2015 11:02 |
|
So... PIP appeals process. How does that all work out, then? Despite being iller than she's ever been, my wife didn't qualify after being on DLA with no problem for years. Pretty high chance of winning on appeal, isn't there? Edit: the scores they've given her bear no relation to the things she actually said in the interview, it's like things have been recorded incorrectly. thehustler fucked around with this message at 11:41 on Jan 31, 2015 |
# ? Jan 31, 2015 11:32 |
|
Larry_Mullet posted:Why? Do you think we're gonna stop making computers more powerful or something? The Singularity is religion for nerds. Processing power isn't some magic wand that will conjure the singularity into existence. It would require huge, huge technological breakthroughs in AI and other fields. Considering we don't really have a working model of human intelligence, or even truly understand how our own brain works, it's a huge leap to assume we can build anything on that scale.
|
# ? Jan 31, 2015 11:39 |
|
Yeah, I don't see why processing power has anything to do with it really. AI software would run on a supercomputer today; it would just be relatively slow to do the things that an AI would do. But where is this AI software? Nobody's developed any because nobody has a loving clue how to, and more processing power isn't going to change that. You can't just load up Solitaire on an infinity-GHz computer and expect it to become self-aware.
|
# ? Jan 31, 2015 12:01 |
|
Look up some papers from this woman: http://www.sussex.ac.uk/profiles/276 She, along with most of the pioneers in AI, thinks true AI will happen, but probably not in our lifetimes. Boring, I know, but it seems to be the consensus.
|
# ? Jan 31, 2015 12:24 |
|
All Coppers etc. "'Tasers for all front-line officers' - Police Federation" posted:All front-line police in England and Wales should be offered Tasers in light of the increased terrorism threat, the head of the Police Federation says. Nonce Politician cover-up (my face is twisted in utter disgust while reading this) "Diplomat's 'sexual perversion' provoked security fears, says Thatcher adviser" posted:A senior British diplomat who recorded sexual fantasies involving children was seen primarily as a security risk, a former top civil servant has revealed. Also, any ideas for the Feb thread title?
|
# ? Jan 31, 2015 13:05 |
|
The actual definition of the singularity is a point in human history where technology advances faster than existing society can come to terms with the changes it brings. It has nothing to do with AI, although AI would probably be necessary before it could occur. Pesky Splinter posted:Also, any ideas for the Feb thread title? UKMT Valentine's Day Edition - Labour are red, Tories are blue Jedit fucked around with this message at 13:09 on Jan 31, 2015 |
# ? Jan 31, 2015 13:07 |
|
Pesky Splinter posted:Also, any ideas for the Feb thread title?
|
# ? Jan 31, 2015 13:08 |
|
Pesky Splinter posted:Nonce Politician cover-up (my face is twisted in utter disgust while reading this) It was great because I was browsing the BBC and I read the BoJo "Terrorists are men who like porn" article and then I clicked that immediately afterwards.
|
# ? Jan 31, 2015 13:13 |
|
I'm waiting for the next big shocker: Terrorists are men who breathe oxygen and extract energy from digestion.
|
# ? Jan 31, 2015 13:18 |
|
Ddraig posted:I'm waiting for the next big shocker: Terrorists are men who breathe oxygen and extract energy from digestion. And that's why we're taking food out of the mouths of the poor. Tough on digestion, tough on the causes of digestion.
|
# ? Jan 31, 2015 13:36 |
|
OwlFancier posted:It's called the singularity because a singularity is a point after which existing predictive models are useless. We've done it before though, with the invention of language, agriculture, and industry. The after was equally inconceivable to people living in the before. Just because this time it's new doesn't mean we shouldn't do it; the other step changes were all new at the time too.
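As an aside on the "predictive models" framing - a toy illustration on my part, not anything from either poster: the textbook example of a model with a built-in singularity is hyperbolic growth, where the growth rate scales with the square of the current level:

\[
\frac{dx}{dt} = k x^2, \qquad x(0) = x_0
\quad\Longrightarrow\quad
x(t) = \frac{x_0}{1 - k x_0 t}
\]

The solution diverges as t approaches 1/(k x_0), and past that point the model simply produces no predictions at all - which is the sense in which futurists borrow the word.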
|
# ? Jan 31, 2015 13:57 |
|
Guavanaut posted:We've done it before though, with the invention of language, agriculture, and industry. The after was equally inconceivable to people living in the before. Just because this time it's new doesn't mean we shouldn't do it; the other step changes were all new at the time too. Personally I'd say it's more something that's inevitable than something to really be enthusiastic about, given that you could achieve much the same effect by blowing everything up and starting over. Maybe it'll be better, maybe it'll be worse, maybe it'll be more of the same, but being super happy about the coin flip that decides the fate of humanity is a bit weird. It's essentially a futurist form of accelerationism, and not really a helpful ideology, I think.
|
# ? Jan 31, 2015 14:10 |
|
There's also this view on the singularity from Pictures for Sad Children:
|
# ? Jan 31, 2015 14:52 |
|
Whig history vs. Tory history, I guess. Some people believe that there was a golden age in the past and we've been going downhill ever since, and each new innovation just represents new and creative ways to go downhill. Others believe we're on a constant and inevitable path of improvement. They're both wrong, but personally I'd say the evidence is far more in favor of improvement. When you start giving the event the same religious significance as the LessWrong guy does, that's weird, yeah. But it's okay to be excited about new things. I'm not sure about the 'rich and white' thing though; one of the common things about past singularities and technological leaps is that they have caused a shakeup in the leadership structure. (Of course, most of those didn't end up exactly as planned either.)
|
# ? Jan 31, 2015 15:11 |
|
Sorry to cut the genuinely interesting discussion on the technological singularity short, everyone; the new thread can be found here: February Thread
|
# ? Jan 31, 2015 15:24 |