|
SurreptitiousMuffin posted: What confuses me about the whole future robot hypothetical is: why torture?

Axeman Jim posted: [B]ut surely if these simulations were sentient, they would be capable of genuinely experiencing suffering if the AI tortured them? That would mean that they would count towards the "minimize suffering" aim that this AI apparently has, for whatever reason. So by creating billions of them and then torturing them, the AI is massively increasing the amount of suffering in the universe.

Whoever said that this is just a re-skinned religion, you got it right; this is really similar to the "Problem of Hell", only a lot more confusing and with more bullshit about future technology. (Not to derail, but thinking about the "Problem of Hell" actually led to my current belief system of universalism/universal reconciliation, and the Less Wrong stuff really reminds me of that.)

Swan Oat posted: Well it's a sufficiently powerful AI, you see. Read the sequences.

It is a calculus joke, so Yudkowsky wouldn't get it.
|
# ¿ Apr 21, 2014 23:23 |
|
Lottery of Babylon posted:
fade5 posted: Make sure you don't confuse sequences with series, though. To explain the joke a bit, and to make sure I actually learned this stuff correctly in Calculus:

A sequence is a list of numbers, and the order in which the numbers are listed is important:

1, 2, 3, 4, 5, ... (this is an infinite arithmetic sequence)
4, 40, 400, 4000, 40,000, ... (this is an infinite geometric sequence)

Sequences are usually based on a mathematical formula. A series is a sum of numbers, usually the terms of a given sequence, so using the previous examples,

1 + 2 + 3 + 4 + 5 + ...
4 + 40 + 400 + 4000 + 40,000 + ...

are examples of series. Or, written in summation notation (this was harder to type than I thought it would be):

∑_{k=1}^{∞} k = 1 + 2 + 3 + 4 + 5 + ...
∑_{k=1}^{∞} 4(10)^(k-1) = 4 + 40 + 400 + 4000 + 40,000 + ...

So, in summation (more Calculus jokes), you don't know poo poo, Yudkowsky.

fade5 fucked around with this message at 21:13 on May 6, 2014
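The sequence/series distinction above can be sketched in a few lines of Python (a hypothetical illustration, not from the thread; the function names are made up for the example):

```python
# A sequence is an ordered list of terms; a series is the sum of those terms.

def arithmetic_sequence(n):
    """First n terms of the arithmetic sequence 1, 2, 3, ..."""
    return [k for k in range(1, n + 1)]

def geometric_sequence(n):
    """First n terms of the geometric sequence 4, 40, 400, ... i.e. 4 * 10^(k-1)."""
    return [4 * 10 ** (k - 1) for k in range(1, n + 1)]

def series(sequence):
    """The (partial) series: the sum of the sequence's terms."""
    return sum(sequence)

print(arithmetic_sequence(5))          # [1, 2, 3, 4, 5]
print(series(arithmetic_sequence(5)))  # 15
print(geometric_sequence(5))           # [4, 40, 400, 4000, 40000]
print(series(geometric_sequence(5)))   # 44444
```

The infinite versions of both series diverge, which is part of the joke: reading "the sequences" doesn't make the sum come out to anything sensible.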
# ¿ May 6, 2014 21:10 |
|
SubG posted: This all seems more plausible and frightening if you know essentially nothing about either humans or technology. The superhuman AI will be fantastically terrifying until it spontaneously bluescreens. Or its plan for world domination is interrupted by a couple Comcast routers making GBS threads the bed. Or it decides to dedicate all of its human-like intelligence to surfing for porn and playing DOTA all day. Or whatever the gently caress.

If you like your destruction natural, just let the weather take care of it: humidity, condensation, a thunderstorm, a flood, a hurricane/tsunami, a tornado, a snowstorm/ice storm (below -40, most technology stops functioning), hail, a dust storm, or any other weather condition that makes technology poo poo itself (you know, most weather conditions). If you like your destruction more active, you can use a flamethrower, a giant loving industrial magnet (think the things they pick up cars with), a fire hose attached to a water main, C4, a machine gun, liquid nitrogen, or even just a bunch of dudes with sledgehammers, among many, many other creative methods of destruction. AI fears notwithstanding, technology is still extremely fragile, and humans are really, really good at breaking poo poo.

fade5 fucked around with this message at 05:23 on Oct 28, 2014
# ¿ Oct 28, 2014 05:15 |
|
BobHoward posted: As a thread-relevant aside: Kurzweil's other immortality-related obsession is with bringing his long-dead dad back to life. Much like Yudkowsky, he believes this kind of resurrection will be possible via sufficiently advanced AI running some kind of dad-simulation based on his dad's notebooks, letters, and whatever other ephemera Kurzweil has sitting in a vault.

All that'll end up happening is you'll lose some body parts while creating some horrifying, non-human thing. You won't get what you want, and you'll end up worse off than when you started. (And you'll end up marked as a human sacrifice in a huge government plot.) Don't do it bro.

BobHoward posted: Yup. Even more sad: note the disconnect between this and "I must live long enough for the Upload". It implies that he doesn't really believe dad-AI is a meaningful way to live after death, but he can't let himself fully acknowledge it, because that would require accepting that his dad is dead forever.

fade5 fucked around with this message at 00:06 on Dec 8, 2014
# ¿ Dec 7, 2014 23:58 |