|
http://lesswrong.com/lw/1pz/the_ai_in_a_box_boxes_you/ Wow, that is one hell of a threat
|
# ¿ Apr 20, 2014 00:48 |
|
Djeser posted:Wasn't Voldemort's fatal flaw that he didn't understand the bond of love that protected Harry?

I especially like how he theorizes that a friendly AI will do this. Because any AI, especially a friendly one, will expend energy, in the future, to create a million perfect simulations of past-you to torture, because you are a Bad Person.
|
# ¿ Apr 20, 2014 17:28 |
|
Grondoth posted:Holy poo poo I remember this story. I think it was linked in this forum, actually. For those unfamiliar, it's about humans meeting an alien race that does horrible things, and trying to figure out what we should do. I think they ate their kids or something? Then another alien race shows up and thinks the same thing about us, that we do horrible things and they need to change that. Interesting turnaround, right? There's gonna be lots of different species out there, and who knows how they handle ethical questions and what they see as good and right.

I remember this story too. It started out with an interesting ethical premise, then blossomed into a wonderful flower of brilliance with lines such as, "'gently caress that up the rear end with a hedge trimmer,' said the Lord Pilot. 'Are we going to save the human species or not?'"
|
# ¿ Apr 23, 2014 01:42 |
|
During school breaks I had a 26-hour sleep schedule, because my body hates me. What eventually happens is that you stall out at around 5am and crash, or you force yourself through and fall asleep in the middle of the day to reset back to a sort-of-normal schedule. Basically, you're really tired a lot. Maybe the singularity fixes this clear flaw of human behavior.
|
# ¿ Apr 24, 2014 03:26 |
|
Have any of the Less Wrong people murdered someone because there was a very high chance that he was a simulation?
|
# ¿ Jul 27, 2014 18:50 |