Should I step down as head of twitter? This poll is closed.

| Option | Votes | % |
|---|---|---|
| Yes | 420 | 4.43% |
| No | 69 | 0.73% |
| Goku | 9001 | 94.85% |
| Total: | 9490 votes | |
|
I've been the one saying "It's stupider than you think" so it'd be on brand for me to be wrong about this, but wasn't the point of Roko's Basilisk not that you could be tortured, but that there is no ontological difference between you and a perfect simulation of you, so you should care about a future simulation of yourself being tortured as much as you would care about your current self in the year 2023 being tortured?
|
# ? Jan 17, 2023 23:10 |
|
The more I think about it the less sense it makes. Like, wouldn't the AI decide to not go through with the simulation 100% of the time because it already exists, so it knows follow through is unnecessary? The threat of simulation in the past did literally as much as it could possibly do, and no amount of torture in the future would change the outcome.
|
# ? Jan 17, 2023 23:13 |
|
morningdrew posted:Christ, they had a lot of kegerators Some of it might have been beer, but a lot of places want to do either kegged kombucha or kegged nitro cold brew.
|
# ? Jan 17, 2023 23:15 |
|
If anybody has questions, go here -- https://rationalwiki.org/wiki/Roko%27s_basilisk The entire premise is ridiculous, but there were still some people who took it seriously.
|
# ? Jan 17, 2023 23:16 |
|
mysterious frankie posted:The more I think about it the less sense it makes. Like, wouldn't the AI decide to not go through with the simulation 100% of the time because it already exists, so it knows follow through is unnecessary? The threat of simulation in the past did literally as much as it could possibly do, and no amount of torture in the future would change the outcome. goatface posted:It only makes sense if you're a paranoid over-educated idiot who has invested themselves in the ultimate catastrophe lifestyle.
|
# ? Jan 17, 2023 23:16 |
|
The point is that if it simulates a million of you, you're scared of the probability that you're actually one of the million simulations and not the one real person.
|
# ? Jan 17, 2023 23:17 |
|
Strategic Tea posted:The point is that if it simulates a million of you, you're scared of the probability that you're actually one of the million simulations and not the one real person. Yeah but you already know you’re not, for certain. It would just be that future clone’s misfortune, not your own.
|
# ? Jan 17, 2023 23:18 |
|
Look, Roko's Basilisk won't hurt you as long as you teach it to draw hands properly.
|
# ? Jan 17, 2023 23:18 |
|
mysterious frankie posted:The more I think about it the less sense it makes. Like, wouldn't the AI decide to not go through with the simulation 100% of the time because it already exists, so it knows follow through is unnecessary? The threat of simulation in the past did literally as much as it could possibly do, and no amount of torture in the future would change the outcome. The Basilisk is a wound from which the concept of knowledge must recover. To steal a line from a review of a terrible RPG, there are entire medical textbooks now blank to fill the void left by the anti-knowledge of Roko's Basilisk. He's one of the few people who I think we can safely say actually subtracted from the sum of human knowledge.
|
# ? Jan 17, 2023 23:19 |
|
AIs need humans far more than humans need AIs.
|
# ? Jan 17, 2023 23:19 |
|
https://www.nsnews.com/local-news/tesla-suddenly-accelerates-into-bc-ferries-ramp-breaks-in-two-6385255 The kicker here is that there wasn't even a ferry docked at the berth. No one was killed, and the two passengers were taken to the hospital in stable condition. Looks to just be another "mysterious Tesla mechanical issue." lol and lmao
|
# ? Jan 17, 2023 23:20 |
|
Pastel Candy Snake posted:https://www.nsnews.com/local-news/tesla-suddenly-accelerates-into-bc-ferries-ramp-breaks-in-two-6385255 It's not a tesla crash, just a tesla-involved crash.
|
# ? Jan 17, 2023 23:22 |
|
RocketMermaid posted:Look, Roko's Basilisk won't hurt you as long as you teach it to draw hands properly. Take a good look at your hands. A real good look. If they seem weird then you are in a simulation.
|
# ? Jan 17, 2023 23:22 |
|
mysterious frankie posted:The more I think about it the less sense it makes. Like, wouldn't the AI decide to not go through with the simulation 100% of the time because it already exists, so it knows follow through is unnecessary? The threat of simulation in the past did literally as much as it could possibly do, and no amount of torture in the future would change the outcome. Because, by you using your Rationality to predict the existence and behaviour of the God AI far in the future, and by it using its powers to reverse entropy and perfectly recreate a simulation of you, you can communicate and influence each other ~~ACROSS TIME ITSELF~~. Or, to put it another way, if the AI wouldn't need to follow through on the torture, then you'd predict that it wouldn't, and it'd predict that you'd predict that it wouldn't, so therefore it has to follow through anyways. Which you of course predicted, and realized that the God AI would follow through regardless, and the God AI predicted that you'd realize that, so now it doesn't need to follow through. Which of course you also predicted...
|
# ? Jan 17, 2023 23:24 |
|
The biggest mistake with Tesla's FSD is that they're training the AI with data provided by people who drive Teslas.
|
# ? Jan 17, 2023 23:26 |
|
Roko's basilisk is just that dril tweet, the wise man bowed his head and spoke: "there's actually no difference between good and bad things you idiot, you imbecile," taken to extremes.
|
# ? Jan 17, 2023 23:26 |
|
It all works fine as long as you're happy an infinite unknowable god would have the exact same reasoning and goals as a grifter who eats flavored nutrient paste instead of food because it leaves more time 4 coding.
|
# ? Jan 17, 2023 23:31 |
|
Maybe it'll be an AI that hates people who work towards the suffering of others and punishes the Basilisk people. Inverse Roko's Basilisk.
|
# ? Jan 17, 2023 23:34 |
|
Guys, this basilisk poo poo is literally the stupidest thing I have ever heard of or read about. Please stop talking about it. If anyone ever brings this poo poo up to me in person I am going to punch them in their loving teeth.
|
# ? Jan 17, 2023 23:36 |
|
While we're here, watch this poo poo -- https://www.youtube.com/watch?v=UCgoxQCf5Jg Down the Rabbit Hole: TempleOS. It's tangentially related to all this other crazy stuff.
|
# ? Jan 17, 2023 23:38 |
|
Telarra posted:Because, by you using your Rationality to predict the existence and behaviour of the God AI far in the future, and by it using its powers to reverse entropy and perfectly recreate a simulation of you, you can communicate and influence each other ~~ACROSS TIME ITSELF~~. lol, dear lord.
|
# ? Jan 17, 2023 23:51 |
|
ishikabibble posted:front trunk Are you telling me elephants actually have frunks?
|
# ? Jan 17, 2023 23:53 |
|
Yeah if you tell Twitter you don't like Nazis you're going to see more of them. Good stuff. Very smart and intelligence.
|
# ? Jan 17, 2023 23:54 |
|
Roku Presents: Basilisk
|
# ? Jan 18, 2023 00:01 |
|
Nigmaetcetera posted:Yeah but you already know you’re not, for certain. It would just be that future clone’s misfortune, not your own. No you don't, because it would be a perfect simulation of your entire life up to the point where you had a definite choice between feeding your family and feeding Yud's addiction to anime figurines
|
# ? Jan 18, 2023 00:06 |
|
is it cool with you guys if I become Roko's Basilisk
|
# ? Jan 18, 2023 00:08 |
|
Dabir posted:No you don't, because it would be a perfect simulation of your entire life up to the point where you had a definite choice between feeding your family and feeding Yud's addiction to anime figurines I know that This One would not be punished. The fact that there will be one in the far, far future who is not This One but believes themself to be This One, and that one would be tortured, well it just don’t effect This One.
|
# ? Jan 18, 2023 00:10 |
|
arsenicCatnip posted:is it cool with you guys if I become Roko's Basilisk I trust ya.
|
# ? Jan 18, 2023 00:10 |
|
arsenicCatnip posted:is it cool with you guys if I become Roko's Basilisk I am not going to pay you money so you can not torture a future clone of me!
|
# ? Jan 18, 2023 00:12 |
|
Roko's basilisk is the most annoying enemy in Soulsborne games
|
# ? Jan 18, 2023 00:16 |
|
Nigmaetcetera posted:I know that This One would not be punished. The fact that there will be one in the far, far future who is not This One but believes themself to be This One, and that one would be tortured, well it just don’t effect This One. But it's a perfect simulation, you couldn't know that you're not the original you until it happens.
|
# ? Jan 18, 2023 00:17 |
|
Dabir posted:But it's a perfect simulation, you couldn't know that you're not the original you until it happens. Like, this is the one part of it that isn't completely stupid. The rest of it is gibberish, but the idea that you can't truly know anything beyond the fact that you personally exist in some manner is the kind of thing that's been keeping philosophy students up at night for centuries. I ain’t worried.
|
# ? Jan 18, 2023 00:19 |
|
alf_pogs posted:Roko's basilisk is the most annoying enemy in Soulsborne games Thx for spoiling Bloodborne II
|
# ? Jan 18, 2023 00:21 |
|
Speaking of assuming things must be less stupid than they are, it dawns on me that I always assumed FSD was never actually a defined Tesla feature. I'd always assumed that Tesla legally defined it as a driver assistance/cruise control feature and had a EULA somewhere which says that you must still remain alert and keep your hands on the wheel and feet on the pedals at all times, like any other car, while just pointedly ignoring Elon misleadingly hyping the feature because he's too rich to be charged with false advertising. I think I'm still mentally rejecting the idea that the company is actually beta testing, on public roads, something they contractually define as an autopilot feature, because that cannot possibly be legal.
|
# ? Jan 18, 2023 00:22 |
|
Nigmaetcetera posted:I ain’t worried. And that's the final flaw with the whole thing. Even if all its insane ideas about how reality works are true, you can get out of it by going "no it won't".
|
# ? Jan 18, 2023 00:23 |
|
Dabir posted:And that's the final flaw with the whole thing. Even if all its insane ideas about how reality works are true, you can get out of it by going "no it won't". Yeah but no it won’t.
|
# ? Jan 18, 2023 00:23 |
|
Exactly.
|
# ? Jan 18, 2023 00:24 |
|
You invent AI. You wash your hands. You invent AI. You wash your hands.
|
# ? Jan 18, 2023 00:26 |
|
The tl;dr on why Roko's Basilisk works on some people is that scifi nerds are incredibly gullible so long as you present whatever insane scenario in terms of scifi goodies like artificial intelligence and simulations. They've already emotionally bought in on the idea of those things becoming real some day, so they will take them at face value in any hypothetical. See also: all the nerds convinced this new generation of chatbots are sentient.
|
# ? Jan 18, 2023 00:27 |
|
It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future. It's the way of the future.
|
# ? Jan 18, 2023 00:29 |