|
As an idiot, it only just occurred to me that a less dried-up version of that alien corpse we saw would look quite a bit like the standard stubby machine.
|
# ? May 18, 2017 11:09 |
|
Josuke Higashikata posted:He whisked off her shoes and panties in one movement, wild like an enraged shark. His bulky totem beating a seductive rhythm. Mary's body felt like it was burning, even though the room was properly air-conditioned. They tried all the positions - on top, doggy, and normal. I honestly can't tell if this is a joke or an actual excerpt.
|
# ? May 18, 2017 11:15 |
|
MiddleOne posted:I honestly can't tell if this is a joke or an actual excerpt. https://www.youtube.com/watch?v=PcFWSExNJVk
|
# ? May 18, 2017 11:26 |
|
Now that I'm remembering, there's a Strong Bad E-mail episode where one of the Easter eggs was about Tire Madness and one of the words was Revenganceful. That's what that word reminds me of. http://www.homestarrunner.com/sbemail85.html For those curious. Some of the bad endings now would be total Homestar moves. "Ooh, mackewel!"
|
# ? May 18, 2017 11:45 |
|
MiddleOne posted:I honestly can't tell if this is a joke or an actual excerpt. You should watch Garth Marenghi's Darkplace. Or at least most people should.
|
# ? May 18, 2017 13:47 |
|
necroid posted:I'm re-reading those novels right now and I'm still amazed at how well they hold up today, considering they were written in the mid-late FIFTIES. Asimov's robots can be both superhuman and childlike in their interactions which is what makes them so believable as not-yet-human machines. Caves of Steel is even more amazing when you realize that Asimov had no idea what he was doing. He had no plan for the story, he was just banging out chapters and sending them off to the magazine so he could get paid enough to eat. He basically pulled every plot point out of his rear end, including the ending. Screaming Idiot posted:Reading Asimov's sex scenes is enough to make one wonder if the man was, in fact, a robot trying to simulate humanity. Thankfully he never learned of tight leather pants and tribal tats. We don't talk about Foundation and Earth.
|
# ? May 18, 2017 15:02 |
|
The Big Three of the Golden Era of science fiction had all the bases covered: bad sex scenes from Asimov, all the sex forever and ever from Heinlein, and sex left out entirely by Clarke.
|
# ? May 18, 2017 17:48 |
|
Stories about robots going bonkers because they interpreted their programming weirdly are a favourite of mine. This discussion made me remember The Cyberiad and now I want to go dig it out and have a read.
|
# ? May 18, 2017 18:14 |
|
Sage Grimm posted:The Big Three of the Golden Era of science fiction had all the bases covered: bad sex scenes from Asimov, all the sex forever and ever from Heinlein, and sex left out entirely by Clarke. Honestly, Clarke's approach seems the best of the bunch there, if only because there's lots of ground for SF to cover (and more so back in the Golden Era) that has basically nothing to do with sex. McDragon posted:Stories about robots going bonkers because they interpreted their programming weirdly are a favourite of mine. This discussion made me remember The Cyberiad and now I want to go dig it out and have a read. Ever read Asimov's 'Runaround'? It's just a short story, but it's one of the ones on that theme I love, because its explanation made a kind of intuitive sense to how eight-year-old me thought it should work. More to the point, it's a story that predicated its explanation on robots being robots, and not chrome-plated humans.
|
# ? May 18, 2017 18:30 |
|
The most disquieting robot series I've ever read is the Not Quite Human books by Seth McEvoy. They're about a robot named Chip whose scientist creator is sending him to school to see if he can learn how to be human. Pretty cliché, yes. But the scary thing is Chip DOES NOT HAVE EMOTIONS. And this is scary in a subtle way, because it never becomes horror. I didn't even realize it until rereading the books. He follows his programming to try and emulate humans: he measures responses and behaves in ways they seem to approve of. He sometimes makes mistakes, but it's always stated that he only thinks about whether his programming, "pretend to be a human", is succeeding or not. And we see his scientist creator and said creator's daughter treat him like family, and people at school do too. But we, the audience, see his thinking, and he never considers them anything except "that's what my programming says they should be called". So by the end of the series, he's good at acting like a human, but inside he's still the same, still ONLY interested in following his programming. To this day, I'm not sure if this was intentional or not. Disney made a few movies about this, but they go the usual Pinocchio route and he really does start thinking like a human. Of course, that has nothing to do with Nier Automata, because both machines and androids have emotions.
|
# ? May 18, 2017 20:33 |
|
Andyzero posted:Of course, that has nothing to do with Nier Automata, because both machines and androids have emotions. Or are they just good at emulating them?
|
# ? May 18, 2017 21:11 |
|
There is some good stuff on the subject on Wikipedia's page for the Philosophy of Artificial Intelligence. Neither machines nor androids in the Nierverse necessarily have emotions. They absolutely could just be following their programming, just like everyone else in the world except for you. I've said too much.
|
# ? May 18, 2017 21:16 |
|
Machines don't have feelings.
|
# ? May 18, 2017 21:17 |
|
Tatumje posted:There is some good stuff on the subject on Wikipedia's page for the Philosophy of Artificial Intelligence. Just like everyone in the world including you, you're just a side effect of the machine working underneath you, a passenger on a wild ride who feels they can change what the machine is spontaneously doing and rationalize away its behaviour.
|
# ? May 18, 2017 21:18 |
|
Fredrik1 posted:Or are they just good at emulating them? And now for the fun philosophy: what differentiates having emotions from emulating them? Why do you believe that you have emotions and aren't just emulating them? How are human emotions not "just" programming? Complicated buggy programming, but programming nonetheless. SIGSEGV posted:Just like everyone in the world including you, you're just a side effect of the machine working underneath you, a passenger on a wild ride who feels they can change what the machine is spontaneously doing and rationalize away its behaviour. We aren't a side effect, we are the machine. Not a passenger, but the vehicle.
|
# ? May 18, 2017 21:20 |
|
I've drastically simplified this whole philosophical debate by just going full-on solipsism. I don't care what any of you meat robots tell me.
|
# ? May 18, 2017 21:22 |
|
I guess navel-gazing and reminiscing about classic sci-fi is slightly better than a five page derail about wisdom teeth that we got in the last TDI lp thread.
|
# ? May 18, 2017 21:27 |
|
Don't be silly, androids don't have navels!
|
# ? May 18, 2017 21:39 |
|
InequalityGodzilla posted:I guess navel-gazing and reminiscing about classic sci-fi is slightly better than a five page derail about wisdom teeth that we got in the last TDI lp thread. I had some pretty loving big wisdom teeth, one was the size of my thumb phalanx final section thing? I'm not too good at this English thing. Anyways it was loving huge. Qrr posted:We aren't a side effect, we are the machine. Not a passenger, but the vehicle. Well, yes, of course. You're a part of the decision making process but is your consciousness really the part that makes the decisions? It could just be an impression.
|
# ? May 18, 2017 21:43 |
|
SIGSEGV posted:I had some pretty loving big wisdom teeth, one was the size of my thumb phalanx final section thing? I'm not too good at this English thing. Anyways it was loving huge. Still got nothing on the teeth derail in the Drakengard 3 thread. It's probably best you don't look for it. I can have moments of... eccentricity and sometimes be quite curious about things. Please forgive me if I do something foolish or rude.
|
# ? May 18, 2017 21:51 |
|
Determinism trumps free will. I can predict with 100% accuracy that no one wants to hear me talk about determinism. Take that, free will advocates!
|
# ? May 18, 2017 21:52 |
|
Dzhay posted:As an idiot, it only just occurred to me that a less dried-up version of that alien corpse we saw would look quite a bit like the standard stubby machine. This never occurred to me until I read your comment and took a second look at the alien corpses, so you're not the only idiot.
|
# ? May 18, 2017 21:55 |
|
Tatumje posted:Determinism trumps free will. I want to hear you talk about determinism!
|
# ? May 18, 2017 21:55 |
|
Tatumje posted:Determinism trumps free will. I want to, simply because I find the idea of free will to be quite inapplicable anyways. People make the decisions they believe to be the best at any point, including satisfying their spite, hate, love, desire to spend more time thinking on it and so on, and so they assign a value to those things. Therefore any decision they make was "the best decision" at the moment it was made. There's no free will if the only freedom is to make the best decision as it appears. Basically the entire concept seems kinda bunk to me.
|
# ? May 18, 2017 22:00 |
|
Andyzero posted:The most disquieting robot series I've ever read is the Not Quite Human books by Seth McEvoy. They're about a robot named Chip whose scientist creator is sending him to school to see if he can learn how to be human. I remember the movies they made on that.
|
# ? May 18, 2017 22:01 |
|
SIGSEGV posted:You're a part of the decision making process but is your consciousness really the part that makes the decisions? It could just be an impression. Depends on the decision, but generally I think "I am going to do this thing" and then I do it, so yes. Unless my impression can predict the future.
|
# ? May 18, 2017 22:23 |
|
Qrr posted:Depends on the decision, but generally I think "I am going to do this thing" and then I do it, so yes. Unless my impression can predict the future. Yeah, I can't really float it for writing a thesis for example. Still need to refine it.
|
# ? May 18, 2017 22:35 |
|
SIGSEGV posted:Well, yes, of course. You're a part of the decision making process but is your consciousness really the part that makes the decisions? It could just be an impression. Consciousness is a culling mechanism for improving the operation of less conscious modes of thought. People can be ruled by their instincts, but they can also override them if they think that might produce a better result.
|
# ? May 18, 2017 22:37 |
|
SIGSEGV posted:People make the decisions they believe to be the best at any point
|
# ? May 18, 2017 23:05 |
|
Well, they do. It's just that people are consistently pretty dumb, so what seems like the best idea at the time rarely is. Not like we have loads of people going around saying they're going to see what happens when they shove a lit firecracker up a moose's rear end, at least not outside of Jackass. Edit: The show Jackass, not the android. Edit2: Actually, probably the android too.
|
# ? May 18, 2017 23:13 |
|
You can't prove where thoughts come from. All you know for sure is that there are thoughts and you can observe them. Everything beyond that is based on supposition.
|
# ? May 18, 2017 23:13 |
|
Begemot posted:You can't prove where thoughts come from. All you know for sure it's that there are thoughts and you can observe them. The ball is round, a game lasts 90 minutes, everything else is pure theory. Off we go!
|
# ? May 18, 2017 23:16 |
|
It makes a lot more sense if you include things like spite, fun, or trying to fit in with a group in "finding the best decision". When you're completely drunk and trying to fit in, trying to see if you can piss 2 meters straight up in the air can make sense. Even if you know it's stupid because you're not quite dead drunk, you weigh the social pressure, the insults, the not fitting in, and so on, and can arrive at the conclusion that your next leak ending up on LiveLeak is probably a good decision. How do zippers work again?
|
# ? May 18, 2017 23:18 |
|
Begemot posted:You can't prove where thoughts come from. All you know for sure it's that there are thoughts and you can observe them. We can derive where they come from because there aren't very many sources in the brain. Or rather, there are quite a lot of sources and they're all brain cells.
|
# ? May 18, 2017 23:43 |
|
Saying it like that kinda downplays the importance of the various support systems around there, especially the hormonal system. Neurons are vital, but other parts also play their roles.
|
# ? May 18, 2017 23:47 |
|
Qrr posted:We can derive where they come from because there aren't very many sources in the brain. Or rather, there are quite a lot of sources and they're all brain cells. Woah woah woah there buddy, you're already working on all sorts of unprovable suppositions. Talkin' about thoughts coming from brains as if sensation was reliable proof of anything.
|
# ? May 18, 2017 23:48 |
|
Begemot posted:Woah woah woah there buddy, you're already working on all sorts of unprovable suppositions. Talkin' about thoughts coming from brains as if sensation was reliable proof of anything. Yes, yes, we all know how this argument reduces to absurdity. Eventually we end up squabbling over whether anything has ever existed, or if we're all just brains in jars hooked up to computers and a chemical drip. And everyone goes home feeling smug about dropping trou and jerking off all over their own chests. That particular branch of philosophy has always struck me as being particularly braindead. It doesn't say anything. It doesn't explain anything. It does nothing. It is pointlessness on top of pointlessness.
|
# ? May 18, 2017 23:57 |
|
SIGSEGV posted:Saying it like that kinda downplays the importance of the various support systems around there, especially the hormonal system. Neurons are vital, but other parts also play their roles. Then we have a discussion about what "come from" means. If we're including every cause, then we're blaming thoughts on everything that has ever happened. Hormones and other inputs to the brain (like sight and hearing and pain and so on) are certainly influential, but I wouldn't say that they're where thoughts come from.
|
# ? May 19, 2017 00:06 |
|
Consider determinism in terms of overcoming addiction: There are two main steps to beating any addiction. An individual must first realize that a behavior is worth stopping, and then the individual must cut out any opportunities to reengage with the old behavior. In order to meet the first step, the person must analyze the alternatives to their current path. This takes energy. If the individual doesn't find a suitable new path, they will not begin the rehabilitation process. If I'm addicted to water, there is no suitable alternative. I'm locked in. The second step (changing habits) also requires energy. If the individual doesn't have enough energy to change habits on their own and no one else around them is willing to help, they will relapse. If I want to lose weight, I need to address my relationship with food. I also need to change my relationships with friends, family, work acquaintances, strangers, and with myself. I must kill old habits in order to foster new ones. Energy is willpower, but will isn't free. There are energy requirements for any decision you will make. Willpower runs out eventually. In an ideal world, you will be far enough away from your vices that the effort to return to them will cost too much. In other words, 6O is the best girl.
|
# ? May 19, 2017 00:09 |
|
apocalypticCritic posted:It is pointlessness on top of pointlessness. The long form Turing test.
|
# ? May 19, 2017 00:16 |