(Thread IKs: Nuns with Guns)
 
Bar Crow
Oct 10, 2012
Their primary skill is consuming media. Not math.


Arc Hammer
Mar 4, 2013

Got any deathsticks?
Yudkowsky is the "single grain of sand in your eye is worse than 2 million years of robots torturing babies" guy, right?

muscles like this!
Jan 17, 2005


Arcsquad12 posted:

Yudkowsky is the "single grain of sand in your eye is worse than 2 million years of robots torturing babies" guy, right?

Yes

Doctor Spaceman
Jul 6, 2010

"Everyone's entitled to their point of view, but that's seriously a weird one."

Arcsquad12 posted:

Yudkowsky is the "single grain of sand in your eye is worse than 2 million years of robots torturing babies" guy, right?

Kinda

A very smart person posted:

I asked what was the least bad, bad thing that could happen, and suggested that it was getting a dust speck in your eye that irritated you for a fraction of a second, barely long enough to notice, before it got blinked away. And conversely, a very bad thing to happen, if not the worst thing, would be getting tortured for 50 years.

Now, would you rather that a googolplex people got dust specks in their eyes, or that one person was tortured for 50 years? I originally asked this question with a vastly larger number - an incomprehensible mathematical magnitude - but a googolplex works fine for this illustration.

Most people chose the dust specks over the torture. Many were proud of this choice, and indignant that anyone should choose otherwise: "How dare you condone torture!"
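The arithmetic behind the quoted argument is just linear aggregation: if badness simply sums across people, any fixed harm is eventually outweighed by a tiny harm multiplied by a large enough population. A minimal sketch of that aggregation step (all of the disutility numbers here are invented for illustration, not anything the quoted author specifies):

```python
# Naive linear (additive) aggregation of disutility.
# The per-person numbers are made up purely for illustration.

SPECK_BADNESS = 1e-9      # one dust speck, barely noticed
TORTURE_BADNESS = 1e12    # 50 years of torture

def total_badness(per_person: float, n_people: float) -> float:
    """Sum disutility linearly across a population."""
    return per_person * n_people

# A googolplex is 10**(10**100); even a mere 10**30 people
# already flips the comparison under this scheme.
specks = total_badness(SPECK_BADNESS, 10**30)
torture = total_badness(TORTURE_BADNESS, 1)

print(specks > torture)  # under linear summing, the specks dominate
```

The whole "torture over dust specks" conclusion lives entirely inside that multiplication; reject unbounded additivity and the conclusion evaporates.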

KingKalamari
Aug 24, 2007

Fuzzy dice, bongos in the back
My ship of love is ready to attack

Libluini posted:

I can't even try to comprehend how you could even fall for that. Just as a single point of failure, imagine the hypothetical scenario of meat-you still running around while someone else made a simulation of you. Which one of them is the "real" you?

Obviously one of them is, and the other is just some unrelated person carrying similar memories up to the point where both start having different experiences. Torturing a simulation of you is just like grabbing some unrelated rando off the streets and pretending it's the person you wanted to harm. But you believing bullshit doesn't magically make it real; it's still someone else getting tortured.

This basic failure in applying logic to your thoughts makes me wonder how those weirdos can even manage to post on the internet without their computers catching fire

The work-around for the difference between simulation you vs original meat-you is actually an extension of earlier logical leaps to make Yudkowsky's preferred brand of transhuman immortality viable from a philosophical perspective. The first thing you have to understand is that Yudkowsky, like many other transhumanists, is terrified by the concept of death. This fear informs a lot of his belief in transhumanism and future machine gods and all sorts of other kooky spooky bullshit, but the important beliefs of Yudkowsky's we need to focus on in this case are the ones that directly relate to mind uploading and cryonics. Mind uploading is the idea, in the transhumanist (and science fiction) community, that once we have developed the capacity to simulate an artificial consciousness on a computer, there's nothing theoretically stopping us from simulating someone's existing consciousness on said computer. We'd basically be able to take your meat brain and perfectly simulate, digitally, all the physical stuff it's doing.

The problem that crops up here is the issue of continuity of consciousness. See, from a purely practical standpoint, this process isn't jamming your mind into a computer, it's creating an emulation of your mind using computer processes, and can philosophically be boiled down to "We're going to kill the original you and create a new, digital you with all of your memories to take its place". This leads to the existentially horrifying realization that from the original you's perspective they didn't actually achieve immortality, they just died and had an immortal copy take their place. This idea has already been explored fairly thoroughly in a number of thought experiments, most notably Donald Davidson's Swampman scenario (in which a hypothetical guy is struck by lightning in a swamp and disintegrated while an exact copy is simultaneously produced out of matter from a nearby tree and continues on in the original's place) or a recurring thought experiment involving a teleporter that works by destroying the person who uses it and creating a perfect copy of them at their destination, as explored in the Canadian animated short film To Be.

There have been some proposals by transhumanists to try and work around the problem of continuity of consciousness, most notably by keeping the subject fully conscious and awake as their brain is replaced piece by piece with mechanical emulations, but most of these solutions aren't compatible with Yudkowsky's preferred approach to transhuman immortality: cryonics. See, cryonics is the idea of preserving someone's existing physical body by freezing it and then reconstructing it later. There are a lot of very real reasons why cryonics will not work as a means of cheating death, the greatest of which is that freezing a body and then thawing it back out invariably damages parts of the original tissue, most worryingly the brain. Yudkowsky usually handwaves these problems away with do-anything, post-singularity magic technology that would be able to overcome or repair the damage done to the brain by cryonics. From a philosophical perspective, though, there's not really a way to avoid dealing with the problem of continuity of consciousness, since there's not really a way to preserve your brain via freezing that doesn't kill you.

So, Yudkowsky's written quite a few words and jumped through a number of logical hoops in order to justify why continuity of consciousness is not actually a problem, and why someone making a perfect copy of you after the original you dies is fundamentally the same as just bringing the original you back to life. The problem with this is that it means that if someone is able to create an accurate enough copy of you (or an accurate enough simulation of your consciousness) you should, from a philosophical standpoint, view anything done to that copy identically to how you'd view it being done to the original meat-you currently thinking about it.

Roko's Basilisk then demonstrates a fairly major flaw in this mindset, because it extrapolates a scenario where the all-powerful, "benevolent", post-singularity super AI that is a core assumption of Yudkowsky and his followers' transhuman religious beliefs has the capacity and motivation to create and torture identical copies of them, which their existing belief system dictates they should treat as the future AI literally bringing the original versions of them back to life and torturing them forever and ever.

There are a number of practical reasons why this would be impossible to do (the scenario assumes that an AI singularity is possible, that it's possible to recreate perfect simulations of long-dead individuals based solely on extrapolated data, that a future post-singularity Gangster Computer God will adhere to Yudkowsky's particular brand of Bayesian reasoning and decide that torturing simulations to motivate the originals to create it faster is a productive use of its time and resources...), but these are largely irrelevant to the people on LessWrong, because their worldview isn't based on examining the practical concerns and applications of real-world technologies; it's about confusing pure philosophical extrapolation with reality.

The Chad Jihad
Feb 24, 2007



It seems like the counter to that reasoning would be something like "pain and suffering scales exponentially and no amount of eye blinks will ever counter balance even a day of torture"

Piell
Sep 3, 2006

Grey Worm's Ken doll-like groin throbbed with the anticipatory pleasure that only a slightly warm and moist piece of lemoncake could offer


Young Orc

The Chad Jihad posted:

It seems like the counter to that reasoning would be something like "pain and suffering scales exponentially and no amount of eye blinks will ever counter balance even a day of torture"

Their counter to that, because I've seen it, is "ah, but even exponential scaling fails when you consider a sufficiently large number of eyeblinkers, you clearly don't understand large numbers!!!!"
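That counter only works if disutility aggregates without bound. If instead total speck-badness saturates toward a ceiling that sits below the badness of torture, no population size, googolplex or otherwise, ever closes the gap. A toy sketch of that bounded-aggregation reply (the cap, rate, and torture numbers are all invented for illustration):

```python
import math

SPECK_CAP = 100.0       # assumed ceiling on total speck disutility
TORTURE_BADNESS = 1e6   # assumed disutility of 50 years of torture

def speck_total(n_people: float, rate: float = 1e-12) -> float:
    """Saturating aggregation: grows with n but never exceeds SPECK_CAP."""
    return SPECK_CAP * (1.0 - math.exp(-rate * n_people))

# No population, however absurd, pushes the total past the cap,
# let alone past the torture.
for n in (10**6, 10**12, 10**100):
    assert speck_total(n) <= SPECK_CAP < TORTURE_BADNESS
```

So "you clearly don't understand large numbers" only lands against someone who has already conceded unbounded additive aggregation; against a saturating function, the size of the number does no work at all.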

KingKalamari
Aug 24, 2007

Fuzzy dice, bongos in the back
My ship of love is ready to attack
Yudkowsky and LessWrong's approach to reasoning in a nutshell:

Libluini
May 18, 2012

I gravitated towards the Greens, eventually even joining the party itself.

The Linke is a party I grudgingly accept exists, but I've learned enough about DDR-history I can't bring myself to trust a party that was once the SED, a party leading the corrupt state apparatus ...
Grimey Drawer

KingKalamari posted:

The work-around for the difference between simulation you vs original meat-you is actually an extension of earlier logical leaps to make Yudkowsky's preferred brand of transhuman immortality viable from a philosophical perspective. [...]

The sad thing is, Yudkowsky and his followers waste everyone's time, as this line of reasoning is on par with making up reasons for why your mind is magically transferred into your children. It's not necessary to have a complete line of continuity from mind to mind, since as long as someone almost like you continues onwards, death shouldn't really scare you anymore. The individual you may be dead, but your inheritor steps up to continue.

I mean sure, no-one would complain if they wake up after death and realize they can continue living because the soul is real and just moved on to the next best thing after your body got disintegrated, but someone else with the same personality and memories continuing after your death isn't so bad, either. No reason to reject reality and flee towards unhealthy fantasies, imho

Though I get the feeling suggesting Yudkowsky and Co. take some medication to counter their unhealthy fear of death wouldn't go over well with them. :v:

Piell
Sep 3, 2006

Grey Worm's Ken doll-like groin throbbed with the anticipatory pleasure that only a slightly warm and moist piece of lemoncake could offer


Young Orc

Libluini posted:

The sad thing is, Yudkowsky and his followers waste everyone's time, as this line of reasoning is on par with making up reasons for why your mind is magically transferred into your children. It's not necessary to have a complete line of continuity from mind to mind, since as long as someone almost like you continues onwards, death shouldn't really scare you anymore. The individual you may be dead, but your inheritor steps up to continue.

I mean sure, no-one would complain if they wake up after death and realize they can continue living because the soul is real and just moved on to the next best thing after your body got disintegrated, but someone else with the same personality and memories continuing after your death isn't so bad, either. No reason to reject reality and flee towards unhealthy fantasies, imho

Though I get the feeling suggesting Yudkowsky and Co. take some medication to counter their unhealthy fear of death wouldn't go over well with them. :v:

I don't get this. I don't care if someone exactly like me is walking around, I'm dead, I can't care about anything. If I'm looking forward, I can say "well, my family would be better off if this copy of me was around" but that doesn't change the part where I'm going to die, which I don't want and another me walking around doesn't help

Farg
Nov 19, 2013
yeah i dont care if there is a "me" doing "me" stuff forever. i only care that ME is hale and hearty.

Groovelord Neato
Dec 6, 2014


Speaking of Roko's Basilisk morons, YouTuber HeyIt'sVadim just dropped a video about Elon Musk and how the SNL episode worked as propaganda for him.

https://www.youtube.com/watch?v=oAacKKSnkrY

This comes up in the CSPAM Elon Musk thread a lot but it's mind-boggling that he has the following he does if you ever actually hear him speak. It's not just that what he says is stupid, it's that he can barely form sentences when he's speaking.

Groovelord Neato fucked around with this message at 18:37 on Jun 19, 2021

Soup du Jour
Sep 8, 2011

I always knew I'd die with a headache.

Terrible Opinions posted:

It's still super funny to me that HPMOR isn't the only Harry Potter fanfiction to start a cult, because the author of Dumbledore's Army and the Year of Darkness also started a cult.

this is a story that needs a youtube deep dive because it’s seriously wild from what I remember

Violet_Sky
Dec 5, 2011



Fun Shoe

Soup du Jour posted:

this is a story that needs a youtube deep dive because it’s seriously wild from what I remember

The Harry Potter fandom is loving wild.

stillvisions
Oct 15, 2014

I really should have come up with something better before spending five bucks on this.
Nyx Fears covers the "Disturbing Movie Iceberg" https://www.youtube.com/watch?v=lPld_tFBBQY (part 1)

While this only covers the top three tiers of the iceberg it's safe to say CONTENT WARNING.

I'm glad May is putting out stuff in her wheelhouse but at the same time I was kinda hoping she'd stick to her plan of "not going to super dark movie place" anytime soon.

MonsieurChoc
Oct 12, 2013

Every species can smell its own extinction.
After Meltdown May, Depression June?

Jamie Faith
Jan 13, 2020

Groovelord Neato posted:

This comes up in the CSPAM Elon Musk thread a lot but it's mind-boggling that he has the following he does if you ever actually hear him speak. It's not just that what he says is stupid, it's that he can barely form sentences when he's speaking.

I mean that never stopped trump from being popular. I guess stupid people relate to them or something

fun hater
May 24, 2009

its a neat trick, but you can only do it once
i will watch the disturbing movie iceberg bc iirc ive seen an upsetting amt of them and they're all terrible lol. being 13 with an internet connection is an american mistake


i get any lesswrong thing is supposed to be a grand thought experiment but even as a teen i used to go on there with multiple accounts and ask deliberately stupid questions to upset them. like how come we live in a future where we have a magic AI that can create infinite clones of a person to torment but we can't turn it off. if we've conquered infinite resources, then why did we forget how to make off buttons or simply program out the "torture" capabilities if its benevolent.

Libluini
May 18, 2012

I gravitated towards the Greens, eventually even joining the party itself.

The Linke is a party I grudgingly accept exists, but I've learned enough about DDR-history I can't bring myself to trust a party that was once the SED, a party leading the corrupt state apparatus ...
Grimey Drawer

Piell posted:

I don't get this. I don't care if someone exactly like me is walking around, I'm dead, I can't care about anything. If I'm looking forward, I can say "well, my family would be better off if this copy of me was around" but that doesn't change the part where I'm going to die, which I don't want and another me walking around doesn't help

Farg posted:

yeah i dont care if there is a "me" doing "me" stuff forever. i only care that ME is hale and hearty.

See, and I don't get this. There are all manner of people on this world. :shrug:

josh04
Oct 19, 2008


"THE FLASH IS THE REASON
TO RACE TO THE THEATRES"

This title contains sponsored content.

All you really need to know about neoreaction stuff is that Trump getting elected completely sucked the wind out of their sails because you don't need a fancy intellectual grounding for your racism if it turns out you can just be racist. Nick Land eat your heart out.

I AM GRANDO
Aug 20, 2006

Trump winning was the most gigantic death blow possible to the entire alt-right project. They all just crumpled under the weight of stupid victory and vanished.

Piell
Sep 3, 2006

Grey Worm's Ken doll-like groin throbbed with the anticipatory pleasure that only a slightly warm and moist piece of lemoncake could offer


Young Orc

Libluini posted:

See, and I don't get this. There are all manner of people on this world. :shrug:

If you die then you're dead, it makes zero difference to you if there's a copy of you walking around, you're still dead. Just because you get a new goldfish when your first one dies doesn't mean the first goldfish is still alive.

If instead of getting killed you're instead kidnapped and kept in an underground bunker while a copy of you moves into your life are you totally cool with that? If not, what's the difference between that and if someone just killed you instead of kidnapping you?

Piell fucked around with this message at 21:05 on Jun 19, 2021

Groovelord Neato
Dec 6, 2014


Libluini posted:

See, and I don't get this. There are all manner of people on this world. :shrug:

You don’t get how a copy of you isn’t you and if you’re dead you’re dead?

sexpig by night
Sep 8, 2011

by Azathoth
AVGN did a fun look back on the frankly odd teamup of Howard Johnson hotels and SEGA

https://www.youtube.com/watch?v=WF9eUXKgzCA

I AM GRANDO
Aug 20, 2006

The long-haired host guy has an extremely familiar voice.

achillesforever6
Apr 23, 2012

psst you wanna do a communism?
https://www.youtube.com/watch?v=sF5orzvw68o
Interesting Knic-

:stare:

achillesforever6 fucked around with this message at 22:56 on Jun 19, 2021

I AM GRANDO
Aug 20, 2006

Suddenly the Dan Schneider stuff makes a little more sense.

fun hater
May 24, 2009

its a neat trick, but you can only do it once
wtf there was an animated adaptation of mr arashi's amazing freakshow?! oh my god reading about it, it was the work of one very insane impassioned weirdo. oh my god lol impassioned for the WRONG reasons

MonsieurChoc
Oct 12, 2013

Every species can smell its own extinction.
That dumbass take on morality falls apart the moment he equates getting dust in your eyes with getting tortured. Even if you go with his dumb idea of "pain point" or whatever, these are two different things. Mild irritation is not suffering; it's comparing apples to oranges. A better moral quandary would be whether it's better to torture someone for 50 hours or torture 50 people for 1 hour. At least in that dumb hypothetical you're comparing the same thing.

And yes, I know everyone here already agrees with me that Roko's Basilisk is dumb, but goddammit I had to get this off my chest.

Groovelord Neato
Dec 6, 2014


fun hater posted:

wtf there was an animated adaptation of mr arashi's amazing freakshow?! oh my god reading about it, it was the work of one very insane impassioned weirdo. oh my god lol impassioned for the WRONG reasons

The only reason I know it exists was a clip of the dwarf loving everybody up with his hallucination powers.

Arc Hammer
Mar 4, 2013

Got any deathsticks?
https://www.youtube.com/watch?v=qhtUzMY-Yi0

Kliksphillip discusses Youtube Shorts and how he might implement it into his channel.

Ghostlight
Sep 25, 2009

maybe for one second you can pause; try to step into another person's perspective, and understand that a watermelon is cursing me



MonsieurChoc posted:

That dumbass take on morality falls apart the moment he equates getting dust in your eyes with getting tortured. Even if you go with his dumb idea of "pain point" or whatever, these are two different things. Mild irritation is not suffering; it's comparing apples to oranges. A better moral quandary would be whether it's better to torture someone for 50 hours or torture 50 people for 1 hour. At least in that dumb hypothetical you're comparing the same thing.
He doesn't understand this because his belief system requires that everything experiential be quantifiable. Quantities of things, even of different things, can be directly compared; thus sandgrain vs torture. If everything experiential can't be quantified then you can't be simulated and he's going to die.

Skippy McPants
Mar 19, 2009

Ghostlight posted:

If everything experiential can't be quantified then you can't be simulated and he's going to die.

MonsieurChoc
Oct 12, 2013

Every species can smell its own extinction.
I too suffer from existential dread and the fear of nothing after death. Somehow I can still realize that comparing grains of sand in your eyes to torture doesn't work.

Dapper_Swindler
Feb 14, 2012

Im glad my instant dislike in you has been validated again and again.

Antifa Turkeesian posted:

Trump winning was the most gigantic death blow possible to the entire alt-right project. They all just crumpled under the weight of stupid victory and vanished.

i think it was the triple blow of trump winning (which made more than just us notice them), then Charlottesville (which was supposed to be the big open nazi coming out party but instead it broke their backs, chased off a bunch of the fence sitters and basically hosed the hardliners and fractured the movement), and the 6th (hurt the alt lite and hosed over a ton of trumpists and basically showed the trumpists' whole asses to the world). like obviously they still exist but the mainstream doesn't give a gently caress about their weird culture war bullshit anymore and thats all these ghouls talk about.

MonsieurChoc posted:

I too suffer from existential dread and the fear of nothing after death. Somehow I can still realize that comparing grains of sand in your eyes to torture doesn't work.

same. its also why SOMA is one of the few horror games that makes me uncomfortable. anyway thats why i just spend time on here or with family or pets or making money or whatever. to quote a funny but overacted show: “Nobody exists on purpose. Nobody belongs anywhere. Everybody's gonna die. Come watch TV”. maybe there is some magic sky god with happy personal paradise, maybe i will burn forever for some gently caress up or action, maybe i will be something new again, maybe i will cease, either way at least i enjoyed the now.

Nuebot
Feb 18, 2013

The developer of Brigador is a secret chud, don't give him money

Dapper_Swindler posted:

same. its also why SOMA is one of the few horror games that makes me uncomfortable. anyway thats why i just spend time on here or with family or pets or making money or whatever. to quote a funny but overacted show: “Nobody exists on purpose. Nobody belongs anywhere. Everybody's gonna die. Come watch TV”. maybe there is some magic sky god with happy personal paradise, maybe i will burn forever for some gently caress up or action, maybe i will be something new again, maybe i will cease, either way at least i enjoyed the now.

I had a friend quit playing Soma about halfway through because it made them too uncomfortable once they started realizing the breadth of the scenario and that the copies of people weren't just robots.

Solitair
Feb 18, 2014

TODAY'S GONNA BE A GOOD MOTHERFUCKIN' DAY!!!

Soup du Jour posted:

this is a story that needs a youtube deep dive because it’s seriously wild from what I remember

I've only barely heard of the fic in question and I'm very curious to see how buckwild the following gets.

Dapper_Swindler
Feb 14, 2012

Im glad my instant dislike in you has been validated again and again.

Terrible Opinions posted:

It's still super funny to me that HPMOR isn't the only Harry Potter fanfiction to start a cult, because the author of Dumbledore's Army and the Year of Darkness also started a cult.

please explain/expound.


Nuebot posted:

I had a friend quit playing Soma about halfway through because it made them too uncomfortable once they started realizing the breadth of the scenario and that the copies of people weren't just robots.

yeah. the whole thing is "oh poo poo, its THAT kinda story".

Pachylad
Jul 12, 2017

Dapper_Swindler posted:

please explain/expound.

quote:

Andy Blake, a.k.a. Thanfiction, is a fanwriter and fanartist best known for his involvement in a number of high-profile fandom controversies. Blake is notorious for being the center of incidents which include starting a cult in LOTR fandom, another cult in Harry Potter fandom, failed convention organizing, his many fraudulent identities and pseuds, faking the deaths of both his real and fabricated identities, and being witness to the homicide of his roommates/cult members.

The whole thing is just fuckin :stare:, here's just one excerpt:

quote:

Unbeknownst to most Bit of Earth members, almost from the beginning Andy had been claiming to have psychic abilities. He began to convince some friends and followers that Tolkien's works comprised the real history of a long-forgotten world that had been revealed to Tolkien when he discovered the Red Book of Westmarch. Blake claimed he had the ability to channel (allow spirits to inhabit one's body and speak to or for them) a number of characters/souls from this world, particularly hobbits such as Merry Brandybuck, and would conduct "spirit battles." He also claimed to channel actor Elijah Wood.

Along with a few friends, Blake moved to Oregon and formed and led a group centered on these spiritual beliefs, that many, including former members, call a cult. The group called itself the Bagenders. Among the members of this group were Abbey Stone (Orangeblossom Brambleburr), whom Blake dated, and Jeanine Renne (Turimel, who helped to organize Tentmoot, a proposed Lord of the Rings convention).

Blake then claimed that his soul had died and been replaced by the "duplicate soul" of Elijah Wood. In November 2002, Victoria Bitter made a last post online and faked suicide. Blake then began using the name "Jordan Wood". He created a LiveJournal account, "padawans," in which he wrote as Elijah Wood, under the premise that Wood was in a secret forbidden relationship with Dominic Monaghan, attracting attention from Domlijah tinhats.

Blake faked his own death, sending his parents a suicide letter in October 2003, and became legally classified as a missing person. He attempted to illegally obtain IDs under the name "Jordan Gabriel Wood," which drew the attention of law enforcement.


No.1 Special
Apr 4, 2011
Super Bunnyhop - Umurangi Generation, Colonialism, UAPs and UFOs and Alien Invasion Stories

https://www.youtube.com/watch?v=FhcYszdVRsw
