vegetables
Mar 10, 2012

I dunno how I feel about the concept of Existential Risk as a single thing, because to me it feels like a lot of completely different things lumped together just because they'd all kill us, and so generalisations get made that maybe aren't very helpful:

- there are risks where civilisation ends for a completely contingent reason: technology breaks down at NORAD, and all the nuclear missiles get launched because of a threat that isn't there.

- there are risks where civilisation ends by gradually destroying the conditions under which it can exist, as with climate change.

- there are risks where civilisation inevitably produces a singular event that leads to its own destruction, as with creating a superintelligent robot that eats everyone.

I don't know if these are all the same kind of thing, or whether it's useful to approach them in a single way under a single banner. Definitely, as someone with nuclear-weapon-induced sadbrains, I think we don't always say "of course, maybe we all just get nuked tomorrow for a stupid and preventable reason"; we leave out some of the endings of civilisation that feel a bit trivial.
