|
I dunno how I feel about the concept of Existential Risk as a thing, because to me it feels like a lot of completely different things lumped together under the fact that they kill us, and so there are generalisations made that maybe aren't very helpful:

- there are risks where civilisation ends for a completely contingent reason — technology breaks down in NORAD, and all the nuclear missiles get launched because of a threat that isn't there.
- there are risks where civilisation ends due to gradually destroying the conditions under which it can exist, as with climate change.
- there are risks where civilisation inevitably produces a singular event that leads to its own destruction, as with creating a superintelligent robot that eats everyone.

I don't know if these are all the same kind of thing? That it's useful to approach them in a single way under a single banner? Definitely, as someone with nuclear weapon induced sadbrains, I think we don't always say "of course maybe we all just get nuked tomorrow for a stupid and preventable reason;" like we leave out some of the endings of civilisation that feel a bit trivial.
|
# ¿ Mar 19, 2021 15:05 |