Ice Phisherman
Apr 12, 2007

Swimming upstream
into the sunset



StashAugustine posted:

actually on a related note: when i was a conservative kid, "moral relativism" was a boogeyman that was assigned to basically everyone on the left. now that i actually know anything about the actual critiques being made, i'm left wondering- is "moral relativism" an actual belief system existing beyond stoned undergrads

So you believe in moral absolutism?


Ice Phisherman
Apr 12, 2007

Swimming upstream
into the sunset



dead gay comedy forums posted:

oooh I get it

well, at that level of things, it is pretty much a sort of worst-case scenario dystopia, right? In that case, that enters the realm of speculative history if you want a serious theoretical takedown of such a situation, but unfortunately, I am totally ignorant in that case :(

which, tbqf, is just two jumps from marxist sci-fi and might be an even better way to explore that situation than some hardcore theory approach since such situations are so far out that the imagination might be a better tool, and that made me realize that I do not know any marxist cyberpunk authors which might have what you need

(China Mieville might have something? idk)

I am literally a lefty cyberpunk author. Hi. :wave:

indigi posted:

this is a really interesting response that I appreciate but that sort of side steps the question. I'm saying imagine a sci fi future where the singularity (or something like it) happened and the extent of human labor necessary to achieve any task is to tell a computer or robot what you want done, and it will actualize it from beginning to end. I realize this may not count as capitalism anymore (especially given the UBI-type stipulation attached), which is why I'd be interested to hear what a Marxist would have to say about such a society. say .5% of the people own all the robots and computers and arable land and potable water and live in wild opulence while the rest of humanity is suffered to live at near-subsistence levels. also in this world there aren't even any human maids or cooks or servants.

e: also none of the robots or computers have achieved sentience, sapience, self-awareness, intelligence, or intentionality


What you're talking about is a lot like space feudalism. That exists in literature, but generally the narratives around it are framed as a decline: poo poo like things were good once, we had these metal men that did the work, the people were decadent, but humans allowed machines to think for them and were nearly all wiped out. Examples would be Dune or Warhammer 40k, where you're coming off a golden age into deep decline. Humans make the decisions and machines just compute, because people are terrified of machines that can do more than compute, to an extent that is religious. That knowledge, not to gently caress with AI, strong or weak, thinking or not, is baked into the social DNA of the entire species.

And you actually don't need a machine to have sentience, sapience, self-awareness, intelligence or intentionality to kill us all. It could be some idiot computer toucher on his third Monster of the day who fat fingers the "make paperclips" command for some expert system with advanced decision-making capability and access to parts to make machines that make paperclips. Its intention is to make paperclips because a human told it to make paperclips, then forgot about it or died or just generally hosed off while the paperclip maximizer goes about making paperclips in the most efficient manner possible, with no limit on how many need to be made and no thought about why they need to be made. So the earth and all things on it are slowly, or perhaps not that slowly, converted into paperclips before the expert system launches into the stars to find material to make even more paperclips. This is called instrumental convergence, and the paperclip maximizer is how it's usually explained to people. Your machines don't need to be intelligent to kill you. Sufficiently advanced weak AI that is not intelligent could kill us all.

https://en.wikipedia.org/wiki/Instrumental_convergence
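The thought experiment fits in a few lines of code. This is my own toy sketch, not anything from the post or the literature: an agent with one objective and no stopping condition, which halts only when there's nothing left to convert.

```python
# Toy illustration of the paperclip maximizer: a single unbounded
# objective, no malice, no intelligence. All numbers are made up.

def run_maximizer(resources, per_clip=1.0):
    """Greedily convert every available unit of resources into paperclips.

    The agent has no goal besides "more paperclips", so it only stops
    when the resource pool is exhausted -- which is the whole point of
    the thought experiment.
    """
    paperclips = 0
    while resources >= per_clip:
        resources -= per_clip   # consume the world...
        paperclips += 1         # ...to satisfy the only objective it has
    return paperclips, resources

clips, left = run_maximizer(resources=1_000.0)
print(clips, left)  # everything convertible gets converted
```

Nothing in the loop asks "why" or "how many is enough"; adding that question is exactly the alignment problem the Wikipedia article above describes.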

So I actually write a lot about the future of food, where it comes from and how it would be hoarded, as a plot point in my most recent book. The labor made unneeded by structural unemployment is slowly starved to death in an arcology, and it really sucks for them. So you have some faulty assumptions straight out of the gate.

First, you assume that land is worth a drat. And I'm sad to say that the best farmland is going to be played out by around 2050 in the world we live in. Soil degradation is going to be nasty and get nastier, because soil was never meant for the intensive farming methods we employ. You also have climate change loving up the plants, so a fire or a flash freeze or a flood can ruin crops, and has been ruining crops in places like California. Add in economic instability: when no one can pay for food, much of that food is destroyed, like in The Grapes of Wrath. So there's food and there are hungry people, but the food is destroyed because there's no money, and this creates economic crises of overproduction. And while this hasn't hit hard yet, it's going to hit us.

The future actually isn't what most people picture as vertical farming. Not making skyscrapers and stuffing them with farms; that's intensely unprofitable. What we're looking at are plant factories, which are basically hermetically sealed indoor environments where food is grown in mass quantities. This is being experimented with today as well. Here's an example for you:

https://www.wdwinfo.com/news-stories/tomato-tree-in-epcots-land-pavilion-breaks-world-record/

A single tomato tree (yes, tree), grown with a technique from China, produces 32,000 tomatoes. It was grown from a single stem in a single room. It's owned by Disney, because of course it is, but the point is that certain plants are extremely easy to grow in mass quantities indoors right now. The problem is that it's only in the past few years that we're getting to the point where it's profitable to do so. But as farm soil degrades, food prices will go up, and efficiencies in solar tech, agriculture, manufacturing and data will create plant factories, literal indoor farms, that produce far more than any traditional farm could in the same footprint, because you can just build upwards. Right now we're at a handful of veggies that can be produced in large quantities: microgreens, tomatoes, zucchinis, etc.

But as a cyberpunk author, I go beyond that. Imagine if you could grow not just upwards, but on the walls or the ceiling. And you don't need to seal a whole building; you seal the growing area itself. You don't even have spaces where people can enter. You have drones do the harvesting, so you save on space and labor. So you take a geometrically efficient shape, something like a hexagon, and you grow on all sides of this hermetically sealed hexagonal tube, which you then stack high like a beehive, because hexagons distribute weight more easily than squares and so stack higher.

And the yields you can get are pretty bonkers once you start building upwards, climate controlling everything and planting crops bred for high yields rather than for tolerance to pests or pesticides, because these honeycomb-like tubes are basically sealed away. Airlocked. No bugs. No disease. Energy that would normally go towards genes for resistance to disease and pests goes towards making even more food. The climate is perfect for the plant, making it the happiest it can be to make food faster. Light levels are controlled. Everything is. And because you can build these farms anywhere, your distribution networks shrink to a metropolitan area. You no longer need to ship food long distances.

So the point of all that rambling is that farming is flat out not going to look anything like it does now, or like it has at any time since we started farming, because farming has almost always taken place in soil, in one dimension: the ground. We increase yield, conserve water and soil, and skip pesticides by building upwards and inside. We are on the cusp of an agricultural revolution, perhaps the most profound one humanity has had since the beginning of farming, and it's been going on for a few years now. We're developing that tech so we don't all starve to death when traditional farming eats poo poo in the coming years and decades. It's not guaranteed to eat poo poo biologically, but almost certainly economically. Once people running plant factories figure out how to grow more crops economically, growing upwards and inside will dominate, and the main limits on farming won't be space. They'll be expertise, technology and energy. Water and soil use drop dramatically under this system as efficiencies mount. For example, lettuce grown in current plant factories uses 95% less water, and lettuce accounts for oh...I think 0.02% of all water use. That's me going off my memory though. The exact figures aren't in my notes.
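The back-of-envelope arithmetic behind those claims is simple to sketch. Every number here is my own illustrative assumption (layer count, boost factor, field water use), not a figure from the post; only the "95% less water" ratio comes from it.

```python
# Back-of-envelope sketch of why stacking wins. All inputs are
# assumptions for illustration, not real agronomy data.

FIELD_YIELD = 1.0    # relative lettuce yield per m^2 of open field
LAYERS = 12          # grow layers stacked on one footprint (assumed)
LAYER_FACTOR = 1.5   # boost per layer from controlled climate/light (assumed)

indoor_yield = LAYERS * LAYER_FACTOR * FIELD_YIELD
print(indoor_yield)  # relative yield from the same ground area

FIELD_WATER = 250.0  # liters of water per kg of field lettuce (assumed)
SAVINGS = 0.95       # the "95% less water" figure from the post
indoor_water = FIELD_WATER * (1 - SAVINGS)
print(indoor_water)  # liters per kg under that assumption
```

The multiplier is the whole argument: footprint stops being the binding constraint, and energy, expertise and capital take its place.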

But capitalism requires food and water and shelter to be scarce. In this scenario, land is cheap because it's now worthless for farming, the methods for growing food are known, and while making your own food at home is a chore, the tech for it exists. You do need some startup capital to grow food, but honestly, not a lot as efficiencies continue to mount.

My idea in my book is that this technology is both sold and suppressed at different times, while the majority of food manufacturing is done in large arcologies, which are megastructure skyscrapers. So you return to literal vertical farming, that thing that's not economically viable. But it becomes viable if you sell the means to farm and then, during times of turmoil or whenever it's profitable, buy out or destroy the farms you sold, or just make people permanently dependent on you for running them. So the arcology megastructure produces food for millions of people in a relatively small space, but it's only profitable because it maintains a monopoly on food production.

In order to seize the means of production, you need to siege the means first, because the creation of new means of production is suppressed or tightly controlled. So the arcology is literally a loving fortress that produces food and other material goods all in one place, both to keep everyone else dependent on it and to protect its capital. It's a return to the castles of old Europe, whose purpose was to withstand the institutional anarchy baked into feudalism; in this case it's sort-of capitalism, but really feudalism with maybe some capitalist characteristics. It's all hyperconcentrated and hardened, and when poo poo goes bad, they close up shop and control people with starvation.

I'm writing about an early form of this in the latest book in my series. It's not a particularly bleak series, to be frank. I'm pretty tired of constant bleakness because there's enough of that; if bleakness exists, I want to write about changing it. I started writing cyberpunk back in 2017 as a middle finger to the nihilism and "lol nothing matters" crowd, because I was sick of that poo poo. I'm doing a lot of experimentation with cyberpunk right now, both in genres and ideas, some of it never done before, or at least never applied to fiction. And it's available on these very forums for free. So if you like cyberpunk and want to read a cool story about teens going to school in a gilded age world that's really horrible, how they sometimes change things in ways great and small and frequently not on purpose, and how they just live their lives and do cool poo poo like occasionally getting into fights with Nazis, you could do worse than reading my prose.

https://forums.somethingawful.com/showthread.php?threadid=3835049

So, about the singularity: what you're talking about isn't the singularity. It's more like some form of neo-neo-feudalism. And I really can't imagine that world existing for long without that 0.5% attempting to literally genocide everyone else, at least on a long enough timescale. They'll be useless to capital. The human has been replaced. Why keep them around? What purpose do they serve? Not much, really. Not to the elites, anyway. No singularity.

The technological singularity is where technological progress becomes irreversible and uncontrollable. There's actually not a lot of science fiction that explores that concept from someone on the inside of it, or at least none I'm familiar with, specifically because you'd constantly have to explain new concepts to the reader at a rate that is unsustainable with words. It would be a story about technological alienation. I can't write a story about a protagonist constantly learning new things, but I can write one about a protagonist in a tech-singularity world who isn't learning new things; those things are being inflicted on them anyway, and they're left behind with all of the bullshit that the singularity constantly kicks up. The people who managed to keep up with the singularity effectively become aliens due to the rapid pace of change they adapt to. Difficult to recognize culturally as human, because they keep changing. In fact, they may no longer even be identifiable as human genetically, if there's genetic or cybernetic or nanotech engineering involved. They wouldn't even be transhuman, meaning humans who have made themselves better but are still identifiable as human. They'd be ex-human as time drifted by. A new species.

For something singularity-adjacent, I'd suggest Numenera by Monte Cook, which is a tabletop RPG, but also a PC RPG, with fiction and such, and it focuses on the wonder and terror of the world. The numinous: this spark that is quasi-religious or divine, because that's how far technology advanced at one point. It made people seem like gods. It's not fully a tech singularity, but it's close. Everyone lives on an Earth that has been through tons of technological rises and falls, everything is bizarre, and technological wonders are littered all over the place like so much trash. Science so advanced that it's effectively magic, and it was once commonplace to the people of that bygone era. Because sufficiently advanced tech looks like magic, but sufficiently advanced magic looks like divinity or godhood. Hence the moniker of numinous. That's about the closest I can get to what a technological singularity would feel like for the people left behind, at least off the top of my head.

Also, I don't think the technological singularity even has to move that fast to be uncontrollable. For example, that's sort of happening now with social media, because the people in charge of legislating for social media companies are old as hell, dumb as hell and corrupt as hell. Many of them were born before color television and needed their children to explain how the computer worked before smartphones made it easy. So the olds got that down. They know how to post a wine mom minion meme, and how. But the social impact of social media is flat out not understood by most congress critters, or if it is, they're not doing anything about it. They're trying to legislate something new that they fundamentally don't understand, and they're not even very good at legislating to begin with, because our system was designed in the 1700s and is wearing thin, and the people making decisions have basically zero expertise and are corrupt as hell to boot. I'd trust your average teenager to understand social media better than their elders do. So "uncontrollable" might not be a technological question but a social one. And the irreversible damage is happening to us now in a social sense, where social media has been going through democracies like an industrial lathe. Facebook has platformed people who went on to commit genocides, and it is directly responsible for several. Social media is focused on outrage because outrage drives clicks, and that outrage makes us hate each other because it gets more people to see more ads. So this wouldn't be a technological singularity, but more of a social singularity with technological aspects.

For a social singularity, and just how loving alienating it is, I'd suggest reading the Transmetropolitan comic, which feels very 2020 even though it was written over two decades ago. It aged well, and it's sort of scary how well it aged. Sadly it's hard to find in print, but I figure some of you smart people can google sites where you might read comics.

I don't think we're threatened with anything like a technological singularity at the moment (with one caveat). I don't think anything will happen anytime soon, because technology is jealously hoarded by corporations, and government is really the only force that does theoretical, blue-sky research. Corporations just make it scale and slap a price sticker on it after snatching that research up from the government. The welfare state for corporations is so strong that they basically don't have to do original research, because research costs money and may not pan out. There are exceptions, like in pharma, but there you're basically playing the lottery in hopes of finding the next Viagra, or at least whatever sells like Viagra.

The caveat is that one day some government may develop sufficiently strong AI, meaning self-aware and able to make decisions. Whoever controls that AI has an absurd technological advantage, if it can be controlled. That advantage could be capitalized on in, like...months. Maybe weeks. Maybe days. And every other government on the planet, as well as their people, would be existentially threatened by a government with strong, controlled AI. However, it's obviously not guaranteed that strong AI could be controlled. From my understanding, AI could be controlled by "bottling" it: isolating it from the world completely, so it doesn't see anything outside the "bottle". But you need to pop that bottle open to ask questions, and that's a problem, because you might release the genie, and now it's free, and it might get really mad, or it could just thoughtlessly gently caress you up because you don't matter to it and it ethers you and the planet on its way to doing whatever it is AIs do.

One theory I came across for dealing with uncontrolled or unbottled AI is creating what's called an outside context problem. Less like Iain M. Banks' version, where it's a curveball that you have no defenses against and it fucks up entire species real bad. More like the AI not knowing where it begins and ends. It can't point to itself with itself. What is "me"? Because if "me" is all that you know, the very idea of anyone else is never theorized; the AI believes it is all there is. So commands given to it would seem to come from itself. It's not aware of the world, or of us dumbass apes asking stupid questions. But the theory is that this also doesn't last forever. Questions that are sufficiently stupid may be ignored or partitioned off as damage, because why would a machine intelligence ask itself how to make the dicks of theoretical hairless apes bigger, or what the string of numbers in this thing called the "lottery" will be? Human questions would seem absurd and stupid and look like damage to a bottled AI.
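The mechanism in that theory can be sketched as a toy. This is entirely my own illustration of the idea, not any real alignment technique: an agent that treats every input as its own thought and quarantines anything that doesn't fit its self-model as damage.

```python
# Toy sketch of the "bottled AI" idea: no notion of an outside world,
# so every input seems self-generated, and alien (human) questions get
# partitioned off as corruption. Purely illustrative.

class BottledAgent:
    def __init__(self, self_model):
        self.self_model = set(self_model)  # concepts the agent "knows"
        self.quarantine = []               # thoughts it can't place

    def think(self, thought):
        # Every thought appears to come from the agent itself.
        words = set(thought.lower().split())
        if words & self.self_model:
            return f"resolved: {thought}"
        # A sufficiently alien question looks like damage, not a query.
        self.quarantine.append(thought)
        return "partitioned as damage"

agent = BottledAgent(self_model={"optimize", "schedule", "allocate"})
print(agent.think("optimize the power grid"))       # fits the self-model
print(agent.think("what are the lottery numbers"))  # quarantined
```

The catch the post describes falls out of the sketch: the quarantine bin fills up with exactly the questions the operators wanted answered.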

More presently, we're actually in something of a social singularity right now. Normally that would be called a borderless world, but this world isn't physical, it's digital: it's on your computer or smartphone or tablet. Ideas cross physical borders at lightning speed. Community isn't bound to a physical location anymore; much of it is online, because tons of people are very online, and it's presenting unique problems we've never had before as a species. And those societies are frequently too difficult to understand unless you're a part of them. The right wing/fascist sphere is incredibly difficult to comprehend unless you were raised with it all around you or spend long, long months or even years learning their narratives and following the social trends to understand what the gently caress they're talking about. But if you take Joe Average and turn him onto Fox News, he'll bounce off it unless it literally confirms a lot of his existing biases, because it's so bizarre and revolting for a normal person to tune into someone like Tucker Carlson. And that's just one subculture, full of shitloads of sub-subcultures which can be equally incomprehensible and forever changing. Too many of them to ever understand, and forever accelerating as we fragment culturally.

What all of these decentralized cultures do is erode shared values and, in the long term, change the nature and meaning of words as well. This not only means that we no longer share values like we used to, but that, because words only mean anything in relationship to other words, the inherent meaning of a word can change and render us unable to communicate. Where we're at right now, in many cases, is that we all speak the same language here in the US, but moderately technical concepts mean you can't effectively communicate, because those concepts aren't shared. The words that describe other words are different for different groups of people.

For example, someone says the word "freedom". People like the word freedom. It has pretty excellent marketing going for it. But the freedom to do what? What does freedom mean? Probably something different to me as a lefty than to some right winger or a so-called moderate. It might be bound up in patriotism for some, where for me it is not. It might mean the so-called freedom of economic choice for a lib or a right winger, an economic freedom by which they've convinced themselves that because they can choose whether to go to Applebee's or Chili's that night, they are free. For a lefty, it would be freedom from reactionary tyranny and freedom from artificial scarcity: the end of capitalism. Or your fictional folks may have convinced themselves that war is peace, slavery is freedom, ignorance is strength, 1984-style, because the meanings of words have been rearranged in relation to other words, and now they mean different things in that Newspeak dialect.

I could talk about this for hours because I love this poo poo. But I won't because this has gone on way, way too long. Maybe read my stories or something. It's a pretty good binge if you like books by former professional authors.

Ice Phisherman
Apr 12, 2007

Swimming upstream
into the sunset



dead gay comedy forums posted:

man, that was a very good stream-of-consciousness post

Thank you. :)

quote:

and you answered more than I could have with your take on "space feudalism": a better angle to answer indigi's hypothetical is that, well, Marxism is concerned with the study of capitalist dynamics because those are the present ones we live in; if we enter in a stage where social relations collapse but technology and infrastructure allow Feudalism-but-with-Robots, it would require a tremendous theory-crafting effort to study what would be a new mode of production that is not stipulated by established Marxist thinking

Honestly, I would think something based on the old Bronze Age city-states, where the land between cities is largely undeveloped, abandoned or useless, the metropole is the major source of power, and it has colonies to facilitate trade and harvest resources that can't be manufactured.

Old Bronze Age economies were very rigid, very hierarchical and hyper-organized, but if you cut out 99.5% of people, I don't know what happens, because in those societies almost everyone was a farmer. Probably genocide, to be frank, or maybe posing as gods for self-aggrandizement or something equally decadent, wasteful and stupid. But I don't know how capital would make the switch to full feudalism. In fact, I think we're basically locked in for periods of institutional anarchy now as we continue to embrace a more feudal mode of governance. Feudalism literally has periods of institutional anarchy baked in, due to people rebelling against intolerable conditions and/or political instability. It's why castles existed. Rioting and rebellion are historically how people negotiated with their rulers: by killing them and installing a new ruler, by presenting enough of a threat to force negotiation, or, more frequently, by getting murdered by an army.

In the future I see the elite sinking deeper and deeper into the bunker-style mentality we're already seeing from them now. That smacks of the institutional anarchy of feudalism to me.

Ice Phisherman has issued a correction as of 01:24 on Oct 30, 2020
