Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

What kind of ideas is our economy only just beginning to come up with right now? Nothing?

Entirely-online stores that mostly or completely phase out the need for consumer-facing physical locations and the associated overhead (and staffing). More efficient logistics that results in significantly lowered labor requirements. Growing usage of websites and apps to automate away "last-mile"-style requirements that formerly required significant staffing. There are plenty of others, too, but none of them have quite the same impact on labor demands as replacing water wheels with steam engines (removing environmental limitations on input power and allowing significant factory expansion).

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

It's interesting that you, as a person in modern times, can talk about this retrospectively as being so obvious that it'd increase labor demands.

But at the time: https://en.wikipedia.org/wiki/Destruction_of_Stocking_Frames,_etc._Act_1812

The issue that the Luddites had wasn't anything to do with the total number of jobs. Their problem was that jobs that required skilled workers were being replaced with unskilled labor anyone could do, with correspondingly worse pay, hellish working conditions, and zero job security. And if they didn't have a point, why did capitalists make "damaging industrial equipment" a crime punishable by death?

Main Paineframe
Oct 27, 2010

Boon posted:

Uh... You're taking a 21st century frame of reference and applying it retrospectively. The bolded part is certainly true, the rest of that poo poo is exactly that, poo poo.

No, I'm not. Even by 19th-century standards, the early industrialized factories were brutal places to work, and unskilled workers were far easier to replace than skilled ones. Henry Ford's factories, for example, had annual worker turnover of over 100% - conditions were so bad that each year he lost and had to replace more workers than he employed. It's not like the factories were paradise before automation, but automation made them far worse, even by the standards of the time.

Main Paineframe
Oct 27, 2010

TyroneGoldstein posted:

I would say that ergonomics not being a thing, and the utter and complete lack of safety standards for production equipment, had a whole lot more to do with making Industrial Revolution-era line work grueling, but y'know...

Being able to replace an experienced textile worker with a ten-year-old kid certainly helped. Safety is less important when your workforce is more disposable.

Main Paineframe
Oct 27, 2010

Dead Reckoning posted:

Bullshit. First off, imposing the stability that is a prerequisite for and significant appeal of a first world standard of living on a worldwide basis would absolutely require inhumane brutality if not a global police state. Source: Every large empire and powerful state in world history.

I don't think a global dystopia is necessary to end civil wars and encourage general prosperity.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

If robots equal to humans are 50 years away, what happens with technology 60 years from now?

He didn't say "equal to humans", he said "as flexible as humans".

Main Paineframe
Oct 27, 2010

A baby ate my dingo posted:

That's the thing, though: we don't need robots to be as flexible as humans, or even to do the same things as humans. You're really misunderstanding how physical automation is implemented in an industrial setting; it's not about replicating the process that a human carries out, it's about achieving the same end result.

More to the point, robots don't need to replace humans entirely. If a certain job requires ten tasks to be completed, and a machine is installed that automates away just one of those ten tasks, you've decreased the total manpower required for that job by 10%. We're not going to have a totally-automated McDonald's that can carry out the entire food production chain from start to finish without human intervention, but automation can make significant cuts to the required staffing at each McDonald's. Historically, that was no problem, because the decreased manpower requirements would lead to expanding the business and opening more physical locations, but the fundamental nature of both our economy and new automation has shifted such that that's no longer necessarily the case.
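To put rough numbers on that, here's a toy back-of-the-envelope model (the task counts, staffing, and location counts below are made-up illustrations, not real McDonald's figures):

```python
# Toy model: partial automation cutting per-location staffing.
# All numbers here are hypothetical illustrations, not real data.

def staff_needed(tasks_total: int, tasks_automated: int, base_staff: float) -> float:
    """Assume labor scales with the share of tasks still done by humans."""
    human_share = (tasks_total - tasks_automated) / tasks_total
    return base_staff * human_share

before = staff_needed(10, 0, 20)   # 20 workers, nothing automated
after = staff_needed(10, 1, 20)    # automate 1 of 10 tasks
print(before - after)              # 2.0 fewer workers per location

# Across 14,000 hypothetical locations, one automated task is tens of
# thousands of jobs, without a single location being fully automated.
print((before - after) * 14_000)   # 28000.0
```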

Main Paineframe
Oct 27, 2010

Dead Reckoning posted:

I'm saying that even your best-case scenario is bad. Let's assume that high levels of development result in lowered violence and slightly sub-replacement fertility. We cannot sustainably support 7.5 billion people living at what we consider a high level of development.

Can't we? Most of our resource problems are issues of allocation and distribution, not availability.

Main Paineframe
Oct 27, 2010

A Buttery Pastry posted:

At developed-world levels? I know famine and poo poo is just a question of distribution, but what if everyone is trying to get, say, a German lifestyle? In any case, what is true today in this regard might not be true in a few decades, given the challenges associated with climate change and the possible shift in resource use that might force.

Sure, why not? If there are any fundamental limitations that we're likely to hit anytime soon, they're based on our economic systems (which encourage large amounts of pointless waste) rather than sheer population pressure. Over a third of the US food supply ends up in landfills as food waste, and many of the US's regional water problems stem from cost-cutting or trying to maintain lush green lawns in the middle of a loving desert.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

Even if we become space Amish and lock our culture to only allow 1998 technology eternally, do you actually think they wouldn't just fire the 30-dollar-an-hour high-school graduates and eventually hire either better-qualified-for-the-price 30-dollar-an-hour college (trade school?) graduates with even more training, or 15-dollar-an-hour high-school graduates?

Like "guy with absolutely zero job security in any direction loses job" isn't exactly something you can prevent by stopping technology from existing.

No, it's something you can prevent by changing the nature of our society so that the long-term unemployed that technology will inevitably create aren't uninsured, homeless, and miserable.

Main Paineframe
Oct 27, 2010

Blockade posted:

I think he's right in a way. We could have tons of okay paying jobs, jobs for everyone with 'New Deal' mega projects. Maybe we can get low-skill people planting trees, demolishing old abandoned infrastructure, materials cleanup, etc.

Even there, automation drastically reduces labor needs, though. Back during the days of the New Deal, digging a ditch required a team of people with shovels and picks. These days, all you need is one guy with a ditch-digging machine. Today, one person operating a tree-planting machine can plant trees as quickly as ten people doing it by hand. A lot of construction and infrastructure work doesn't require nearly as much manpower as it did in the CCC era.

Main Paineframe
Oct 27, 2010

Blue Star posted:

That doesn't really mean anything, though. How many people are employed as umpires? Is that technology applicable to other jobs and careers?

I still think it's going to be a very long time before a significant number of jobs are threatened. We can make programs and robots that can do very narrow things, but they can't handle unpredictable stuff. And lots of jobs have unpredictable aspects. Even a fast food job has unpredictability.

The trick is to make robots do all the predictable aspects, so you can fire all the human workers that used to do them and just keep a couple people around to handle the unpredictable bits. Few occupations are going to be 100% automated anytime soon...but automating away even half the manpower requirements is enough to have a significant impact on employment.

Main Paineframe
Oct 27, 2010

mobby_6kl posted:

Most likely though, these people would be eventually replaced with someone still valuable to the company. I.e. lay off 500 pencil pushers, hire 100 salespeople or developers or whatever. Business grows, hire more people, etc. If this weren't the case, all companies would be down to, like, one guy by now.

Based on experience though, even this rarely happens. Our systems are constantly improved and automated, and whenever you no longer have to spend hours on stupid poo poo because it's now automated... there's always other stupid poo poo that needs to be done. Presumably it's also valuable so the business grows.

That's not always the case. Companies don't expand their business just because they can, they expand their business because there's unmet demand they can fill. For example, replacing old-fashioned cash registers with POS systems didn't decimate overall cashier employment because the retail market was rapidly expanding during that period. Positions per store may have been down, but companies like Walmart and Kmart were opening stores all over as they quickly grew from small newcomers to industry heavyweights. Likewise, although ATMs reduced the number of employees per bank branch, banks were building a lot of physical bank branches at the time, so overall bank employment continued to rise and the labor savings simply fueled faster expansion.

Today, though, it's a different story. Retail is in a slump, with many companies cutting back their physical presence as the availability of retail stores now exceeds demand. Likewise, banks are cutting back their physical presence and closing their branch locations as customers shift more and more toward online banking. Any labor savings from automation in those industries right now won't be plowed into expansion, because they're not expanding and have no desire to expand - instead, they'll just further feed the cutbacks.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

Someone was talking about a dark future where an AI could make something take 2 hours less a week resulting in mass layoffs. But literally all of history is the invention of tools that cut 2 hours a week off tasks. It evidently can't simply be a force for unemployment if it's always happened through all of history.

It's always been a force for unemployment. It's just that unemployment usually drives fundamental shifts in the nature of society and the economy...or a bloody revolution. Y'know, whichever. It's no coincidence that particularly large strides in automation and production are often accompanied by greater social unrest.

Main Paineframe
Oct 27, 2010

This isn't nearly as interesting or revolutionary as the article tries to imply, and honestly, the bit that acted all excited about robots "teaching" each other how to do things should have been a dead giveaway that they were trying to misrepresent routine tasks as revolutionary AI breakthroughs for the sake of clickbait. All they really did was slap a fancy 3D GUI on their robot-programming software and have their programmers pre-program a bunch of motions.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

College is fake and for losers, but someone should probably call MIT and tell them they've been bamboozled by a fancy 3D GUI again. You'd think by now they would know what people were doing in their PhD program and stop letting all these fakers through. (I get that fluff articles are fluff and not real, but come on, don't exaggerate in the other direction either; PhD-level stuff at MIT is about the realest you are going to get for AI research stuff)

MIT appears to know exactly what those people are doing, unlike you. That's not AI research, it's regular old robotics and automation research. UI and UX are very important for practical applications, but you (and the Verge) are the ones trying to reframe "you can program the robot's motion data by clicking and dragging on a 3D model" as some kind of AI breakthrough.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

It is literally AI research in their AI program by a researcher that calls it learning.

"C-LEARN: Learning geometric constraints from demonstrations for multi-step manipulation in shared autonomy"

Abstract—Learning from demonstrations has been shown to be a successful method for non-experts to teach manipulation tasks to robots. These methods typically build generative models from demonstrations and then use regression to reproduce skills. However, this approach has limitations to capture hard geometric constraints imposed by the task. On the other hand, while sampling and optimization-based motion planners exist that reason about geometric constraints, these are typically carefully hand-crafted by an expert.

The MIT news release is pretty clear that by "learning" they mean "programming the robot". It refers to the existing methods of programming a robot to do things (feeding motion-capture data to the bot, or having programmers manually code in the specific movements) as "learning", and the C-LEARN approach is meant to blend those two techniques of "teaching" robots in order to improve the user experience and avoid some of the weaknesses of each method. It's still fundamentally a human user programming an exact set of movements that the robot faithfully replicates. It has some nice ideas that could make setting up industrial robots easier, faster, and less dependent on expertise, but the use of jargon like "learning" doesn't change the fact that it appears to be about as AI as an old Lego Mindstorms set.
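For what it's worth, the record-and-replay workflow being described is simple enough to sketch. This is a hypothetical illustration of keyframe-style robot programming in general, not C-LEARN's actual code; all the names here are made up:

```python
# Hypothetical sketch of keyframe-style robot "teaching": a human
# demonstrates waypoints (e.g., by dragging a 3D model around), the
# system stores them, and playback just interpolates between them.
# An illustration of the general technique, not C-LEARN itself.

from dataclasses import dataclass, field

@dataclass
class TaughtMotion:
    keyframes: list = field(default_factory=list)  # joint-angle tuples

    def record(self, joint_angles: tuple) -> None:
        """Store a pose captured from the demonstration UI."""
        self.keyframes.append(joint_angles)

    def playback(self, steps_between: int = 10):
        """Yield interpolated poses; the robot replays them verbatim."""
        for a, b in zip(self.keyframes, self.keyframes[1:]):
            for i in range(steps_between):
                t = i / steps_between
                yield tuple(x + t * (y - x) for x, y in zip(a, b))
        if self.keyframes:
            yield self.keyframes[-1]

motion = TaughtMotion()
motion.record((0.0, 0.0, 0.0))   # "taught" poses from the GUI
motion.record((0.5, 1.2, -0.3))
for pose in motion.playback():
    pass  # a real system would send each pose to the robot controller
```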

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

It's literally what AI is. It's literally a paper from the MIT Artificial Intelligence Laboratory by a woman who is getting a PhD studying machine learning and AI. That is literally what AI is. It's not demon summoning where they call spirits into golems or something; it's just "boring old programming" aimed at a set of tasks that are easy for humans and hard for machines. That is literally what artificial intelligence is.

You appear to have left a couple of words off the name of the "Computer Science and Artificial Intelligence Laboratory". Which isn't surprising, since "but it's the Artificial Intelligence Laboratory" is basically the only reasoning you've offered for how this qualifies as a revolutionary AI breakthrough, rather than an improved user experience for entering tasks into existing factory robots.

As the paper itself states, C-LEARN is just "better assisted planning and interaction techniques to enable operators to efficiently and remotely control high-DoF robots". It's robotics, and it's certainly extremely relevant to automation, but that doesn't make it AI!

Main Paineframe
Oct 27, 2010

Ratoslov posted:

For the edification of the thread, could you perhaps explain what the difference is between AI and plain old programming?

Isn't this the wrong question? What you should be asking is what my personal definition of "artificial intelligence" is, since basically everyone in this thread seems to have a completely different definition for that and a number of other words and phrases. I don't really intend to go in-depth on it, since it seems like we can't go half a page without someone redefining a word. But for the purposes of this thread, I'd say that 'when a machine mimics "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving"' is close enough to the commonly understood meaning of AI. If you're going to claim that literally all programming counts as AI, then you've expanded the field of AI to cover everything from ordinary pocket calculators to V2 rockets.

Rastor posted:

Berkeley researchers teach computers to be curious

I'm sure Main Paineframe will soon explain that this "appears to be about as AI as an old Lego Mindstorms set".

Actually, it seems to be a decent attempt at slightly broadening the horizons of machine learning? You've got to learn to read beyond the headline, man.
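For anyone wondering what "curious" means mechanically: work in this vein typically rewards the agent for the prediction error of a learned forward model, so poorly-understood states look attractive. A minimal sketch under that assumption - illustrative toy code, not the Berkeley implementation:

```python
# Minimal sketch of curiosity as intrinsic reward: the agent learns a
# forward model predicting the next state, and states it predicts badly
# yield high reward, nudging it toward the unfamiliar. Illustrative only.

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1    # toy linear forward model

def predict_next(state, W):
    return state @ W

def curiosity_reward(state, next_state, W):
    """Intrinsic reward = squared prediction error of the forward model."""
    err = next_state - predict_next(state, W)
    return float(err @ err)

def update_model(state, next_state, W, lr=0.01):
    """One gradient step, so familiar transitions become boring over time."""
    err = predict_next(state, W) - next_state
    return W - lr * np.outer(state, err)

s, s_next = rng.normal(size=4), rng.normal(size=4)
print(curiosity_reward(s, s_next, W))   # high: the transition is novel
for _ in range(200):
    W = update_model(s, s_next, W)
print(curiosity_reward(s, s_next, W))   # low: the model has learned it
```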

Main Paineframe
Oct 27, 2010

Tei posted:

Sorry, I am not informed about that. So home printers killed jobs?

I still see people writing notes by hand or using Comic Sans. People don't seem to care about the printer; they replace it with a pen if they need to. I was not aware it had an effect that big.

If you wanted a hundred flyers for your lost cat or something in the days before home printers, you had to either individually write out all one hundred flyers or go to a print shop and work with them to design and print those flyers. Nowadays, even if you don't have your own printer, you can use desktop publishing software (or even simpler tools) to design it yourself and print it off at a library or Kinko's for less than a buck. It's one of those things where a lot of jobs and expertise were automated away, but it happened just long enough ago that people don't even realize it.

Main Paineframe
Oct 27, 2010

Malcolm XML posted:

Uh dude, it's a pure ROI thing, it's still cheaper to deal with humans than kiosks and the human

The tech isn't new, it's old as heck, and the cheap cost of labor has kept it from spreading. Look at automated checkouts -- they are barely functional and usually need a human to kick them into shape

But some local fast food places have been successfully using tablet kiosks for ordering for years, so it's nothing new to the industry

Not really. After all, a kiosk is essentially what the cashier uses to punch in your order these days. It's more likely that the technology, process, and interface simply aren't quite there yet. A cashier is trained to use their POS machine, but letting customers enter their own orders means the kiosk has to be easy and quick for someone who just walked in off the street, has never been to a McDonald's before, and has three screaming children with no idea what they want. That's not an insurmountable issue, especially for places with simple menus, but the problem can take some time to get a grip on. The hardware isn't particularly expensive anymore, even compared to a minimum-wage employee - they just haven't quite gotten the experience right yet. As others have pointed out, they'll probably end up on app ordering, which takes a lot of the problematic factors out of the picture.

Malcolm XML posted:

Hmm yea, just like the automobile destroyed civilization when it put the buggy whip makers out of work. I look forward to Mad Max, but everything is done by kiosk ordering

That said, online shopping is killing retail, so who knows

https://mobile.nytimes.com/2017/06/25/business/economy/amazon-retail-jobs-pennsylvania.html?_r=0&referer=https://www.google.com/

A hundred years ago, industrialization was causing vast improvements in resource extraction, production, and shipping, leading to massive expansions in demand and enormous growth in the economy as it sought to supply the global middle class. The same steam engines that eliminated so many manual-labor jobs also made it possible to ship goods across the US and across the ocean far faster and more efficiently, for example, revolutionizing global trade and making national-scale business far easier.

Today, on the other hand, there's no comparable economic expansion. Computers and the internet have revolutionized entertainment and logistics, but that has largely served to streamline companies' operations rather than generating large amounts of consumer-level demand - and most of the new demand has been for infinitely-reproducible data rather than physical goods that need people to create them.

Main Paineframe
Oct 27, 2010

Owlofcreamcheese posted:

If you want to name a future common job, name a current ultra-elite job that only a few super special people can do. That has usually been a preview of what jobs end up being common: stuff people want but that only the elites can provide, while naming their exorbitant price and telling everyone only they can do it because they were chosen by god.

Your kid will grow up to be a machine-aided neurosurgeon, and everyone will have reliable, cheap access to good surgery, while the people who today would have been neurosurgeons declare that was just peasant's work anyone could do and the true elites do ultra quantum nano surgery or whatever.

Your choice of example for "ultra elite job that only a few super special people can do" is an odd one, since the neurosurgeon shortage is not a matter of insufficient talent. The prime driver of the shortage is a thin pipeline: lack of funding and the medical profession's tight controls over entry mean that there simply aren't enough residencies available. It's no surprise there aren't enough neurosurgeons when only two hundred people can begin a neurosurgery residency each year.

Tei posted:

Software development is about talking with people.

There's a writing-software part, but that's the easy part.

Maybe, but most of the time and manpower goes into writing the software. If all you had to do was punch customer requirements into a magic coding machine and let it do the rest, then you could replace five developers working for three weeks with a single sales engineer putting in a day or two's worth of work.

Main Paineframe
Oct 27, 2010

Paradoxish posted:

This is what I was trying to drive at. Sooner or later, the process of abstraction and simplification leads to the question of "what problem am I trying to solve", and it just so happens that's actually like 90% of the work of programming anyway.

Well, it's 90% of the work of programming now, after four decades of abstracting, automating, and standardizing away most of the rest of the work that went into programming. It turns out you can save a lot of man-hours when you can drop a half-dozen frameworks into your project and spend most of the dev time hooking them together instead of coding the functionality yourself. The amount of manpower needed to do a given amount of programming is already far lower than it used to be; the expansion of programming jobs is only because the amount of programming people want done has expanded exponentially as computing devices have gotten ever cheaper, smaller, and more personal. And I'm sure that trend will continue forever at the same rate with no slowdown ever!

Of course, the catch here is that you don't need an actual programmer to answer the question "what problem do I want solved". There are plenty of laymen around to hash that out, including the project manager, customer relations, the UX designer, and various management folks. The job of the programmer in any sane organization (which, admittedly, tends to be the exception rather than the rule) is to take what that group comes up with and turn it into code.
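As a concrete illustration of the "drop a framework in and hook it up" point above: a complete, working JSON endpoint in a mainstream Python framework (Flask here; any equivalent would do) is a few lines, with the framework swallowing all the socket handling, HTTP parsing, routing, and serialization you'd otherwise write yourself:

```python
# A complete, working JSON API endpoint in a few lines of Flask.
# Everything below the import - sockets, HTTP parsing, threading,
# routing, serialization - is handled by the framework.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8080)  # decades ago, this alone was weeks of work
```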

Main Paineframe
Oct 27, 2010

Paradoxish posted:

What you seem to be missing, and also why this is a pretty important topic in the wider topic of automation, is that creating a specification and feeding it into a computer is literally what programming is. The only reason we consider programming to be a highly specialized skill is because computers aren't capable of understanding the higher level business requirements produced by non-technical project members, so you need an additional layer of translation.

Getting to the point where that translation isn't required anymore isn't just a matter of simplifying tools. We already have simple tools for writing code, including lots of pretty intuitive visual tools. The problem is that doing anything complex (and keeping it maintainable) with those tools still requires the exact same specialized skills as working with written code. They make it easier to teach a lot of the basic concepts and sometimes make it easier to offload simple work that it doesn't make sense to pay a programmer to do, but that's it. The leap you're describing to move beyond that point is just absolutely massive.

edit- not to mention that you aren't so much talking about a world where programmers are obsolete as you are talking about one where everyone is a programmer. I suspect that's a world where most non-creative, non-physical work ends up being automated at a hilariously rapid pace.

I'm not talking about a world where programmers are obsolete. I'm talking about a world where you need one programmer-hour to do an amount of work that currently takes ten programmer-hours. In that world, you'd better hope that ten times as much programming needs to be done, since otherwise that means a decline in the number of programming jobs.

Twenty years from now, there'll still be programmers. There'll just be a lot fewer of them, because automation will have reduced the manpower necessary to do the same amount of work.

The real threat of automation isn't that it'll completely eliminate entire careers. It's that it'll drastically reduce the manpower needed to do those jobs, forcing tens of thousands of people out of that field and dumping them onto a job market with hundreds of thousands of others who've similarly been booted from their careers.

We still have bank tellers around, even in a world with both ATMs and online banking...but the number of teller jobs is projected to shrink by 40,000 or so over the next decade, and that's tens of thousands of jobs that have to be made up somewhere else. Where? It certainly won't be in retail, which is shedding tens of thousands of jobs per month right now as retailers cut back their physical presence; that trend will slow sooner or later, but it won't reverse.
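The underlying arithmetic is just headcount = demand / productivity - a toy calculation with made-up numbers, to make the "ten times as much programming" condition explicit:

```python
# Toy break-even calculation: headcount = demand / productivity.
# If productivity rises 10x and demand doesn't, 90% of the jobs go.
def headcount(units_of_work_demanded: float, units_per_worker: float) -> float:
    return units_of_work_demanded / units_per_worker

today = headcount(1000, 1)        # 1000 programmers (hypothetical)
automated = headcount(1000, 10)   # same demand, 10x productivity -> 100
print(today, automated)

# Demand must also grow 10x just to keep employment flat:
print(headcount(10_000, 10))      # back to 1000
```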

Tei posted:

This ignores that software is a machine. Sure, any programmers can build something to spec, like you can get car engineers and build a car. But if you want something that will last 15 years, that will rarely need repairs, that won't create extra work, that will be easy to expand, and so on, you need somebody who can think about the long-term side effects of every decision, small and big, and make architecture decisions that other people will follow. In most companies, I suspect, that is what the senior developers do.

I make this comment, but it's a bit offtopic in this thread. Sorry.

That's what the senior developers do, yes (at least in theory). But not every developer is a senior developer. What happens to the programming workforce when the only developer a company needs is that one senior developer, who can feed the plans and designs they come up with to a suite of tools and maybe a temporary intern instead of a team of professional developers?

Main Paineframe
Oct 27, 2010

Paradoxish posted:

I don't really disagree with anything you're saying here, except maybe on how long it'll take to get there.

What I have been saying, however, is that this is pretty much the end game for most white collar "cognitive" work. Most programmers still spend a huge amount of time solving very trivial problems. Being able to have a single programmer or extremely small team build and maintain an application that solves a highly complex business problem is a game changer on par with the introduction of computer spreadsheets.

Sure, but "when will it happen" isn't really a useful question. It will happen, sooner or later. The more difficult - and more important - question is "what will we do when it happens"?

Cicero posted:

Thanks to higher-level languages and frameworks and IDEs, you need like one programmer-hour to do the work of like a hundred programmer-hours from 50 years ago, and yet instead of demand for programmers going down, it went up, drastically, by at least an order of magnitude. So yeah, it's entirely possible that'll happen, because it's happened before.

But yes, in the sufficiently long run, most programmer jobs will definitely go away. It probably just won't be until most white-collar jobs are under the same kind of automation pressure.

It's no surprise that there's higher demand for programmers today compared to 50 years ago. After all, 50 years ago, a computer was the size of a room. Today, not only do we have personal computers, but we have computers in our phones, our cars, our TVs, and everything else. On top of that, heavy platform standardization has opened up the software market in a massive way compared to the pre-x86 era. Let's not even mention the existence and evolution of the internet. That doesn't mean that the demand will continue to increase forever, though - we've already shoved computers into pretty much everything we could think of to shove a computer in, whether it's useful or not. Although demand rose fast enough to outpace the losses from automation in the past, there's no guarantee that it will continue - that's up to the specific industry and market conditions.

Main Paineframe
Oct 27, 2010

asdf32 posted:

I just had to wait 30 minutes for my TV's OS to update tonight. Most TVs have probably had microcontrollers and small processors for decades, but that's not the end of progress - now they have full-blown processors and OSes, with a sustaining engineering team somewhere continuing to push out updates, and there isn't any indication this trend of more computers with more processing in more things is going to end anytime soon.

Well, yeah. Most TVs these days are just purpose-built computers with a collection of TV-related "apps", just like smartphones are tiny computers with a cell radio and a phone app. Hell, even soda dispensers run on Windows these days. But the trend of more computers in more things is already slowing down, because somebody's already put computers in just about everything it makes sense to put computers in - as well as a bunch of things it doesn't really make sense to put a computer in. We're in a bit of an IoT bubble, and eventually it will pop and things will return to sanity. The world isn't ready for app-powered smart nightlights that change color when you have unread emails, let alone smart toilets that analyze your poop.

Main Paineframe
Oct 27, 2010
Uber isn't going to release the LIDAR data publicly, because there's no possible way it could make them look good - after all, they can't fall back on the "it was dark" excuse with LIDAR. Either the LIDAR saw the pedestrian and the car ran them over anyway, or the LIDAR couldn't see a human-sized figure in the middle of the road in basically ideal conditions for a LIDAR. Either way, it's in Uber's best interest to insist that the car wasn't any worse than a human driver, and that means avoiding calling any attention to the car's sensory advantages.
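To make the "human-sized figure in basically ideal conditions" point concrete: at its crudest, picking a pedestrian-sized obstacle out of a LIDAR point cloud is just geometry. A heavily simplified toy sketch with fabricated points and a naive grid test - nothing like a production perception stack:

```python
# Toy illustration: flag human-sized obstacles in a LIDAR point cloud.
# Fabricated data and a naive grid cluster; real perception stacks are
# far more sophisticated, which is rather the point of the complaint.
import numpy as np

rng = np.random.default_rng(1)
road = rng.uniform([-20, -3, 0.0], [20, 3, 0.05], size=(500, 3))       # ground returns
person = rng.uniform([4.8, -0.3, 0.0], [5.2, 0.3, 1.7], size=(60, 3))  # upright cluster
cloud = np.vstack([road, person])

def find_upright_obstacles(points, cell=0.5, min_height=1.0):
    """Bin points into a 2D grid; report cells whose returns stack tall."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    hits = []
    for key in {tuple(k) for k in keys}:
        mask = (keys == key).all(axis=1)
        height = points[mask, 2].max() - points[mask, 2].min()
        if height >= min_height:
            hits.append((key, round(height, 2)))
    return hits

print(find_upright_obstacles(cloud))  # the ~1.7 m tall cluster stands out
```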
