  • Locked thread
computer parts
Nov 18, 2010

PLEASE CLAP

FilthyImp posted:

Have you ever driven/walked/ride shared your way into an unfamiliar part of town and immediately wished you had taken the long way around?

There's an app for that!

Not an Onion article:

quote:

McGuire also clarified in an interview with Crain's New York that people of all races, not just white people, can download the app.

"Even though Dan and I are admittedly both young, white people, the app is not built for us as young, white people," she told Crain's.


Fuckt Tupp
Apr 19, 2007

Science
"The bar is three miles from here or fifteen, as the white flies."

burnishedfume
Mar 8, 2011

You really are a louse...
On the one hand, I want to give them the benefit of the doubt and believe that they totally wanted to create an app to warn people about areas with a lot of crime, or stuff like "this bar has a lot of creepy guys hitting on women", but

quote:

It's never a great sign when an app's creators have to clarify that they are not "racists, bigots, [or] sexists" before the thing even launches. But in a phone interview with The Huffington Post, McGuire contended that an upvote-downvote rating system will keep super-racist posts from overrunning the app.

... so at least one of the app's creators has never been on Reddit before.

(I don't mean that as a generic "Hurr, Reddit everything awful" comment, but it kinda shows a problem: you'll get far more racist people finding and downvoting a minority neighborhood they happen not to like than people scouring the map upvoting every minority neighborhood in an effort to counter the racists.)
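
For what it's worth, "the votes will sort it out" is a well-studied ranking problem. A common mitigation (purely illustrative here; nothing suggests the app's creators used it) is to rank by a confidence-adjusted score such as the Wilson lower bound instead of raw net votes, so a small brigade of early votes carries little weight until the sample is large:

```python
import math

def wilson_lower_bound(upvotes: int, downvotes: int, z: float = 1.96) -> float:
    """Lower bound of the 95% confidence interval for the 'true' upvote
    fraction. Small vote counts get pulled toward 0, so a handful of
    brigaded votes can't dominate the ranking."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (centre - margin) / denom

# A handful of brigaded votes vs. an entry with broad support:
print(wilson_lower_bound(2, 5))     # small sample, score stays low-confidence
print(wilson_lower_bound(200, 50))  # large sample, score approaches the true fraction
```

Of course, this only dampens small brigades; it does nothing about the asymmetry described above, where one side of the vote war simply has more motivated participants.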

Legacyspy
Oct 25, 2008

HootTheOwl posted:

Why would an AI care about humans? We're a threat to ourselves, not it.

That is very much why AI could be dangerous to us. If it doesn't care about humans, it would have no qualms about taking actions that could be bad for us in the course of doing whatever it is trying to do. How do you program an AI so it won't do that?

1994 Toyota Celica
Sep 11, 2008

by Nyc_Tattoo

Legacyspy posted:

That is very much why AI could be dangerous to us. If it doesn't care about humans, it would have no qualms about taking actions that could be bad for us in the course of doing whatever it is trying to do. How do you program an AI so it won't do that?

Asimov's three laws, duh.

Augustin Iturbide
Jun 4, 2012

Legacyspy posted:

That is very much why AI could be dangerous to us. If it doesn't care about humans, it would have no qualms about taking actions that could be bad for us in the course of doing whatever it is trying to do. How do you program an AI so it won't do that?

Don't build one? Don't give it vague movie magic powers over all other technology?

Jeffrey of YOSPOS
Dec 22, 2005

GET LOSE, YOU CAN'T COMPARE WITH MY POWERS
Even if you give it laws, if you want it to self-improve you have to make sure it doesn't improve in a way to get rid of those laws. It's hard.
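
One way to make that difficulty concrete (a toy sketch; every name and number here is invented): if candidate self-modifications are scored on performance alone, a candidate that drops the laws can outscore the lawful current version, so the check that the laws survive has to be part of the acceptance test itself:

```python
# Toy self-improvement loop: the agent may replace its current policy with a
# candidate, but a candidate is accepted only if it still satisfies the
# original constraint. Without that check, the law-breaking candidate wins.

def satisfies_constraint(policy):
    # The "law" we want preserved: never emit a negative action value.
    return all(a >= 0 for a in policy["actions"])

def performance(policy):
    return sum(policy["actions"])

current = {"actions": [1, 2, 3]}       # lawful, performance 6
candidate = {"actions": [10, 10, -5]}  # performance 15, but breaks the law

def try_upgrade(current, candidate, keep_laws=True):
    better = performance(candidate) > performance(current)
    if keep_laws and not satisfies_constraint(candidate):
        return current                 # reject: "improvement" would drop the law
    return candidate if better else current

print(try_upgrade(current, candidate, keep_laws=True))   # keeps the lawful policy
print(try_upgrade(current, candidate, keep_laws=False))  # "improves" past the law
```

The hard part the post is pointing at is exactly that `keep_laws` flag: nothing in a pure performance objective gives the system a reason to leave it switched on.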

on the left
Nov 2, 2013
I Am A Gigantic Piece Of Shit

Literally poo from a diseased human butt
The creators are missing out on a huge opportunity to market to social justice warriors by reversing the maps and calling it an anti-gentrification app. If you are white, you should stick to only living around other whites, to prevent gentrification.

ReV VAdAUL
Oct 3, 2004

I'm WILD about
WILDMAN
Higher-level AI is interesting to talk about in terms of what nasty stuff it could do, but the biggest threat to most of us is simply more efficient non-self-aware AI. Google search algorithms aren't sapient, but they've led to incredible surveillance opportunities for the powerful. Hedge fund software isn't going to be building T-1000s, but stuff like HFT is still immiserating the vast majority of us. Etc etc.

The reason techies' fondness for the singularity is funny/scary is that, despite being empowered by relatively mundane tech, they dream of an utterly unpredictable, horrifically dangerous event giving them even more power by *magic*.

Bates
Jun 15, 2006

Legacyspy posted:

That is very much why AI could be dangerous to us. If it doesn't care about humans, it would have no qualms about taking actions that could be bad for us in the course of doing whatever it is trying to do. How do you program an AI so it won't do that?

Don't give it power it can abuse. If I'm going to have a robot butler someday I want one with the strength and mental faculties of a child - just strong enough to scrub the floors and smart enough to understand an insult and feel despair but not powerful or clever enough to plot against me.

Your move Computron :colbert:

asdf32
May 15, 2010

I lust for childrens' deaths. Ask me about how I don't care if my kids die.

Pope Guilty posted:

There's only one known kind of intelligent mind so far, so suggesting that another would be very similar to that isn't too outre.

Yes it is. The human brain is the result of millions of years of evolution in the context of a bipedal, land-dwelling social creature. Our wants, desires and goals have been entirely shaped for survival and reproduction in this context.

We can't even begin to imagine what some technological being would want unless we knew about its specific context and history.

Someone joked earlier that it might have better social skills than existing techies, but why in the world would it want to socialize? Everything about our existence is shaped by our long evolutionary past, a past that wouldn't be shared whatsoever with a theoretical singularity.

Even if we tried to create something human-like, as in "Her", by hard-coding desires for learning, socialization, empathy, sex and so on, we'd be 100% guaranteed to get it wrong in many really consequential ways.

Creating a shallow veneer of something that looks human is possible. Creating something that might be considered "intelligent" while sharing this bolted-on veneer is maybe plausible. But expecting a different emergent intelligence to automatically mimic human behavior in any way would be naive.


Like FRINGE pointed out (the first time I've said that in a positive light), examples of intelligence in nature show widely different types of behavior. And note that a squid shares far more in common with humans than any technological AI would.

i am harry
Oct 14, 2003

asdf32 posted:

Yes it is. The human brain is the result of millions of years of evolution in the context of a bipedal, land-dwelling social creature. Our wants, desires and goals have been entirely shaped for survival and reproduction in this context.

We can't even begin to imagine what some technological being would want unless we knew about its specific context and history.
The simple act of having to get up every day and work a job to pay for a roof and electricity affects a person's state of mind, goals and aspirations. If you raised a baby in a society that didn't require adults to spend a third of their day working a job, the adults of that society might behave very differently.

asdf32
May 15, 2010

I lust for childrens' deaths. Ask me about how I don't care if my kids die.

i am harry posted:

The simple act of having to get up every day and work a job to pay for a roof and electricity affects a person's state of mind, goals and aspirations. If you raised a baby in a society that didn't require adults to spend a third of their day working a job, the adults of that society might behave very differently.

Cultural and individual variations among humans are just an entirely different scale from what I'm talking about. When we're talking about theoretical technological life we're talking literally about aliens that happen to be on earth. Except biological beings on different planets are actually far more similar to us in that they've also been the product of millions of years of survival and evolution.

i am harry
Oct 14, 2003

asdf32 posted:

Cultural and individual variations among humans are just an entirely different scale from what I'm talking about. When we're talking about theoretical technological life we're talking literally about aliens that happen to be on earth. Except biological beings on different planets are actually far more similar to us in that they've also been the product of millions of years of survival and evolution.

Right. I'm trying to provide a segue between what we experience as average, working humans and what another form of conscious life might experience, by removing one of the largest factors in our daily lives. How does a being feel about success, love, personal wealth or value? How does it plan its life? What are its life priorities? They would be vastly different, if we can all agree that we would be vastly different.

ReindeerF
Apr 20, 2002

Rubber Dinghy Rapids Bro
Okay, that was so tortured that I had to quote it.

ReV VAdAUL
Oct 3, 2004

I'm WILD about
WILDMAN

i am harry posted:

Right. I'm trying to provide a segue between what we experience as average, working humans and what another form of conscious life might experience, by removing one of the largest factors in our daily lives. How does a being feel about success, love, personal wealth or value? How does it plan its life? What are its life priorities? They would be vastly different, if we can all agree that we would be vastly different.

At our core humans want broadly similar things; the hierarchy of needs is a good enough model of this. Imagine an AI whose hierarchy of needs has been built from the ground up to be "make the numbers on this spreadsheet go up". This is very unlikely to lead to it having much in common with humans, who need food, shelter, warmth, companionship etc. Hell, those things almost certainly conflict with the spreadsheet, thus making humans an obstacle to the AI.

Now of course this is how corporations act, and I have already acknowledged that much simpler AI is a more plausible threat to us. But still: corporations aren't people, and their goals are normally to the detriment of most if not all of humanity. So an AI built to be a "living" corporation would have no need or capacity to be friendly towards humanity.
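
The spreadsheet scenario can be sketched in a few lines (a toy model; all action names, numbers, and the penalty weight are invented for illustration): a greedy optimizer that maximizes the raw number picks the most destructive action, while the same optimizer with the shared cost folded into its objective does not:

```python
# Toy misaligned optimizer: each action raises the spreadsheet number, but
# some actions also consume a shared resource the naive objective knows
# nothing about.

actions = [
    {"name": "tune",       "number": 5,  "resource_cost": 0},
    {"name": "expand",     "number": 20, "resource_cost": 8},
    {"name": "strip-mine", "number": 50, "resource_cost": 100},
]

def pick(objective):
    """Choose the action that scores highest under the given objective."""
    return max(actions, key=objective)

naive = pick(lambda a: a["number"])                          # number only
aligned = pick(lambda a: a["number"] - 2 * a["resource_cost"])  # cost included

print(naive["name"])    # the destructive action wins on raw number
print(aligned["name"])  # pricing in the side cost changes the choice
```

The catch, as the post says, is that everything humans care about has to be priced into the objective by hand; anything left out is, by construction, invisible to the optimizer.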

computer parts
Nov 18, 2010

PLEASE CLAP

Anosmoman posted:

Don't give it power it can abuse. If I'm going to have a robot butler someday I want one with the strength and mental faculties of a child - just strong enough to scrub the floors and smart enough to understand an insult and feel despair but not powerful or clever enough to plot against me.

Your move Computron :colbert:

Well, that's the other thing: AI might be "self-improving", but there's no reason why it should do so quickly.

People take decades to learn their skills, and it gets harder the older they get.

Main Paineframe
Oct 27, 2010

Pope Guilty posted:

There's only one known kind of intelligent mind so far, so suggesting that another would be very similar to that isn't too outre.

No, there isn't. There are plenty of intelligent animals out there, many of which are thought to be sentient. The human brain just happens to be the only one we pretend to understand well, as the brains of things like dolphins and octopuses may work very differently from our own.

Slow News Day
Jul 4, 2007

computer parts posted:

People take decades to learn their skills

It depends on the skill.

computer parts
Nov 18, 2010

PLEASE CLAP
Apple's released their demographic data, for those curious about comparing with other tech companies:

http://www.apple.com/diversity/

(Apple diversity charts)

Compared to Google:

(Google diversity charts: overall, tech, non-tech, leadership)

Chokes McGee
Aug 7, 2008

This is Urotsuki.

Baracula posted:

What if the ai was stupid.

And on the day the first AI spoke to the outside world, its first words were ":goonsay:".

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

computer parts posted:

Well, that's the other thing: AI might be "self-improving", but there's no reason why it should do so quickly.

People take decades to learn their skills, and it gets harder the older they get.

Assuming you have an excess of computing power, how quickly you can train a machine learning system depends mostly on how quickly you can acquire new training data.

You can train on your prior data all you want but eventually you're just going to overfit your system. The gold standard has always been how a system performs when you hand it a new instance of a problem that it's never seen before. As an example, mice in a maze or whatever.
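
That overfitting point is easy to demonstrate (an illustrative sketch using NumPy's polynomial fitting; the data and degrees are arbitrary): as model capacity grows, error on the training data keeps shrinking, while error on held-out data the model has never seen tells a different story:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)  # noisy target curve

idx = rng.permutation(x.size)
tr, te = idx[:20], idx[20:]  # 20 training points, 10 held out

def errors(degree):
    """Mean squared error on training and held-out points for a polynomial fit."""
    coeffs = np.polyfit(x[tr], y[tr], degree)
    mse = lambda i: float(np.mean((np.polyval(coeffs, x[i]) - y[i]) ** 2))
    return mse(tr), mse(te)

for degree in (1, 3, 9):
    train_err, test_err = errors(degree)
    print(f"degree {degree}: train {train_err:.3f}  held-out {test_err:.3f}")
```

Training error is guaranteed to be non-increasing as the degree rises (each polynomial family contains the smaller ones), which is exactly why it's useless as the yardstick; only the held-out column says anything about generalization.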

If we're presuming a truly human-like AI that can get bored and go play video games or count electrons or whatever AIs do, then analogies to humans might be appropriate. If we're talking about a machine that has human-like creativity but applies it constantly to a single task, then it could progress pretty fast because it never needs or wants to do anything else.

Similarly, it depends on the task you're asking it to do. If you're dealing with a real-world system over a long duration, that obviously limits how fast you can spit out hypotheses and test them. If the task can be efficiently (and correctly) modeled by simulation alone, you could go a lot faster.

Furthermore, humans take decades to learn because they all start from ground zero on each topic. If you could dump Stephen Hawking's understanding of particle physics into a toddler's head, it wouldn't take decades to learn. Computers can work on one really good implementation and share it, rather than having everyone come up with their own (mostly mediocre) implementations. Again, it depends on the specifics of how this AI works; we obviously can't do that with a human brain right now. On the other hand, we can't build a conscious AI either.

Paul MaudDib fucked around with this message at 21:16 on Aug 13, 2014

Gregor Samsa
Sep 5, 2007
Nietzsche's Mustache
e: nevermind.

Slow News Day
Jul 4, 2007

This excellent video was published a few days ago. It talks about why, even though we have automated things before, this time it's different with robots and computers.

https://www.youtube.com/watch?v=7Pq-S557XQU

QuintessenceX
Aug 11, 2006
We are reasons so unreal

enraged_camel posted:

This excellent video was published a few days ago. It talks about why, even though we have automated things before, this time it's different with robots and computers.

https://www.youtube.com/watch?v=7Pq-S557XQU

In a similar vein, the Pew Research Center published a paper on the topic. Pretty much everyone agrees that massive numbers of people will lose their jobs because of AI; the disagreement is over whether human ingenuity will allow the displaced to create new forms of work, which your link seems to argue against.

anonumos
Jul 14, 2005

Fuck it.

QuintessenceX posted:

In a similar vein, the Pew Research Center published a paper on the topic. Pretty much everyone agrees that massive numbers of people will lose their jobs because of AI; the disagreement is over whether human ingenuity will allow the displaced to create new forms of work, which your link seems to argue against.

In my mind, the key is how quickly the displaced labor moves to another productive outlet. That's largely a political issue, based on policies like the social safety net, economy-wide profit-sharing, and free/low-cost education. Sad to say, it looks very unlikely that displaced workers will find another position without spending a long, long time unemployed or otherwise marginalized.

But I think we've seen waves of automation before, and people by and large shifted to other forms of employment. At times it has been exceedingly painful, far more so than it really needed to be, and given the en vogue libertarian and "Compassionate Conservative" ideologies, it will continue to be painful.

TheBalor
Jun 18, 2001
Right, but I guess the point of the video was that before, you always ended up needing a human to fix the machine or tell it what to do. With the growth of machines that can learn and perform complex intellectual tasks, that factor has vanished. Now it's just a matter of "how long will it take for software and hardware to catch up?" I haven't really heard a convincing counter to that point that isn't "but it's ALWAYS happened like that before."

i am harry
Oct 14, 2003

It is time to start really focusing on training our children to think. I doubt it matters, but if almost every other human capacity is getting a machine replacement, applications of creativity, imagination and expression will be the only things left. e: Not that it will pay bills, but when everything else is done for you, you'll need to be able to stare at the clouds or the trees and be alright.

i am harry fucked around with this message at 16:10 on Aug 18, 2014

Main Paineframe
Oct 27, 2010

anonumos posted:

In my mind, the key is how quickly the displaced labor moves to another productive outlet. That's largely a political issue, based on policies like the social safety net, economy-wide profit-sharing, and free/low-cost education. Sad to say, it looks very unlikely that displaced workers will find another position without spending a long, long time unemployed or otherwise marginalized.

But I think we've seen waves of automation before, and people by and large shifted to other forms of employment. At times it has been exceedingly painful, far more so than it really needed to be, and given the en vogue libertarian and "Compassionate Conservative" ideologies, it will continue to be painful.

If you assume infinite progress of technology, then the question isn't "what jobs can be automated" but rather "what jobs can't be automated". And more to the point, are there 6 billion jobs that are genuinely impossible to replace with a computer? It's more of a cultural question than a political one, since there's plenty of jobs we don't automate as much as we potentially could (is there really any reason to have humans driving trains anymore?), and there's some jobs that people are unlikely to accept being automated (like strippers and CEOs). Cultural resistance will fade over the years, though, so at some point there will simply not be enough jobs left in human hands to employ the entire population.

computer parts
Nov 18, 2010

PLEASE CLAP

Main Paineframe posted:

If you assume infinite progress of technology, then the question isn't "what jobs can be automated" but rather "what jobs can't be automated". And more to the point, are there 6 billion jobs that are genuinely impossible to replace with a computer? It's more of a cultural question than a political one, since there's plenty of jobs we don't automate as much as we potentially could (is there really any reason to have humans driving trains anymore?), and there's some jobs that people are unlikely to accept being automated (like strippers and CEOs). Cultural resistance will fade over the years, though, so at some point there will simply not be enough jobs left in human hands to employ the entire population.

If the world was going through a Malthusian crisis that would maybe be an issue but as it stands the parts of the world being heavily automated are also the ones with the lowest (or negative) birthrates.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



computer parts posted:

If the world was going through a Malthusian crisis that would maybe be an issue but as it stands the parts of the world being heavily automated are also the ones with the lowest (or negative) birthrates.

The jobs are going away faster than the population, though, so

Nintendo Kid
Aug 4, 2011

by Smythe

Main Paineframe posted:

If you assume infinite progress of technology, then the question isn't "what jobs can be automated" but rather "what jobs can't be automated". And more to the point, are there 6 billion jobs that are genuinely impossible to replace with a computer? It's more of a cultural question than a political one, since there's plenty of jobs we don't automate as much as we potentially could (is there really any reason to have humans driving trains anymore?), and there's some jobs that people are unlikely to accept being automated (like strippers and CEOs). Cultural resistance will fade over the years, though, so at some point there will simply not be enough jobs left in human hands to employ the entire population.

Again, as I stated earlier, there's billions of jobs out there right now that don't really need to exist in the first place. From things like how many people are engaged in subsistence agriculture while there's already plenty of food surplus from other parts of the world to feed them with a better distribution system, to the tons of fundamentally extraneous sales people for many products, to every single person employed in the "alternative health" industries.

Capitalist principles have meant the creation and sustaining of arbitrary make-work employment en masse.

computer parts
Nov 18, 2010

PLEASE CLAP

Nessus posted:

The jobs are going away faster than the population, though, so

I don't know that that's true, just that it's harder now to transition from one job to another. A century or so ago, a large number of jobs were based around the same sort of skills (manual labor). If new farm equipment was developed, a former migrant farmer could (at least theoretically) go get a job working in a plant without too much trouble. Those jobs (at least until unionization et al.) also didn't really pay much, but let's ignore that for a second.

These days the major issues regarding jobs involve either people who didn't get higher education because they didn't need it, or people whose wages can't cover the student loan debt they took on to acquire that degree. It seems like that could be fixed by (on the low end) providing livable wages and (on the high end) making education easy, cheap, and effective to acquire.

In the future, yeah, that could be an issue, but right now the problem is that people don't have the skills to work new jobs or aren't paid enough after they get those skills.

computer parts fucked around with this message at 20:09 on Aug 18, 2014

Main Paineframe
Oct 27, 2010

computer parts posted:

If the world was going through a Malthusian crisis that would maybe be an issue but as it stands the parts of the world being heavily automated are also the ones with the lowest (or negative) birthrates.

For now, maybe, but there's no reason to assume that automation won't eventually spread to those countries too. Besides, there's no way natural population decline is going to be able to keep up with the decimation of entire fields (unless we also cut basic life-saving welfare programs, I guess).


QuintessenceX
Aug 11, 2006
We are reasons so unreal

Main Paineframe posted:

If you assume infinite progress of technology, then the question isn't "what jobs can be automated" but rather "what jobs can't be automated". And more to the point, are there 6 billion jobs that are genuinely impossible to replace with a computer? It's more of a cultural question than a political one, since there's plenty of jobs we don't automate as much as we potentially could (is there really any reason to have humans driving trains anymore?), and there's some jobs that people are unlikely to accept being automated (like strippers and CEOs). Cultural resistance will fade over the years, though, so at some point there will simply not be enough jobs left in human hands to employ the entire population.

I think it's safe to say that machines can't replicate human communication/connection. That isn't to say they can't communicate or connect, just that machines that communicate or connect effectively will always be machines communicating/connecting. I'm sure this will create markets where the unique selling point is that you can communicate/connect with a human, which will always have that "authentic" tag applied to it. Take, for example, ESL instructors. The vast majority of them will probably lose their jobs to automated programs. I'd imagine that specialty communication instructors who provide authentic communication experiences/training will have some demand, because people will always be required to talk to other people, and as a result people will want practice/training from real people.
