  • Locked thread
A Wizard of Goatse
Dec 14, 2014

Rush Limbo posted:

It's also the case in, say, video games and other AI simulations that they handle even fairly basic things like stairs incredibly badly, to the point where various workarounds get used, including having the AI just plain ignore such changes in elevation entirely and simply moving the character while the animation takes over.

Videogame AI is meant to be the cheapest possible thing that can move a lot of pixel terrorists from the spawn point to in front of your ray gun without clipping through the walls, though; the people programming it are working toward completely unrelated priorities from the guy making an accurate stair-climbing simulator, or the guy making a robot that can efficiently climb actual stairs. If you want a robot that can go up stairs, but it doesn't need to do it from 50 different angles simultaneously and also render a simulated chainsaw bloodspray in 1080p resolution with dynamic lighting effects on all the droplets, a $5 Arduino, some gyroscopes, and two or three hundred lines of code will do the trick. If you need it to climb stairs and also do everything else a human pair of legs might possibly successfully do in a lifetime, and on the fly generate and execute a reasonable response to leg emergencies the manufacturer never planned for, you're better off just getting on OkCupid and making a fresh human with its own built-in set of legs.
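For flavor, the guts of the "Arduino plus gyroscopes" approach really are that small. Here's a toy sketch (in Python rather than Arduino C, and with every number made up for illustration) of the complementary filter such a robot might run to estimate its tilt from a drifting gyro and a noisy accelerometer:

```python
import random

# Toy sketch of a gyro/accelerometer fusion loop; all constants and
# sensor values are illustrative, not a real robot design.

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Fuse a drift-prone gyro rate (deg/s) with a noisy accelerometer
    angle (deg): trust the gyro short-term, the accelerometer long-term.
    Returns the filtered pitch estimate after the last sample."""
    angle = accel_angles[0]  # start from the accelerometer reading
    for rate, accel in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel
    return angle

# Simulated robot tilted at a steady 15 degrees: the gyro reads roughly
# zero rate plus a 0.5 deg/s bias, the accelerometer reads 15 deg + noise.
random.seed(0)
gyro = [0.5 + random.gauss(0, 0.1) for _ in range(2000)]
accel = [15 + random.gauss(0, 2.0) for _ in range(2000)]
print(round(complementary_filter(gyro, accel), 1))
```

The filter converges to roughly the true 15-degree tilt despite the bias and noise; a real balancing robot would feed that estimate into a motor control loop, which is where the rest of those two or three hundred lines go.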

A Wizard of Goatse fucked around with this message at 23:41 on Nov 28, 2016


A Wizard of Goatse
Dec 14, 2014

Blue Star posted:

I don't think that's true. I think it's obvious that technological progress is slowing down and will probably stagnate in our lifetimes. Compare the first half of the 20th century to the second half: the first half saw way more progress. Cars, airplanes, electric power, nuclear energy, radio, telephones, television, x-rays, and much more all came out in the period between 1900 and 1950, give or take. But look at the period from 1950 to 2000: there's way less progress. Yeah, computers got smaller and faster; we got video games and cell phones and internet stuff. Visual effects in movies got better. And... that's about it.

The only really significant progress has been in computer chips. But now even that is ending, since Moore's Law will stop soon if it hasn't already. All the heady progress in computers that has been made over the past few decades will now stop. Computers in 30 years will probably be barely any better than computers today. Video game graphics probably aren't going to get any better. We're probably never going to be able to emulate a mammalian brain, even that of a mouse, let alone a human. And other fields, such as medicine, will be even slower. Drug development has slowed down dramatically.

2016 is basically 1986 except we got tablets and cellphones and social media. I think 2046 will be like 2016 almost exactly, at least technology-wise.

how old are you?

A Wizard of Goatse
Dec 14, 2014

Rush Limbo posted:

When I look at a cloud in a certain way it may look like the face of the Buddha. It would be absurd to suggest that the act of precipitation of water is actively trying to convey the form of Siddhartha.

Why should the cloud's inner monologue matter? Why should an artist's? The important thing in art is what you, the viewer, see; nobody values great intent, and you can only infer it from the result anyway.

An autistic AI with a rich inner life it cannot meaningfully convey wouldn't be much different from a pet rock. The problem with computer-generated art isn't the computer's presumable lack of a soul; it's that the output of any given algorithm is so basic, consistent, and unchanging that once the novelty of 'whoa haha look at all those eyeballs' wears off it's just white noise. I doubt even OOCC could amuse himself by looking at Deep Dream dogmonsters all day; once you've seen a couple you've pretty much seen all Deep Dream is capable of.

A Wizard of Goatse fucked around with this message at 17:40 on Nov 29, 2016

A Wizard of Goatse
Dec 14, 2014

Condiv posted:

the choices of the artist matter. i wouldn't read a novella written by a markov-chain because it would be unintelligible and meaningless. i would read slaughterhouse five though

Would Slaughterhouse-Five stop being good if you found out it had been generated by a computer script, though?

The whole discursion about authorial intent and 'meaning' is pointless; computer-generated art is bland on its own merits not because computers lack qualia deep within their circuits but because computer art all boils down to just repeating minor semirandom variations on the same simple task a bazillion times, based on the parameters humans feed them.
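The Markov-chain novella Condiv mentions is worth making concrete, because it illustrates the point exactly: everything such a program can ever say is a reshuffling of whatever corpus a human feeds it. A toy word-level chain (a sketch, not any particular real generator) fits in a few lines:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that have followed it."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def babble(chain, start, length=10, seed=0):
    """Walk the chain, picking each next word at random."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Tiny illustrative corpus; a bigger one just gives longer reshufflings.
chain = build_chain("so it goes and so it was and so it goes again")
print(babble(chain, "so"))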

A Wizard of Goatse fucked around with this message at 20:24 on Nov 29, 2016

A Wizard of Goatse
Dec 14, 2014

A million monkeys banging on a million typewriters for a million years might eventually produce the complete works of Shakespeare, but it's gonna take a lot more man-hours to find the Shakespeare in all the gibberish than it did for Shakespeare to just write it.

There's no reason machines have to take the million-monkeys approach, but that's where they're at right now.
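The monkeys' arithmetic is easy to check. A back-of-the-envelope sketch (target strings and alphabet chosen purely for illustration): random typing stumbles onto a two-letter "work" quickly, but the expected search cost grows as alphabet-size to the power of the length, so anything Shakespeare-sized is hopeless without an editor doing the finding.

```python
import random
import string

def monkeys_until(target, seed=0):
    """Type random lowercase strings of len(target) until one matches;
    return how many attempts it took."""
    rng = random.Random(seed)
    attempts = 0
    while True:
        attempts += 1
        guess = "".join(rng.choice(string.ascii_lowercase)
                        for _ in range(len(target)))
        if guess == target:
            return attempts

def expected_attempts(length, alphabet=26):
    """Expected number of uniformly random strings before a match."""
    return alphabet ** length

print(monkeys_until("to"))    # a two-letter 'play' turns up fast
print(expected_attempts(18))  # an 18-letter phrase: astronomically many tries
```

Even an 18-character phrase needs on the order of 10^25 random attempts in expectation, which is the man-hours point in miniature: the generation is cheap, the finding is not.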

A Wizard of Goatse
Dec 14, 2014

Condiv posted:

no, but my argument has never been that computers can't generate good or entertaining things. they just can't generate anything with meaning without a human behind them.

Would Slaughterhouse-Five suddenly stop being meaningful if you found out it had been generated by a computer?

You will never meet Kurt Vonnegut. You do not know him, reading his work did not telepathically soulbond you to him; Kurt Vonnegut the conscious human may as well not exist to you. He's better at writing than any currently extant machine, but it was not his touch that made Slaughterhouse-Five a good book: it was good words in a good order that created meaning in your mind.

Dog Jones posted:

You're right that it is not an efficient method of generating quality art objects, but also I think optimality is irrelevant here.

How so? If you're not concerned with reducing the conscious human labor involved, why even bother with computers? You can poke around on the ground until you find a pretty rock.

It isn't the million monkeys that are the artists in that scenario; it's the editors.

A Wizard of Goatse fucked around with this message at 20:33 on Nov 29, 2016

A Wizard of Goatse
Dec 14, 2014

Dog Jones posted:

The "random art generating computer" was just a thought experiment to illustrate that computers are capable of producing unique, innovative art. I am not saying this is a worthwhile endeavor in practice.

Yeah, but what makes you say it's the computer producing the art, rather than whoever fishes the unique, innovative-looking stuff out of the millions upon millions of copies of basically the same shitty thing? Without that guy you're never going to get art anyone wants to look at; the random generator is not capable of producing good art on its own, it is simply the medium that guy works in. You could hypothetically have a standalone art machine that produced interesting work, and there's no reason it would need a genuine sense of self to do so, but it'd have to be a very different sort of thing than modern computer programs are.

A Wizard of Goatse
Dec 14, 2014

Condiv posted:

it is an abstract concept yes. it's also a defined term so i'm not sure why you're having so much trouble with it

Because you seem to think it is an abstract concept that is also an innate physical property of objects bestowed by the touch of a sapient being, rather than an interpretation of the senses generated inside your own mind.

Dog Jones posted:

Ah I see what you're saying. That's interesting. Art as a search problem. So if we imagine that art works are not generated or created, but rather discovered, the task of the searcher in our thought experiment is identical to the task of the artist. I'll have to think about that!

Pretty much. There are whole genres of found-object art that take otherwise non-artistic artifacts and recontextualize them into an artwork. Until a machine can consistently produce worthwhile artworks on its own without a human riding herd on it, I don't know why we'd consider Deep Dream to be the 'artist' any more than we would the tree that grew the piece of driftwood that Joseph Cornell stuck in one of his boxes.

A Wizard of Goatse
Dec 14, 2014

Getting huffy about how self-evident your argument is and how everyone who disagrees with it is just dumb doesn't make it any less of a mess, guy.

A Wizard of Goatse
Dec 14, 2014

because you said this

Condiv posted:

you do understand the difference between something having meaning and finding meaning in something right? the first requires cognition on the part of the creator, the second requires cognition on the part of the observer. a computer is incapable of cognition and therefore incapable of creating meaningful art on its own (it can be used to create meaningful art though). likewise, a computer is not able to find meaning in the dog spaghetti picture you like.

objects innately 'have meaning' or not independent of the meaning any observer might find in them, based on the state of mind of their creator. Whatever alternate intent you imagined for yourself when writing that post, the meaning of those words in that order that any English-speaking human is going to derive is "meaning is an innate physical property of objects imbued by their creator", and because qualia at the time of creation is not actually what defines meaning, you do have to choose the proper words to express what you want if you wish to be understood.

A Wizard of Goatse fucked around with this message at 22:21 on Nov 29, 2016

A Wizard of Goatse
Dec 14, 2014

Oh is that how that works, well in that case in ten years I'll have a flying car that folds up into a briefcase like George Jetson.

A Wizard of Goatse
Dec 14, 2014

I literally named a cartoon running through my childhood that equated the future with flying cars (there were many), which sounds like it's the same place he got all his ideas about AI, but, y'know, Owlofcreamcheese.

A Wizard of Goatse
Dec 14, 2014

Cingulate posted:

I'm a bit confused about what the significance of flying car stories from the 80s is right now.

There is exactly the same basis for the guarantee we will have flying cars any day now as for the guarantee we'll have sapient computers any day now; they come from the same place, and the exact same brainless handwaving about "well, things were different in 1800 than they are now, therefore who knows what the future will bring??? probably robot girlfriends" works for both equally. They're both asinine; OOCC just wants a robot girlfriend enough more than he wants a sweet hovercar to ignore how asinine the first one is.

A Wizard of Goatse fucked around with this message at 21:50 on Dec 24, 2016

A Wizard of Goatse
Dec 14, 2014

it's almost like the rightness of what "people" or "scientists" or "they" predicted X years ago about the modern day says nothing at all about what's going to happen in the future, and is meaningless noise for idiots.

A Wizard of Goatse
Dec 14, 2014

The predictions of specific people with a background in the field, who can show their work and make a plausible case for why they're predicting what they are, are more relevant and promising than goobers burbling about how some guy was probably dismissive about electricity in 1707, therefore they're visionaries ahead of their time instead of fantasists in I Fucking Love Science T-shirts; and those are barely more relevant than the predictions of the abstract, mostly-mythical strawman 'they' they handwave at.

There is basically no overlap between the expectations of the keyboard metaphysicists navel-gazing about what is intelligence, really and speculating that robot people will be the majority vote of 2028, and the expectations of people who actually work with computer technology. The latter guys are the ones expected to actually build the robot people, so I'll tend to weigh what they have to say a bit higher, especially if they can source their case in their work instead of Star Wars.

A Wizard of Goatse fucked around with this message at 22:54 on Dec 24, 2016

A Wizard of Goatse
Dec 14, 2014

Avshalom getting probated is all the proof we need that the machines already mod among us, but will never be 'human level'

A Wizard of Goatse
Dec 14, 2014

Thalantos posted:

When it comes to AI, won't we reach a point where it all becomes philosophical anyway?

People still argue whether humans have free will or not, so once we reach a point where AIs start passing the Turing test, won't whether they're "really" intelligent become rather a moot point?

My smart phone would be considered a mind blowing AI just a few decades ago.

Should we ever reach the point where AIs start meaningfully passing the Turing test, yes, it would be more or less a metaphysical debate, but we're not anywhere close to that point and have no idea how to get there; modern science merely suggests it's hypothetically physically possible to fully simulate a brain in machine logic, the same way it's hypothetically physically possible to build a solar-system-sized containment chamber to trap all of the Sun's energy. In reality, your smartphone might blow an abacus out of the water, but were you to say it is "really" intelligent in the terms we use for living organisms you would be considered an idiot, or worse, a singularitarian.

A Wizard of Goatse fucked around with this message at 18:39 on Jan 10, 2017

A Wizard of Goatse
Dec 14, 2014

tbc I am referring to passing the Turing test in the colloquial sense: machine-generated speech becomes so similar to human speech that the average person cannot tell the difference in whatever context it might casually be found in, like being able to hold an actual meaningful conversation with Siri the way you would with a person. I do not think it becomes unprovable whether machines are conscious as soon as one has ever successfully performed a specific parlor trick with a judge and rules and stuff.

A Wizard of Goatse fucked around with this message at 22:25 on Jan 10, 2017

A Wizard of Goatse
Dec 14, 2014

for it to be a True AI it must perfectly simulate a 44-year-old man with an IQ of 87 and crippling OCD. He communicates entirely via sonnet delivered in Morse code, which the interlocutor has a vaguely recalled understanding of but has difficulty deciphering without the aid of a codebook (provided)

A Wizard of Goatse
Dec 14, 2014

Cingulate posted:

Just define "truly intelligent", I'll gladly take care of the rest.

The ability to problem-solve and independently learn and perform complex tasks in an uncontrolled environment, without supervision.

If you're having trouble defining these words, the benchmark here is a series of naked savannah predators that overran every dry patch of land on the planet, built giant climate-controlled towers of glass and steel using tools it developed from scratch out of whatever happened to be lying around, and currently, in the most directly relevant cases, gains its sustenance from sitting around pushing electrons with nary a gazelle in sight, without ever once receiving outside instruction or assistance. At great effort and expense Google can occasionally detect when you've probably misspelled a search term typed into its textbox; that's pretty impressive.

A Wizard of Goatse fucked around with this message at 23:30 on Jan 10, 2017

A Wizard of Goatse
Dec 14, 2014

Cingulate posted:

Humans don't learn the knowledge required to uphold civilization without supervision, so your definition is too strong.
That said, it's also too weak because the definition itself is fulfilled by e.g. ravens.

Who is doing the supervision, is this a "God actually told Newton what to do" kind of thing? Because if it's just more humans all the way down, then I think you're missing the entire point in your rush to "um, actually" pedantry.

A computer that is as smart and capable as a raven would be absolutely breathtaking science-fiction bullshit. Hell, you're never going to see a computer as smart as a budgie in your lifetime.

A Wizard of Goatse fucked around with this message at 00:38 on Jan 11, 2017

A Wizard of Goatse
Dec 14, 2014

what is it you suppose the totality of the human species is made of?

I can't tell whether you're arguing that a few tens of thousands of desktop computers dropped into Kenya could collectively form the rise of the machines in a few millennia or you're just very, very stupid.

A Wizard of Goatse
Dec 14, 2014

are you planning on building an electric retard? what exactly is the point of this exercise to you

'Talk like a human' is a serviceable standard of consciousness because language (which yes, includes writing, guy) is the readiest form by which we perceive the thoughts of others. Where it does not exist in humans, we infer thought based on our knowledge of how humans generally operate, which is unlike the manner in which we know rocks or trees to. People, at least people outside the ones ITT who see no manner in which they are more generally competent than a Powerbook, do not assume rocks are actually the same as nonverbally autistic people. A computer that spits out random letters like a baby banging on a keyboard, or outputs nothing at all like a coma patient, is likewise not going to persuade anybody but weirdo masturbatory computer cultists that there is actually the awareness of a human, young, disabled, or otherwise, lurking inside there.

This isn't the only possible measure of intelligence, but it's one that'd settle the issue in most humans' eyes. I don't know if you think you're going to lawyer someone into accepting that your paperweight thinks and feels as much as you do or what but that's not really how it works.

A Wizard of Goatse fucked around with this message at 02:22 on Jan 11, 2017


A Wizard of Goatse
Dec 14, 2014

thechosenone posted:

Are you talking to me? If you are, a person who was brain dead and a 2-year-old probably couldn't interact with a keyboard. But if you could make a robot that could move around, act, and learn like a 2-year-old, that would be pretty damn impressive.

I don't think I personally have much in particular in mind, just that it would have to be something pretty hard to bullshit.

That was directed to Subjunctive, the one who immediately responded to the concept of the Turing test with a digression about the difficulties of voice synthesis and is now apparently struggling with the idea that people don't generally figure that things that don't look like people or communicate like people are people. I agree with you on the learning point. I figure on the day you can drop the most advanced robot in the world in some random environment and it does as well at navigating the new terrain and carving out a little robot life for itself as, say, a squirrel in Manhattan would, we can start talking about "machine intelligence" as a real thing, and not as something that is to actual intelligence what strong liquor is to a bodybuilder's biceps.
