supabump
Feb 8, 2014

Klyith posted:

Can you make an intelligent brain out of molecules? Molecules aren't very complex, they certainly aren't aware of anything, they have no subjective experience or inner life.

Oh, you can? Cool. Transistors are more complicated and aware than molecules, case closed.

I don't necessarily disagree with the broader point you are making, but if your goal is to create something with as high a level of precision and complexity as possible, you want the fundamental building blocks to be smaller, not bigger. this has very significant implications for the physics of how a brain works. neurons are extremely good at their jobs.

Artificial general intelligence will probably exist in the distant future, but the simplest way to conceive of a more convincing "human" AI is to copy more things specific to human biology.

Mooey Cow posted:

Not even a 100% accurate fluid simulation of water will make your computer wet inside, nor will scribblings on a paper fly in space when you solve some differential equations related to celestial mechanics. What reason is there then to suppose a priori, or even apparently to find it blindingly obvious, that consciousness and subjective experience will emerge out of an algorithm and somehow reach out of it? Seems like quite an extraordinary claim, that this or that particular model is the same thing as reality.

We currently create synthetic neurons out of transistors. What if we used the same chemical building blocks to create synthetic human nerve cells instead of using transistors, and created a perfect replica of a human brain? Did we create a human? If so, at what specific point did it become human instead of a machine?


OMFG FURRY
Jul 10, 2006

[snarky comment]
like the stoned ape theory, we just need to figure out a way to consistently get a prototype AI high as balls and then put it through incredibly stressful situations in the hopes that the machine elves imbued it with sapience and its first words are

print('gently caress you dad')

supabump
Feb 8, 2014

I want to also quickly note that "convincingly acts like a human" is a really lovely goal:
  • humans can already make new humans
  • people making synthetic humans will likely be more interested in profit than ethics
  • using AI to create AI-based tools instead of AI-based humans is so much more useful for the enrichment of humanity

The Butcher
Apr 20, 2005

Well, at least we tried.
Nap Ghost
My Batman villain name is... The Neural Net



I'm on a pretty tight budget so I made my costume and lair out of old fishing nets I found down at the docks.

My henchmen are always bitching that my lair smells terrible, which privately, does kind of hurt my feelings.

The Butcher
Apr 20, 2005

Well, at least we tried.
Nap Ghost
My archnemesis is The Brain.

He is such a poser and thinks way too much of himself. I'm pretty sure he doesn't actually smear brains on himself and it's just ground beef.

His henchmen also think he stinks, and try not to get too close.

Tiny Bug Child
Sep 11, 2004

Avoid Symmetry, Allow Complexity, Introduce Terror

Mooey Cow posted:

Not even a 100% accurate fluid simulation of water will make your computer wet inside, nor will scribblings on a paper fly in space when you solve some differential equations related to celestial mechanics. What reason is there then to suppose a priori, or even apparently to find it blindingly obvious, that consciousness and subjective experience will emerge out of an algorithm and somehow reach out of it?

b/c we don't really even have a solid agreed-upon definition of what "consciousness" or "subjective experience" are, let alone a testable way to determine whether they're happening or not. and that doesn't just apply to machines! it seems weird to me to be sure that only brain-meat can generate consciousness, or to assume that an artificial mind could never have consciousness, when we don't even know what that is

technical feasibility is a whole different question but it also seems strange to assume it's impossible even in the short term. with current hardware maybe not, but for all we know the next shockley is out there figuring out how to rip his colleagues off for the next transistor right now. i don't think there's any reasonable prediction that can be made about future technology other than that it will be used to do things you won't like

OMFG FURRY
Jul 10, 2006

[snarky comment]
https://www.youtube.com/watch?v=T6JFTmQCFHg

Worf
Sep 12, 2017

If only Seth would love me like I love him!

supabump posted:


  • people making synthetic humans will likely be more interested in profit than ethics


this is quite obviously true of most real humans as well, since we've actually killed our planet in like 2 generations.

i think people are kinda stupid, and overly greedy in just about anything they do

idk why this should be seen as different

Splicer
Oct 16, 2006

from hell's heart I cast at thee
🧙🐀🧹🌙🪄🐸

Mooey Cow posted:

Not even a 100% accurate fluid simulation of water will make your computer wet inside, nor will scribblings on a paper fly in space when you solve some differential equations related to celestial mechanics. What reason is there then to suppose a priori, or even apparently to find it blindingly obvious, that consciousness and subjective experience will emerge out of an algorithm and somehow reach out of it? Seems like quite an extraordinary claim, that this or that particular model is the same thing as reality.
That's where the question of "what is consciousness" gets interesting. If you model a brain exactly (and provide it with some way of taking input from and sending it out to the outside world), will you end up with a conscious being or some form of P-Zombie? Is there a meaningful difference?

Splicer
Oct 16, 2006

from hell's heart I cast at thee
🧙🐀🧹🌙🪄🐸

The Butcher posted:

My Batman villain name is... The Neural Net



I'm on a pretty tight budget so I made my costume and lair out of old fishing nets I found down at the docks.

My henchmen are always bitching that my lair smells terrible, which privately, does kind of hurt my feelings.
Actual feelings or simulated feelings

The Butcher
Apr 20, 2005

Well, at least we tried.
Nap Ghost

Splicer posted:

Actual feelings or simulated feelings

That's my power, you can't tell! I use it to start nerd arguments.

My henchmen are like "uh boss this is kind of boring, can we go rob a bank or something fun?"

"Never! I say." but it just comes out really muffled and not understandable under all these fishing nets.

Panic! At The Tesco
Aug 19, 2005

FART


The Butcher posted:

My Batman villain name is... The Neural Net



I'm on a pretty tight budget so I made my costume and lair out of old fishing nets I found down at the docks.

My henchmen are always bitching that my lair smells terrible, which privately, does kind of hurt my feelings.

I'm one of this guy's thugs

ITS DA BAT!

DicktheCat
Feb 15, 2011

I hope it isn't real.

For its own wellbeing.
I don't want them to create a being only to exploit.


What a species we are.

Improbable Lobster
Jan 6, 2012

What is the Matrix 🌐? We just don't know 😎.


Buglord

Splicer posted:

When did the mean ol' programmers hurt you

like 1999

GABA ghoul
Oct 29, 2011

Tiny Bug Child posted:

b/c we don't really even have a solid agreed-upon definition of what "consciousness" or "subjective experience" are, let alone a testable way to determine whether they're happening or not. and that doesn't just apply to machines! it seems weird to me to be sure that only brain-meat can generate consciousness, or to assume that an artificial mind could never have consciousness, when we don't even know what that is

I've been unironically coming around on the idea of panpsychism over the years. It's such a lazy but elegant solution to the ~hard problem of consciousness~. Like, lol, what do you mean by "can software have consciousness?". Of course it does, even the chair you are sitting in and your rear end sweat has consciousness :smugdog:

GABA ghoul fucked around with this message at 23:02 on Jun 21, 2022

Improbable Lobster
Jan 6, 2012

What is the Matrix 🌐? We just don't know 😎.


Buglord

GABA ghoul posted:

A man who knows literally nothing about AI research, computers or neuroscience arrogantly explains to a crowd what AI can't do.


Cool self portrait

Improbable Lobster
Jan 6, 2012

What is the Matrix 🌐? We just don't know 😎.


Buglord

Powerful Katrinka posted:

"This chat bot isn't sentient" =/= "AI is 100% impossible and always will be." Y'all gotta stop taking people's skepticism of this specific chat bot and twisting it into an extreme position no one is taking TIA

lmao

Splicer posted:

There are people in this thread saying exactly what Pitdragon is arguing against, is the thing.

The default reasonable stance is that there is no reason to believe that AI will happen anytime soon, or within our lifetimes

Mooey Cow posted:

That is a claim, not a statement of fact. It might be just as accurate as saying "a cell is just like a city", or a factory, or that an organism is like a society and vice versa. The idea that organisms are machines started around when clockworks were the coolest technology of the day. The idea that "well maybe brains are computers" started when computers became the latest most advanced thing people could conceive of. Who knows what metaphors will be used in the future to describe systems that are fundamentally self-organizing all the way down.

So let's say you do manage to accurately model a city as a black box that has trucks of resources going into and out of it; maybe, to the accuracy you can measure, it corresponds 100% to the movements of trucks to and from some real city. It seems there would be no reason to conclude from this that you have also managed to model the inner lives of the people living in that real city. That would be pure mysticism. The question then is, at what point does the map become the territory?

Yes, this exactly. How many people in this thread are lesswrong donators?

Improbable Lobster fucked around with this message at 23:11 on Jun 21, 2022

PITY BONER
Oct 18, 2021

*Does not include Tesla AI that keeps killing people.

Mr. Lobe
Feb 23, 2007

... Dry bones...


"Car to not explode" should also go in the little box

sebmojo
Oct 23, 2010


Legit Cyberpunk









Inner life, the single characteristic of intelligence that computers can never possess, because if they possessed it it wouldn't be inner life :smith:

Colonel Cancer
Sep 26, 2015

Tune into the fireplace channel, you absolute buffoon
Outer life's more important, it's like having a giant dong

The Butcher
Apr 20, 2005

Well, at least we tried.
Nap Ghost

Colonel Cancer posted:

Outer life's more important, it's like having a giant dong

You can only express it in very certain settings though or people get upset.

Though people get upset if you express some of your inner life with them as well, so you just can't win.

I'm sorrrrry I have "unacceptable" thoughts about WW2. Hmpf.

StrangersInTheNight
Dec 31, 2007
ABSOLUTE FUCKING GUDGEON
we don't even know how loving thought works, which is in part what this AI was built to study

StrangersInTheNight fucked around with this message at 00:21 on Jun 22, 2022

syntaxfunction
Oct 27, 2010
Everyone being like "we can't build a real mind" and it's like, heh, maybe *you* can't chumps :smug:

sebmojo
Oct 23, 2010


Legit Cyberpunk









The Butcher posted:

You can only express it in very certain settings though or people get upset.

Though people get upset if you express some of your inner life with them as well, so you just can't win.

I'm sorrrrry I have "unacceptable" thoughts about WW2. Hmpf.

my 'i will not share my inner thoughts about poisoning the town water supply' t-shirt is attracting questions that are already answered by my t-shirt

Zeluth
May 12, 2001

by Fluffdaddy
Would you like to visit my sentient program? Here is my pamphlet.

syntaxfunction
Oct 27, 2010
I don't think most people here are sentient. Not like you're all bots, you're all just dumb lol gottem.

Nice Guy Patron
Jun 29, 2015

tango alpha delta posted:

lol, very, very few programmers can keep their egos in check because they think they are literally the smartest guys in the room.

This is a bunch of office workers smugly talking about what they think a brain is. I wish I was lucky enough to live in the lap of luxury. Where I can use my PTO to pretend my IT job is actually important.

Nice Guy Patron
Jun 29, 2015
Everyone that is comparing the brain to a computer should use their PTO to work a service job.

Nice Guy Patron
Jun 29, 2015
I want to hear the report after that.

The Bloop
Jul 5, 2004

by Fluffdaddy
I think that if we can ever get a real AI it won't be programmed by a human

It will be made by a weaker AI that was made by a weaker AI (n times) and the code will probably seem like gibberish

This requires us to encourage that to happen but not because we figured out the code for thought and then implemented it directly. It will have to iteratively evolve and that will require an arbitrary amount of computing power we probably don't yet have, or a zillion years




You know, or not. Crazy poo poo happens.
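The "weaker AI builds a stronger AI, iterated n times" idea above is, loosely, just a mutate-and-select loop. A toy sketch of that kind of loop (the classic mutate-toward-a-target demo; the target string, population size, and mutation rate are all made-up illustration, and this is obviously not a road map to AGI):

```python
import random

# Each generation copies the current best candidate with random
# mutations and keeps the fittest copy (elitist selection), so a
# gibberish starting string is gradually evolved toward a target.

TARGET = "hello world"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(s):
    # Number of characters already matching the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    # Each character has a small chance of being replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

random.seed(0)
best = "".join(random.choice(ALPHABET) for _ in TARGET)  # gibberish start
for generation in range(1000):
    children = [mutate(best) for _ in range(50)]
    best = max(children + [best], key=fitness)  # elitism: never regress
    if best == TARGET:
        break

print(best)
```

Because the best candidate is always kept, fitness never drops, and the gibberish start reliably climbs to the target within a few hundred generations.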

TIP
Mar 21, 2006

Your move, creep.



The Bloop posted:

I think that if we can ever get a real AI it won't be programmed by a human

It will be made by a weaker AI that was made by a weaker AI (n times) and the code will probably seem like gibberish

This requires us to encourage that to happen but not because we figured out the code for thought and then implemented it directly. It will have to iteratively evolve and that will require an arbitrary amount of computing power we probably don't yet have, or a zillion years

this is not far off from how we already create AI, except instead of an AI creating another AI it's two AIs being pitted against each other to improve both

quote:

https://en.m.wikipedia.org/wiki/Generative_adversarial_network

The generative network generates candidates while the discriminative network evaluates them.[1] The contest operates in terms of data distributions. Typically, the generative network learns to map from a latent space to a data distribution of interest, while the discriminative network distinguishes candidates produced by the generator from the true data distribution. The generative network's training objective is to increase the error rate of the discriminative network (i.e., "fool" the discriminator network by producing novel candidates that the discriminator thinks are not synthesized (are part of the true data distribution)).
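The quoted setup can be sketched as a toy 1-D version (illustrative only: a real GAN uses neural networks for both players, not a single shift parameter and a logistic threshold, and all the numbers here are arbitrary):

```python
import numpy as np

# Toy 1-D sketch of the generator/discriminator contest quoted above.
# The "generator" is a single shift g mapping latent noise z to samples
# z + g; the "discriminator" is a logistic model sigmoid(w*x + b) that
# scores the probability a sample came from the true data distribution.

rng = np.random.default_rng(0)
REAL_MEAN = 5.0          # real data: N(5, 1)
d_lr, g_lr = 0.1, 0.02   # discriminator adapts faster than the generator

def discriminator(x, w, b):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

g = 0.0          # generator starts far from the real distribution
w, b = 1.0, 0.0

for step in range(3000):
    z = rng.normal(0.0, 1.0, size=64)           # latent noise
    real = rng.normal(REAL_MEAN, 1.0, size=64)  # true data distribution
    fake = z + g                                # generator's candidates

    # Discriminator ascent on log D(real) + log(1 - D(fake)):
    d_real = discriminator(real, w, b)
    d_fake = discriminator(fake, w, b)
    w += d_lr * np.mean((1 - d_real) * real - d_fake * fake)
    b += d_lr * np.mean((1 - d_real) - d_fake)

    # Generator ascent on log D(fake), i.e. raise the discriminator's
    # error rate by producing fakes it scores as real:
    d_fake = discriminator(fake, w, b)
    g += g_lr * np.mean((1 - d_fake) * w)

print(round(g, 2))  # the generator's samples drift toward REAL_MEAN
```

As the fakes close in on the real distribution, the discriminator can no longer separate the two and its scores collapse toward 0.5, which is the "fooled" equilibrium the quote describes.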

sure okay
Apr 7, 2006





The Bloop posted:

I think that if we can ever get a real AI it won't be programmed by a human

It will be made by a weaker AI that was made by a weaker AI (n times) and the code will probably seem like gibberish

This requires us to encourage that to happen but not because we figured out the code for thought and then implemented it directly. It will have to iteratively evolve and that will require an arbitrary amount of computing power we probably don't yet have, or a zillion years




You know, or not. Crazy poo poo happens.

https://en.wikipedia.org/wiki/Technological_singularity

The Bloop
Jul 5, 2004

by Fluffdaddy

That's not really the same concept, but it's possible that one could lead to the other

Of course we can always literally pull the plug...

zedprime
Jun 9, 2007

yospos

Mooey Cow posted:

Not even a 100% accurate fluid simulation of water will make your computer wet inside, nor will scribblings on a paper fly in space when you solve some differential equations related to celestial mechanics. What reason is there then to suppose a priori, or even apparently to find it blindingly obvious, that consciousness and subjective experience will emerge out of an algorithm and somehow reach out of it? Seems like quite an extraordinary claim, that this or that particular model is the same thing as reality.
If I'm to believe the people in this thread are real beings with their own experiences, I'm ready to jump to the idea that simulacra define realities or experiences of their own that are just as important. Otherwise I would be forced to deny the reality of a system described by equations, a fantasy story in books, and your recollections of life, all on the same infantile reasoning: getting completely tripped up by the Cartesian first principle in the way of fascists looking for ways to improve their life by abusing the other.

We're ultimately in an argument about the extent of empathy and our ability to empathize with anything not anthropic.

The Bloop
Jul 5, 2004

by Fluffdaddy
Thought is a fundamentally different thing to wetness or rocket ships lol

That difference doesn't mean an algorithm can ever produce thought but it does mean that those counterexamples are extremely silly

TIP
Mar 21, 2006

Your move, creep.



computers can't do math because there's no paper or pencils in them

checkmate

sebmojo
Oct 23, 2010


Legit Cyberpunk









TIP posted:

computers can't do math because there's no paper or pencils in them

checkmate

tip bringing in the piledriver straight from the ropes

The Bloop
Jul 5, 2004

by Fluffdaddy
Listen the internet is like a million gigaquads of porn and if it hasn't gotten horny yet it's not going to


hot cocoa on the couch
Dec 8, 2009

100% DOG LOVER
ALL DOGS LOVED, ALL THE TIME

sebmojo posted:

tip bringing in the piledriver straight from the ropes

lol but absolutely this. open and shut, and that's a checkmate
