porfiria
Dec 10, 2008

by Modern Video Games

Rutibex?


porfiria
Dec 10, 2008

by Modern Video Games

Gentleman Baller posted:

One thing I've been doing with Bing's AI lately is thinking of new puns and seeing if it can work out the theme and generate more puns that fit it, and I think it does this extremely well.

When I asked it, "Chairman Moe, Vladimir Lenny, Carl Marx. Think of another pun that fits the theme of these puns." It gave me Homer Chi Minh and Rosa Luxembart (and a bunch of bad ones ofc that still fit the theme.)

I did the same with, "Full Metal Arceus, My Spearow Academia, Dragonite Ball Z" and it gave me Cowboy Beedrill and Tokyo Gimmighoul.

A common refrain I see online and even from ChatGPT itself is that it is just a text predictor, and is incapable of understanding or creating truly new things. But as far as I can tell, it can do something that is at least indistinguishable from understanding and unique creation, right?

Edit: I guess what I have been trying to wrap my head around is, if this isn't understanding and unique creation then what is the difference?

I mean I agree, at least to a degree. Does it "know" what a "horse" is? All it has are these weighted associations, but I would bet you could ask it almost any question about a horse, both things that are explicitly in the training data (how many legs does a horse have?) and things that aren't (could a horse drive a car?), and get coherent, accurate answers. So I'd say it has an internal model of what a horse is, semantically. I don't think that means it has subjective experience or anything, just that yeah, it seems pretty reasonable to say it knows what a horse is.

porfiria
Dec 10, 2008

by Modern Video Games

Gumball Gumption posted:

In that example it's more fair to say it knows that when text describes a horse, it describes it with four legs. So if you ask it how many legs a horse has, its response will be 4, because that's the common response and it wants to give you a response that looks correct.

Yeah but it also knows you can't build a real horse out of cabbage (but you can build a statue of one), that horses can't drive because they aren't smart enough and don't have hands, and so on. All this stuff may just be weighted values in a huge matrix or whatever, but it can build associations that are vastly more extensive and subtler than words just tending to appear near other words in the training data.

You edited your response a bit. So just to expand:

I'd say it does "know" what a horse is, but that's for some definition of "know." It doesn't have any kind of audio or visual model for a horse (although it probably will soon, so it's kind of a moot point). And of course it doesn't have any personal, subjective associations with a horse in the way that a human does.

But as a matter of language, I'd say yeah, it can deploy "horse" correctly, and "knows" just about all the facts about horses there are, and how those facts inter-relate to other facts about the world in a comprehensive way that, to my mind, meets a lot of the criteria for "knowing" something.

porfiria fucked around with this message at 01:01 on Mar 28, 2023

porfiria
Dec 10, 2008

by Modern Video Games

Count Roland posted:

I'd be shocked if AI models haven't been working with big money on the stock market for years.

60 to 70 percent of the trading on any given day is algorithmic. Also look up high frequency trading and boggle at how abstract our economy has gotten.

porfiria
Dec 10, 2008

by Modern Video Games
I think the standard view among economists is that if technology reduces the amount of labor input for each unit of production, the cost and therefore the unit price falls, which stimulates demand, which in turn causes more people to get employed.

This makes some sense but I have doubts it holds in all cases, particularly when the amount of human labor becomes extremely small.
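To make that doubt concrete, here's a toy sketch (my own illustration, not anything from an economics text): assume the unit price tracks labor cost and demand follows a constant-elasticity curve. Then total employment is quantity demanded times labor per unit, and whether automation raises or lowers employment hinges entirely on whether demand elasticity is above or below 1. The function name, the demand constant `A`, and the wage are all made up for the example.

```python
def employment(labor_per_unit, elasticity, A=100.0, wage=1.0):
    """Toy model: employment after a labor-saving technology change.

    Price is assumed proportional to labor cost, and demand follows a
    constant-elasticity curve Q = A * price**(-elasticity).
    """
    price = wage * labor_per_unit           # cheaper labor input -> cheaper product
    quantity = A * price ** (-elasticity)   # lower price -> higher demand
    return quantity * labor_per_unit        # total labor hired across all units

# Elastic demand (elasticity > 1): halving labor per unit RAISES employment.
print(employment(1.0, 2.0), employment(0.5, 2.0))   # demand more than doubles

# Inelastic demand (elasticity < 1): halving labor per unit LOWERS employment.
print(employment(1.0, 0.5), employment(0.5, 0.5))   # demand barely responds
```

Which is basically the worry in the post above: when the labor share per unit gets extremely small, even a big jump in demand can't make up for it unless demand is very elastic.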

porfiria
Dec 10, 2008

by Modern Video Games

KillHour posted:

The fact that you could look at that, agree with my take, and go "The coal mining companies are the lesser evil" is astounding.

This is why Trump won, and will win again...

porfiria
Dec 10, 2008

by Modern Video Games
Lol at the AI dorks turning concept artists into the next displaced cohort with nowhere to shelter but right wing corporatism.


porfiria
Dec 10, 2008

by Modern Video Games

KwegiboHB posted:

This is bullshit, there is absolutely no reason to go right wing anything, ever.

EXT. RUINED RED HOOK - DAY

Barron Trump
And when I am elected president, I vow to ban AI and bring HR and design jobs back to the people!

The crowd, made up of middle-aged zoomers, their bodies ravaged by poor nutrition, drugs,
and vaping, cheers wildly.

INT. MSNBCNN STUDIO - DAY

"Expert"
(Dorkishly)
Um, I think you'll find that the automation of white collar jobs actually increased utility
in aggregate and empowered consumers.

(USER WAS PUT ON PROBATION FOR THIS POST)
