MrQwerty
Apr 15, 2003

LOVE IS BEAUTIFUL
(づ ̄ ³ ̄)づ♥(‘∀’●)


Someone get Nathen Mazri on the phone


Atopian
Sep 23, 2014

I need a security perimeter with Venetian blinds.
Probably best not to assume that someone in marketing means well.

They might! Many do! It just wouldn't be my first assumption.

Saganlives
Jul 6, 2005



Atopian posted:

Probably best not to assume that someone in marketing means well.

They might! Many do! It just wouldn't be my first assumption.

I shudder at the thought of an unconscious marketing person.

teen witch
Oct 9, 2012
Instagram you knew we needed ads you can’t mute as a cool 2000s throwback!!! Thank you so much I just love this nostalgia trip!!!!!

anonumos
Jul 14, 2005

Fuck it.

deep dish peat moss posted:

I used to work for a company that did a lot of e-commerce and digital marketing research, and all of their research kept coming back with the conclusion that people absolutely love personalized shopping/advertising experiences and personably-written marketing mailing lists. On one hand I want to say it was just bullshit self-congratulatory stuff, but on the other hand I think the average consumer is dumb as hell and loves being told what to spend their money on.

So, let's add this up, shall we? The company spent oodles of time, money, and effort providing a more personalized experience, but when they ask their customers what they want, the customers still practically beg for a more personalized experience.

Customer: I want A.
Company: Here you go.
Customer: This is B. I want A.
Company: Here you go.
Customer: I Want A.
Company: Here you go.
Customer: ...
Company: Consultant, what does this customer want?

His Divine Shadow
Aug 7, 2000

I'm not a fascist. I'm a priest. Fascists dress up in black and tell people what to do.

teen witch posted:

Instagram you knew we needed ads you can’t mute as a cool 2000s throwback!!! Thank you so much I just love this nostalgia trip!!!!!

I have a big physical button I can turn down on the 1980s amp that is the base of my PC's sound system. Nothing is unmutable to me :smug:

I feel like Nevil Clavain sometimes. He's a character from a bunch of sci-fi books; his deal is that he and others like him have heads full of tech, and a lot of people get their heads compromised by a bad-guy super hacker. Except him, because his implants are so woefully outdated it's like trying to hack a rotary phone.

wash bucket
Feb 21, 2006

His Divine Shadow posted:

I feel like Nevil Clavain sometimes. He's a character from a bunch of sci-fi books; his deal is that he and others like him have heads full of tech, and a lot of people get their heads compromised by a bad-guy super hacker. Except him, because his implants are so woefully outdated it's like trying to hack a rotary phone.

Ghost in the Shell had a character like that. They were a team of cyborgs but they kept one regular human on the team just in case they went up against a super hacker or something.

Thought everyone would like to know that. God bless.

wash bucket fucked around with this message at 11:59 on May 15, 2024

Vampire Panties
Apr 18, 2001
nposter
Nap Ghost

DrBouvenstein posted:

If you think Google's half-assed, lovely AI helpers in their search is bad now, don't worry... It'll be full-assed soon enough:

https://www.wsj.com/tech/ai/google-generative-ai-search-summaries-announcement-ffab4ce1

my big issue with Google AI search summaries is that they're almost always wrong, and you have to dig into them to find out why. I was searching for the dimensions on a specific model of truck bed, and the summaries showed me the dimensions of a different make/model entirely.

DicktheCat
Feb 15, 2011

Vampire Panties posted:

my big issue with Google AI search summaries is that they're almost always wrong, and you have to dig into them to find out why. I was searching for the dimensions on a specific model of truck bed, and the summaries showed me the dimensions of a different make/model entirely.

I'm certainly not the first, nor the most eloquent to say this, but this ai poo poo is going to start compounding problems really fast if people build anything of real import on it.

If it can't get something basic like "specific truck dimensions" correct, then it can't be trusted for any of the real science that proponents promise over and over it'll revolutionize. Your use case here is the simplest imaginable, it's looking for hard data with no real ambiguity to it. A lot of people have pointed out already that busy people will often default to what's convenient, and if we, say, start genuinely implementing this tech in the medical field, then poo poo's going to go off the rails fast.


The tech simply isn't there. I recommended a brand of sticker paper I've had good success with in a crafting subreddit (yuck, I know, but I needed answers for some troubleshooting on my cutter machine) and some little ai bot chimed in with a summary of other reviews that weren't accurate to the product at all. It was claiming a lot of things that were counter to all experiences I've had with it, and I can only figure that it was lumping in all sticker paper reviews together or reporting from people who were using the wrong kind of printer, etc.

I wonder if that's why they gently caress up so badly when collating the data they're stealing from everyone.

euphronius
Feb 18, 2009

It’s just destroying the casual internet. Data you actually need to do a job has not been infected yet. I hope. It probably has

Pennywise the Frown
May 10, 2010

Upset Trowel
Not to mention all of the AI that's going to get their lovely info from the incorrect AI. Absolutely nothing will be reliable.

poo poo's going to get real bad real quick.

I wonder if they still sell Encyclopedia Britannica.

Internet Old One
Dec 6, 2021

Coke Adds Life

Tunicate posted:

Similarly, Coca Cola advertisers created the myth that their ads created the modern Santa. Their job is literally lying to the public about how good things are, of course they'll make this poo poo up.

Astonishing. Just looked it up and had no problem finding modern Santa ads that predate Coca-Cola!

The crazy thing is I've only ever heard this factoid in the context of "Our entire culture, all folk wisdom, and basically everything you love is the product of some fucker's toothpick or enema tonic ad campaign in 1802" and not that it was a good thing.

I notice advertisers, from the writings of Bernays to the guy I knew who blasted spam out of 4 cable modems in his lovely apartment and turned it into an empire, have this absolutely warped degree of professional myopia where they honestly think there is nothing wrong with what they do and that consumers are more than happy to be "educated" with dozens of emails a day and ads crammed into every other tiny facet of their lives.

Also cable modem guy once said he had sex with his cousin on the couch in his server room but then denied they were related later saying she was merely a new employee


moist banana bread
Dec 17, 2023

banana Jake!

Pennywise the Frown posted:

Not to mention all of the AI that's going to get their lovely info from the incorrect AI. Absolutely nothing will be reliable.

poo poo's going to get real bad real quick.

I wonder if they still sell Encyclopedia Britannica.

Games are going to have locally run LLM models driving NPCs (by the way, this will increase the size of a game by 30-100 GB). Graphics cards are going to have more (seemingly) genuine empathy than actual people, and gamers who are already introverted are in for a weird time.

sinky
Feb 22, 2011



Slippery Tilde

moist banana bread posted:

Games are going to have locally run LLM models driving NPCs (by the way, this will increase the size of a game by 30-100 GB).

Why locally? Host that poo poo online so they can switch the servers off after 3 years and make people buy the sequel.

moist banana bread
Dec 17, 2023

banana Jake!
Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100 GB of RAM or whatever. I guess my realization is that it's feasible, though. It'll be cloud hosted, which, yeah, is convenient for the "games as a service" goons.

MrQwerty
Apr 15, 2003

LOVE IS BEAUTIFUL
(づ ̄ ³ ̄)づ♥(‘∀’●)

moist banana bread posted:

Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100 GB of RAM or whatever. I guess my realization is that it's feasible, though. It'll be cloud hosted, which, yeah, is convenient for the GAAS goons.

you sell smart computers to people at a significant markdown with 2 layers of abstracted, bloated data-gathering software before it boots windows or whatever os

TotalLossBrain
Oct 20, 2010

Hier graben!
Getting mad at the in-game AI helper for arguing with me about vagina bones

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost

moist banana bread posted:

Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100 GB of RAM or whatever. I guess my realization is that it's feasible, though. It'll be cloud hosted, which, yeah, is convenient for the "games as a service" goons.

Microsoft is dictating that future Windows versions will require a dedicated AI chip in hardware

Canine Blues Arooo
Jan 7, 2008

when you think about it...i'm the first girl you ever spent the night with

Grimey Drawer
Games aren't going to be running AI for NPCs any time soon, for a variety of reasons. These reasons include:

• Running a local model requires way, way too many resources
• Running a remote model is expensive, and someone has to pay for that.
• Both of these solutions will produce inconsistent results, and there is no way to say, '...And please don't make poo poo up!' to the model
• The local solution will produce substantial latency. The remote solution on big hardware will still produce some latency.

The model size would only be a few gigs, but the sheer resources required to run it would be out of control, especially alongside what is ostensibly a whole game.

So no, this isn't happening except for games that are built around it as the prime feature. Baldur's Gate 4 / Dragon Age Next / Cyberpunk 2078 will not feature this.
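
For a rough sense of the numbers behind the list above, here's a back-of-envelope sketch. Every figure in it (a hypothetical 7B-parameter model, 4-bit quantization, 15 tokens/sec) is an illustrative assumption, not a measurement:

```python
# Back-of-envelope version of the resource argument above.
# Every number here is an illustrative assumption, not a measurement.

def model_vram_gb(params_billion, bytes_per_param=0.5):
    """VRAM needed for the weights alone, assuming 4-bit quantization
    (0.5 bytes per parameter)."""
    return params_billion * bytes_per_param

def reply_latency_s(tokens, tokens_per_s):
    """Time to generate one NPC line at a given decode speed."""
    return tokens / tokens_per_s

# A hypothetical 7B-parameter dialogue model, 4-bit quantized:
print(model_vram_gb(7))        # 3.5 (GB of VRAM for the weights alone)
print(reply_latency_s(60, 15)) # 4.0 (seconds for a 60-token line at 15 tok/s)
```

Even under those generous assumptions, that's several GB of VRAM and multi-second replies competing with the game itself, which is the "out of control" part.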

anonumos
Jul 14, 2005

Fuck it.

sinky posted:

Why locally? Host that poo poo online so they can switch the servers off after 3 years and make people buy the sequel a new girlfriend.

deep dish peat moss
Jul 27, 2006

Mozi posted:

Microsoft is dictating that future Windows versions will require a dedicated AI chip in hardware

Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards?

anonumos
Jul 14, 2005

Fuck it.

deep dish peat moss posted:

Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards?

Maybe not, but perhaps the best gaming experience will come from using Nvidia brand shovels. Meaning maybe you can have a voice conversation with the uncanny valley in the next installment of your favorite RPG. Or not, ATI virgins.

Hippie Hedgehog
Feb 19, 2007

Ever cuddled a hedgehog?

deep dish peat moss posted:

Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards?

Seriousposting in the whine thread: Most PCs today do not have a neural processing unit (NPU). I think you can make a lot of use of a GPU when training models for AI (i.e. machine learning), but for very fast inference (i.e. when evaluating inputs using a model, to make predictions/classifications/whatever) you normally need a specific NPU. (I could be wrong on this part.)

Some phones and tablets do have them, notably Apple were the first, and I'm pretty sure their M-series notebook CPUs have NPUs too. Google put their Tensor chips in some Pixel phones, and I think Intel and AMD have both added an NPU function in their last-gen CPUs.

Right now, I believe most ML inference happens in the cloud, like when you use Midjourney or GPT. Those use huge models that don't make sense to put locally on your PC. In the future, though, these vendors predict you'll have tons of small AI models deployed locally for apps to use, like Apple's camera app magic that compensates for your shaky hands.
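
A hedged sketch of that "where does the model run" decision. The device names, size thresholds, and `pick_backend` helper are all invented for illustration; no real OS exposes this exact API:

```python
# Hedged sketch of the "where does the model run" decision described
# above. Device names, size thresholds, and pick_backend are all
# invented for illustration; no real OS exposes this exact API.

def pick_backend(model_mb, available):
    """Prefer an on-device NPU for small models, then the GPU, then cloud."""
    if model_mb <= 500 and "npu" in available:
        return "npu"    # small local model: camera magic, dictation, etc.
    if model_mb <= 8000 and "gpu" in available:
        return "gpu"    # bigger model that still fits in VRAM
    return "cloud"      # Midjourney/GPT-scale: too big to ship locally

print(pick_backend(200, ["npu", "gpu"]))    # npu
print(pick_backend(4000, ["npu", "gpu"]))   # gpu
print(pick_backend(70000, ["npu", "gpu"]))  # cloud
```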

Hippie Hedgehog fucked around with this message at 21:04 on May 15, 2024

doctorfrog
Mar 14, 2007

Great.

Problem: LLMs require astronomical amounts of quality data, all of which is completely exhausted

Solution: use synthetic data created by other LLMs to train the LLMs

Problem: this is roughly the same as connecting a dog's anus directly to its mouth and claiming to have invented perpetual motion

Solution: evolve captchas to require humans to train the LLMs directly

Problem: most humans make bad copy editors and contaminate the new data, also there aren't enough thumbs on planet earth to do this anyway

Solution: require humans to complete a captcha before they get a response from their LLM. the captchas will be LLM responses to other humans
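
The dog-anus loop has a toy version you can actually run: each generation "trains" a model (here just a mean/standard-deviation estimate) on samples drawn from the previous generation's model instead of from real data, and statistical diversity drifts away. This is a cartoon of the effect, not a claim about any real LLM:

```python
# Toy version of the feedback loop above: each generation "trains" a
# model (here just a mean/stdev estimate) on samples drawn from the
# previous generation's model, instead of from real data. The estimates
# drift further from the original distribution with each generation.
import random
import statistics

random.seed(0)
mean, stdev = 0.0, 1.0   # generation 0: the "real" data distribution
for gen in range(1, 6):
    sample = [random.gauss(mean, stdev) for _ in range(50)]
    mean = statistics.fmean(sample)
    stdev = statistics.stdev(sample)
    print(f"gen {gen}: mean {mean:+.3f}, stdev {stdev:.3f}")
```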

Canine Blues Arooo
Jan 7, 2008

when you think about it...i'm the first girl you ever spent the night with

Grimey Drawer

Hippie Hedgehog posted:

Seriousposting in the whine thread: Most PCs today not have a neural processing unit (NPU). I think you can make a lot of use of a GPU when training models for AI (ie machine learning), but for very fast inference (ie when evaluating inputs using a model, to make predictions/classifications/whatever) you normally need a specific NPU. (I could be wrong on this part.)

Some phones and tablets do have them, notably Apple were the first, and I'm pretty sure their M-series notebook CPUs have NPUs too. Google put their Tensor chips in some Pixel phones, and I think Intel and AMD have both added an NPU function in their last-gen CPUs.

Right now, I believe most ML inference happens in the cloud, like when you use Midjourney or GPT. Those use huge models that don't make sense to put locally on your PC. In the future, though, these vendors predict you'll have tons of small AI models deployed locally for apps to use, like Apple's camera app magic that compensates for your shaky hands.

NPUs are effectively just a hardware feature set on SoCs. My knowledge on their specifics is limited, so grain of salt and all.

Getting a result from a model means accessing a model loaded in memory, and physical distance to that memory matters a shitload. Putting some 'AI-enabled CPU' in a computer ain't it for this purpose - you are too far away from the actual model. The compute for any work is going to be done on the GPU, which also happens to be well-tailored to this workload.

In an SoC with a CPU, GPU, and memory all on one die (e.g. Apple's M-series chips), the conversation changes a little bit, but the fundamental ideas are the same. It also runs into different sets of problems that limit its ability to do large workloads in that space. What Apple calls 'AI cores' is likely not that different from Nvidia's Tensor cores: compute units that can do mixed-precision matrix math natively. But you can't say 'Matrix Core', you have to brand your poo poo AI-y so it's now an AI core... Actually maybe you could call it a Matrix Core™, that's marketable by itself...

All this is to say that AI isn't magic. Computers are not magic. AI™ is not even AI - It's machine learning dressed up in a fancy suit. It has the same practical uses it's had for decades and the new implementations are shiny and fun and have extended that use case in places, but it's not going to transform computers from 'math boxes' into 'MAGIC'.
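
That "mixed-precision matrix math" has a simple shape: multiply narrow inputs, accumulate in a wider type. A pure-Python cartoon of the idea (int8-range inputs, a wide accumulator); real tensor cores do the same thing in hardware with types like fp16 inputs and fp32 accumulation:

```python
# Cartoon of the mixed-precision idea behind "tensor"/"AI" cores:
# multiply low-precision inputs, accumulate in a wider type. Here the
# inputs are assumed to fit in int8 range and the accumulator plays the
# role of an int32 register. Pure Python, for illustration only.

def matmul_int8(a, b):
    """a: m x k, b: k x n, entries assumed to fit in int8 (-128..127)."""
    m, k, n = len(a), len(b), len(b[0])
    out = [[0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            acc = 0                        # wide (int32-style) accumulator
            for p in range(k):
                acc += a[i][p] * b[p][j]   # narrow int8 x int8 products
            out[i][j] = acc
    return out

print(matmul_int8([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19, 22], [43, 50]]
```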

Canine Blues Arooo fucked around with this message at 21:35 on May 15, 2024

Splicer
Oct 16, 2006

from hell's heart I cast at thee
🧙🐀🧹🌙🪄🐸
*flips switch to 'more magic'*

MrQwerty
Apr 15, 2003

LOVE IS BEAUTIFUL
(づ ̄ ³ ̄)づ♥(‘∀’●)

lol "neural processing unit"
that name carries a lot of hubris

Salt Fish
Sep 11, 2003

Cybernetic Crumb

Canine Blues Arooo posted:

Games aren't going to be running AI for NPCs any time soon, for a variety of reasons. These reasons include:

• Running a local model requires way, way too many resources
• Running a remote model is expensive, and someone has to pay for that.
• Both of these solutions will produce inconsistent results, and there is no way to say, '...And please don't make poo poo up!' to the model
• The local solution will produce substantial latency. The remote solution on big hardware will still produce some latency.

The model size would only be a few gigs, but the sheer resources required to run it would be out of control, especially alongside what is ostensibly a whole game.

So no, this isn't happening except for games that are built around it as the prime feature. Baldur's Gate 4 / Dragon Age Next / Cyberpunk 2078 will not feature this.

You missed "the technology literally can't do this even if it worked perfectly, because it's not an actual AI that understands what is coherent or interesting to a player"

Salt Fish
Sep 11, 2003

Cybernetic Crumb

MrQwerty posted:

lol "neural processing unit"
that name carries a lot of hubris

Either hubris or a sad statement about the current state of human neural processing.

MrQwerty
Apr 15, 2003

LOVE IS BEAUTIFUL
(づ ̄ ³ ̄)づ♥(‘∀’●)

Salt Fish posted:

Either hubris or a sad statement about the current state of human neural processing.

both

Cabal Ties
Feb 28, 2004
Yam Slacker
You guys say this, but with the latest version of ChatGPT I'm having problems actually stopping it from providing me with over 1,000 lines of perfectly written code each time. With the previous model it could be a struggle to get it to produce snippets sometimes, but this one shits out the lot, no questions asked.

I can absolutely see a world where basic flavor text is dynamically generated by an LLM rather than a variant or two hardcoded in
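
A sketch of what "LLM flavor text with hardcoded variants as a fallback" could look like. `call_llm` is a stand-in for whatever model API a game would actually use (nothing here is a real library call), and the NPC lines are invented:

```python
# Sketch of LLM-generated flavor text with hardcoded variants as a
# fallback. call_llm is a stand-in for whatever model API a game would
# actually use; nothing here is a real library call, and the NPC lines
# are invented.
import random

FALLBACK_LINES = {
    "blacksmith": ["Fine steel, fair prices.", "Mind the sparks, traveler."],
}

def call_llm(prompt):
    """Stand-in for a real model call; pretend it failed or timed out."""
    return None

def flavor_text(npc, context):
    line = call_llm(f"One line of dialogue for a {npc}. Context: {context}")
    if line:
        return line                            # use the generated line
    return random.choice(FALLBACK_LINES[npc])  # fall back to a canned variant

print(flavor_text("blacksmith", "the player just sold some ore"))
```

The fallback is the important design choice: when the model is slow, offline, or off the rails, the game still has its afternoon's worth of hardcoded flavour text to serve.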

Cabal Ties fucked around with this message at 00:29 on May 16, 2024

Salt Fish
Sep 11, 2003

Cybernetic Crumb

Cabal Ties posted:

You guys say this but the latest version of chat gpt I’m having problems actually stopping it from providing me with over a 1000 lines of perfectly written code each time. The previous model it could be a struggle to get it to produce snippets sometimes but this one shits out the lot no questions asked.

I can absolutely see a world where basic flavor text is dynamically generated by a LLM rather than a variant or two hardcoded in

Here's a true story for you. Two teams of developers both started using the latest GPT to automate coding. One team reported back that 30% of the code generated by GPT was usable. The other team reported that 50% of the code generated was usable. They were using the same version and same language.

MrQwerty
Apr 15, 2003

LOVE IS BEAUTIFUL
(づ ̄ ³ ̄)づ♥(‘∀’●)

lol

Houle
Oct 21, 2010
Generating code is likely much easier, since it just needs a sequence of steps to achieve a result and the model is already thinking like a computer.


As for the AI-in-games chat, I imagine it would be some sort of middleware solution where companies with long-established franchises would just send text dumps of all the games to train the model used for that series. Kind of like how there are studios that specialize in making really good trees. Programming is already modular; it would simply be another module. You would also likely need to specify limitations on what each character can say, sort of like params.

I don't think it would benefit much though. Maybe something like Animal Crossing or The Sims, but when you just need NPCs to give clues about what to do next, you can write flavour text in an afternoon.

It would also be a much smaller model, since the inputs and choices are tightly controlled. There are only so many defined things you can do in the game, so the reactions would be much simpler to program for.

I imagine this would be more of a console-level thing, where the AI lives on the platform and the games just plug into it with their limitations and other variables to tweak its "personality".

thathonkey
Jul 17, 2012
in my experience the code that ai writes is pretty bad. but i think it may catch on because the code that most people write is also pretty bad

Nettle Soup
Jan 30, 2010

Oh, and Jones was there too.

I tried to get chatgpt to give me a simple-ish excel formula one time and after around ten attempts I gave up.

TotalLossBrain
Oct 20, 2010

Hier graben!

Nettle Soup posted:

I tried to get chatgpt to give me a simple-ish excel formula one time and after around ten attempts I gave up.

You need a prompt engineer

Atopian
Sep 23, 2014

I need a security perimeter with Venetian blinds.

thathonkey posted:

in my experience the code that ai writes is pretty bad. but i think it may catch on because the code that most people write is also pretty bad

Same situation as with text and images then.


Nettle Soup
Jan 30, 2010

Oh, and Jones was there too.

TotalLossBrain posted:

You need a prompt engineer
I think so, I tried to get it to give me some AI art around the same time, and it didn't go great.

[attached images not included]

and an example of bullshitting with absolute confidence:

[attached image not included]
