|
Tunicate posted:license Someone get Nathen Mazri on the phone
|
# ? May 15, 2024 06:06 |
|
Probably best not to assume that someone in marketing means well. They might! Many do! It just wouldn't be my first assumption.
|
# ? May 15, 2024 06:16 |
Atopian posted:Probably best not to assume that someone in marketing means well. I shudder at the thought of an unconscious marketing person.
|
|
# ? May 15, 2024 06:54 |
|
Instagram you knew we needed ads you can't mute as a cool 2000s throwback!!! Thank you so much I just love this nostalgia trip!!!!!
|
# ? May 15, 2024 07:00 |
|
deep dish peat moss posted:I used to work for a company that did a lot of e-commerce and digital marketing research, and all of their research kept coming back with the conclusion that people absolutely love personalized shopping/advertising experiences and personably-written marketing mailing lists. On one hand I want to say it was just bullshit self-congratulatory stuff, but on the other hand I think the average consumer is dumb as hell and loves being told what to spend their money on. So, let's add this up, shall we? The company spent oodles of time, money, and effort providing a more personalized experience, and when they ask their customers what they want, they practically beg for a more personalized experience.
Customer: I want A.
Company: Here you go.
Customer: This is B. I want A.
Company: Here you go.
Customer: I want A.
Company: Here you go.
Customer: ...
Company: Consultant, what does this customer want?
|
# ? May 15, 2024 08:40 |
|
teen witch posted:Instagram you knew we needed ads you cant mute as a cool 2000s throwback!!! Thank you so much I just love this nostalgia trip!!!!! I have a big physical button I can turn down on the 1980s amp that is the base of my PC's sound system. Nothing is unmutable to me. I feel like Nevil Clavain sometimes. He's a character from a bunch of sci-fi books; his deal is that he and others like him got heads full of tech, and a lot of people get their heads compromised by a bad-guy super hacker. Except him, because his implants are so woefully outdated it's like trying to hack a rotary phone.
|
# ? May 15, 2024 08:51 |
|
His Divine Shadow posted:I feel like Nevil Clavain sometimes. Which is a character from a bunch of sci-fi books, his deal is he and others like him got heads full of tech and a lot of people get their heads compromised by a bad guy super hacker. Except him because his implants are so woefully outdated it's like trying to hack a rotary phone. Ghost in the Shell had a character like that. They were a team of cyborgs, but they kept one regular human on the team just in case they went up against a super hacker or something. Thought everyone would like to know that. God bless. wash bucket fucked around with this message at 11:59 on May 15, 2024 |
# ? May 15, 2024 11:46 |
|
DrBouvenstein posted:If you think Google's half-assed, lovely AI helpers in their search is bad now, don't worry... It'll be full-assed soon enough: my big issue with Google AI search summaries is that they're almost always wrong, and you have to dig into them to find out why. I was searching for the dimensions on a specific model of truck bed, and the summaries showed me the dimensions of a different make/model entirely.
|
# ? May 15, 2024 14:10 |
|
Vampire Panties posted:my big issue with Google AI search summaries is that they're almost always wrong, and you have to dig into them to find out why. I was searching for the dimensions on a specific model of truck bed, and the summaries showed me the dimensions of a different make/model entirely. I'm certainly not the first, nor the most eloquent to say this, but this ai poo poo is going to start compounding problems really fast if people build anything of real import on it. If it can't get something basic like "specific truck dimensions" correct, then it can't be trusted for any of the real science that proponents promise over and over it'll revolutionize. Your use case here is the simplest imaginable, it's looking for hard data with no real ambiguity to it. A lot of people have pointed out already that busy people will often default to what's convenient, and if we, say, start genuinely implementing this tech in the medical field, then poo poo's going to go off the rails fast. The tech simply isn't there. I recommended a brand of sticker paper I've had good success with in a crafting subreddit (yuck, I know, but I needed answers for some troubleshooting on my cutter machine) and some little ai bot chimed in with a summary of other reviews that weren't accurate to the product at all. It was claiming a lot of things that were counter to all experiences I've had with it, and I can only figure that it was lumping in all sticker paper reviews together or reporting from people who were using the wrong kind of printer, etc. I wonder if they gently caress up so badly when collating the data they're stealing from everyone.
|
# ? May 15, 2024 14:52 |
|
It's just destroying the casual internet. Data you actually need to do a job hasn't been infected yet. I hope. It probably has
|
# ? May 15, 2024 14:53 |
|
Not to mention all of the AI that's going to get their lovely info from the incorrect AI. Absolutely nothing will be reliable. poo poo's going to get real bad real quick. I wonder if they still sell Encyclopedia Britannica.
|
# ? May 15, 2024 17:00 |
|
Tunicate posted:Similarly, Coca Cola advertisers created the myth that their ads created the modern Santa. Their job is literally lying to the public about how good things are, of course they'll make this poo poo up. Astonishing. Just looked it up and had no problem finding modern Santa ads that predate Coca Cola! The crazy thing is I've only ever heard this factoid in the context of "Our entire culture, all folk wisdom, and basically everything you love is the product of some fucker's toothpick or enema tonic ad campaign in 1802" and not that it was a good thing. I notice advertisers, from the writing of Bernays to the guy I knew who blasted spam out of 4 cable modems in his lovely apartment and turned it into an empire, have this absolutely warped degree of professional myopia where they honestly think there is nothing wrong with what they do and consumers are more than happy to be "educated" with dozens of emails a day and ads crammed into every other tiny facet of their lives. Also, cable modem guy once said he had sex with his cousin on the couch in his server room, but then denied they were related later, saying she was merely a new employee.
|
# ? May 15, 2024 18:37 |
|
|
Pennywise the Frown posted:Not to mention all of the AI that's going to get their lovely info from the incorrect AI. Absolutely nothing will be reliable. Games are going to have locally run LLM models (by the way, this will increase the size of the game by 30-100 GB) driving NPCs. Graphics cards are going to have more (seemingly) genuine empathy than actual people, and gamers who are already introverted are in for a weird time.
|
# ? May 15, 2024 19:44 |
|
moist banana bread posted:Games are going to have locally run LLM models (by the way this will increase the size of the game 30-100 GB) driving NPC's. Why locally? Host that poo poo online so they can switch the servers off after 3 years and make people buy the sequel.
|
# ? May 15, 2024 19:49 |
|
Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100GB of RAM or whatever. I guess my realization is that it's feasible though. It'll be cloud hosted, which, yeah, is convenient for the "games as a service" goons.
|
# ? May 15, 2024 20:08 |
|
moist banana bread posted:Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100GB of RAM or whatever. I guess my realization is that it's feasible though. It'll be cloud hosted, which yea is convenience for the GAAS goons. You sell smart computers to people at a significant markdown, with 2 layers of abstracted, bloated data-gathering software running before it even boots Windows or whatever OS
|
# ? May 15, 2024 20:10 |
|
Getting mad at the in-game AI helper for arguing with me about vagina bones
|
# ? May 15, 2024 20:11 |
|
moist banana bread posted:Yeah you're right, probably not (though not for that reason). We'd all need $10,000 computers with 100GB of RAM or whatever. I guess my realization is that it's feasible though. It'll be cloud hosted, which yea is convenience for the "games as a service" goons. Microsoft is dictating that future Windows versions will require a hardware dedicated AI chip
|
# ? May 15, 2024 20:12 |
|
Games aren't going to be running AI for NPCs any time soon, for a variety of reasons. These reasons include:
- Running a local model requires way, way too many resources.
- Running a remote model is an expense that has to be paid by someone.
- Both of these solutions will produce inconsistent results, and there is no way to say, '...And please don't make poo poo up!' to the model.
- The local solution will produce substantial latency. The remote solution on big hardware will produce some latency.
The model size would only be a few gigs, but the sheer resources required to run it would be out of control, especially alongside what is ostensibly a whole game. So no, this isn't happening except for games that are built around it as the prime feature. Baldur's Gate 4 / Dragon Age Next / Cyberpunk 2078 will not feature this.
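To put rough numbers on "way too many resources", here's a back-of-envelope sketch. The parameter count, precision, and memory bandwidth figures are illustrative assumptions, not benchmarks:

```python
# Back-of-envelope for running an LLM locally alongside a game.
# All figures are assumptions for illustration, not measurements.

def model_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory just to hold the weights (fp16 = 2 bytes per parameter)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

def tokens_per_second(params_billions: float, bytes_per_param: int = 2,
                      memory_bandwidth_gbps: float = 500.0) -> float:
    """Decoding is memory-bound: every generated token streams the
    full weight set through the memory bus roughly once."""
    weight_gb = model_memory_gb(params_billions, bytes_per_param)
    return memory_bandwidth_gbps / weight_gb

# A hypothetical 7B-parameter model in fp16:
print(model_memory_gb(7))           # 14.0 GB of VRAM just for weights
print(round(tokens_per_second(7)))  # ~36 tokens/s on a 500 GB/s card
```

That 14 GB has to coexist with the game's own textures and buffers, which is the "out of control" part.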
|
# ? May 15, 2024 20:15 |
|
sinky posted:Why locally? Host that poo poo online so they can switch the servers off after 3 years and make people buy
|
# ? May 15, 2024 20:34 |
|
Mozi posted:Microsoft is dictating that future Windows versions will require a hardware dedicated AI chip Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards?
|
# ? May 15, 2024 20:37 |
|
deep dish peat moss posted:Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards? Maybe not, but perhaps the best gaming experience will come from using Nvidia brand shovels. Meaning maybe you can have a voice conversation with the uncanny valley in the next installment of your favorite RPG. Or not, ATI virgins.
|
# ? May 15, 2024 20:45 |
|
deep dish peat moss posted:Don't AI things still run off of GPU? So basically they're saying everyone's going to need 2 video cards? Seriousposting in the whine thread: Most PCs today do not have a neural processing unit (NPU). I think you can make a lot of use of a GPU when training models for AI (i.e. machine learning), but for very fast inference (i.e. when evaluating inputs using a model, to make predictions/classifications/whatever) you normally need a specific NPU. (I could be wrong on this part.) Some phones and tablets do have them; notably, Apple were the first, and I'm pretty sure their M-series notebook CPUs have NPUs too. Google put their Tensor chips in some Pixel phones, and I think Intel and AMD have both added an NPU function in their latest-gen CPUs. Right now, I believe most ML inference happens in the cloud, like when you use Midjourney or GPT. Those use huge models that don't make sense to put locally on your PC. In the future, though, these vendors predict you'll have tons of small AI models deployed locally for apps to use, like Apple's camera app magic that compensates for your shaky hands. Hippie Hedgehog fucked around with this message at 21:04 on May 15, 2024 |
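As a concrete illustration of why small local models are feasible at all, here's a toy sketch of the int8 quantization trick that NPUs are built to accelerate. Real toolchains (Core ML, TFLite, ONNX Runtime) do considerably more than this:

```python
import numpy as np

# Toy symmetric int8 quantization: store weights as 1-byte integers plus
# a single fp32 scale, instead of 4-byte floats. NPUs do fast integer
# math on exactly this kind of representation. Illustrative only.

def quantize_int8(weights):
    scale = float(np.abs(weights).max()) / 127.0  # map largest weight to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

np.random.seed(0)
w = np.random.randn(8, 8).astype(np.float32)
q, scale = quantize_int8(w)
print(q.nbytes, "vs", w.nbytes)  # 64 vs 256 bytes: 4x smaller
print(np.abs(w - dequantize(q, scale)).max() <= scale)  # error < one step
```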
# ? May 15, 2024 21:02 |
|
Problem: LLMs require astronomical amounts of quality data, all of which is completely exhausted.
Solution: use synthetic data created by other LLMs to train the LLMs.
Problem: this is roughly the same as connecting a dog's anus directly to its mouth and claiming to have invented perpetual motion.
Solution: evolve captchas to require humans to train the LLMs directly.
Problem: most humans make bad copy editors and contaminate the new data; also there aren't enough thumbs on planet earth to do this anyway.
Solution: require humans to complete a captcha before they get a response from their LLM. The captchas will be LLM responses to other humans.
|
# ? May 15, 2024 21:17 |
|
Hippie Hedgehog posted:Seriousposting in the whine thread: Most PCs today not have a neural processing unit (NPU). I think you can make a lot of use of a GPU when training models for AI (ie machine learning), but for very fast inference (ie when evaluating inputs using a model, to make predictions/classifications/whatever) you normally need a specific NPU. (I could be wrong on this part.) NPUs are effectively just a hardware feature set on SoCs. My knowledge of their specifics is limited, so grain of salt and all. Getting a result from a model means access to a model loaded in memory, and physical distance to that memory matters a shitload. Putting some 'AI-enabled CPU' in a computer ain't it for this purpose - you are too far away from the actual model. The compute for any work is going to be done on the GPU, which also happens to be well-tailored to this workload. In an SoC with a CPU, GPU, and memory all on one die (e.g. Apple's M-series chips), the conversation changes a little bit, but the fundamental ideas are the same. It also runs into different sets of problems that limit its ability to do large workloads in that space. What Apple will call 'AI cores' is likely not that different from Nvidia's Tensor cores: compute units that can do mixed precision matrix math natively. But you can't say 'Matrix Core', you have to brand your poo poo AI-y so it's now an AI core... Actually maybe you could call it a Matrix Core, that's marketable by itself... All this is to say that AI isn't magic. Computers are not magic. AI is not even AI - it's machine learning dressed up in a fancy suit. It has the same practical uses it's had for decades, and the new implementations are shiny and fun and have extended that use case in places, but it's not going to transform computers from 'math boxes' into 'MAGIC'. Canine Blues Arooo fucked around with this message at 21:35 on May 15, 2024 |
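For what "mixed precision matrix math" actually buys, here's a NumPy emulation of the tensor-core recipe: fp16 inputs, fp32 accumulation. This is CPU emulation for illustration; the hardware does the same thing natively:

```python
import numpy as np

# Tensor-core-style mixed precision, emulated: round the inputs to fp16
# (half the memory traffic), but do the multiply-accumulate in fp32 so
# long dot products don't drown in rounding error.

rng = np.random.default_rng(0)
a = rng.standard_normal((256, 256)).astype(np.float32)
b = rng.standard_normal((256, 256)).astype(np.float32)

ref = a @ b  # full fp32 reference

a16 = a.astype(np.float16)  # inputs stored at half precision...
b16 = b.astype(np.float16)
mixed = a16.astype(np.float32) @ b16.astype(np.float32)  # ...fp32 accumulate

rel_err = float(np.abs(mixed - ref).max() / np.abs(ref).max())
print(rel_err < 1e-2)  # fp16 inputs cost ~3 decimal digits, not the whole sum
```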
# ? May 15, 2024 21:30 |
|
*flips switch to 'more magic'*
|
# ? May 15, 2024 23:21 |
|
lol "neural processing unit" that name carries a lot of hubris
|
# ? May 15, 2024 23:23 |
|
Canine Blues Arooo posted:Games aren't going to be running AI for NPCs any time soon for a variety of reasons. These reasons include: You missed "the technology literally can't do this even if it worked perfectly, because it's not an actual AI that understands what is coherent or interesting to a player"
|
# ? May 16, 2024 00:05 |
|
MrQwerty posted:lol "neural processing unit" Either hubris or a sad statement about the current state of human neural processing.
|
# ? May 16, 2024 00:06 |
|
Salt Fish posted:Either hubris or a sad statement about the current state of human neural processing. both
|
# ? May 16, 2024 00:07 |
|
You guys say this, but with the latest version of ChatGPT I'm having problems actually stopping it from providing me with over 1,000 lines of perfectly written code each time. With the previous model it could be a struggle to get it to produce snippets sometimes, but this one shits out the lot, no questions asked. I can absolutely see a world where basic flavor text is dynamically generated by an LLM rather than a variant or two hardcoded in Cabal Ties fucked around with this message at 00:29 on May 16, 2024 |
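For the flavor-text idea, a sketch of what the game-side plumbing might look like. `generate` is a stand-in for whatever local or hosted model gets used; the template, names, and constraints are all hypothetical:

```python
# Hypothetical NPC flavor-text prompting. The model call is stubbed out;
# the point is that the game constrains the prompt rather than letting
# the model free-wheel.

NPC_TEMPLATE = (
    "You are {name}, a {role} in the town of {town}. Stay in character. "
    "Facts you may reference: {facts}. "
    "Never invent quest items, prices, or directions. Reply in one sentence."
)

def flavor_prompt(name, role, town, facts):
    return NPC_TEMPLATE.format(
        name=name, role=role, town=town, facts="; ".join(facts))

def generate(prompt):
    """Stand-in for a real model call (local weights or an API)."""
    return "Hmm? Oh, don't mind the cold forge, iron's been scarce all year."

prompt = flavor_prompt("Maren", "blacksmith", "Oakvale",
                       ["the mines closed last winter", "iron is scarce"])
line = generate(prompt)
print("iron is scarce" in prompt)  # True: the constraints made it into the prompt
```

The hard part, per the earlier objections in this thread, is that nothing in the prompt actually *enforces* "never invent quest items".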
# ? May 16, 2024 00:27 |
|
Cabal Ties posted:You guys say this but the latest version of chat gpt Im having problems actually stopping it from providing me with over a 1000 lines of perfectly written code each time. The previous model it could be a struggle to get it to produce snippets sometimes but this one shits out the lot no questions asked. Here's a true story for you. Two teams of developers both started using the latest GPT to automate coding. One team reported back that 30% of the code generated by GPT was usable. The other team reported that 50% of the code generated was usable. They were using the same version and same language.
|
# ? May 16, 2024 00:42 |
|
lol
|
# ? May 16, 2024 01:16 |
|
Generating code is likely much easier, since it just needs to take a sequence of steps to achieve a result and it is already thinking like a computer. As for the AI-in-games chat, I imagine it would be some sort of middleware solution, where companies with long-established franchises would just send them text dumps of all the games to train the model used for that series. Kind of like how there are studios that specialize in making trees really well. Programming is already modular; it would simply be another module. You would also likely need to specify the limitations on what the character can say, sort of like params. I don't think it would benefit much though, maybe something like Animal Crossing or The Sims, but when you just need NPCs to give clues about what to do next, you can write flavour text in an afternoon. It would also be a much smaller version, since the inputs and choices are tightly controlled. There are only so many things you can do in the game, all of them defined, so the reactions would be much simpler to program for. I imagine this would be more of a console thing, where the games just plug into the AI or whatever with their limitations and other variables to tweak its "personality".
|
# ? May 16, 2024 01:52 |
|
in my experience the code that ai writes is pretty bad. but i think it may catch on because the code that most people write is also pretty bad
|
# ? May 16, 2024 02:20 |
I tried to get chatgpt to give me a simple-ish excel formula one time and after around ten attempts I gave up.
|
|
# ? May 16, 2024 02:41 |
|
Nettle Soup posted:I tried to get chatgpt to give me a simple-ish excel formula one time and after around ten attempts I gave up. You need a prompt engineer
|
# ? May 16, 2024 02:43 |
|
thathonkey posted:im my experience the code that ai writes is pretty bad. but i think it may catch on because the code that most people write is also pretty bad Same situation as with text and images then.
|
# ? May 16, 2024 02:49 |
|
|
# ? Jun 4, 2024 06:01 |
TotalLossBrain posted:You need a prompt engineer and an example of bullshitting with absolute confidence:
|
|
# ? May 16, 2024 02:54 |