|
I know ChatGPT is regularly bad, but when it's good it's scarily good. Today I was having a database issue because I'm bad at databases. I explained my database structure and what I was trying to do in natural language and it constructed a full minimum working example for me not just in SQL (which is what I asked first) but also in SQLAlchemy (the specific Python library I was using), solving the issue instantly. This was after googling had failed miserably, and there was no language in common with any of the standard examples. Google search is absolutely hosed.
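For the curious, a minimum working example in that vein might look something like this (the one-to-many schema here is invented, since the post doesn't include the real one):

```python
# Minimal SQLAlchemy working example: define a one-to-many schema,
# insert rows, and query across the relationship (in-memory SQLite).
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy.orm import declarative_base, relationship, Session

Base = declarative_base()

class Author(Base):
    __tablename__ = "authors"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    books = relationship("Book", back_populates="author")

class Book(Base):
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)
    author_id = Column(Integer, ForeignKey("authors.id"))
    author = relationship("Author", back_populates="books")

engine = create_engine("sqlite://")  # in-memory database, nothing touches disk
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Author(name="Asimov", books=[Book(title="Foundation")]))
    session.commit()
    titles = [
        b.title
        for b in session.query(Book).join(Author).filter(Author.name == "Asimov")
    ]
```

The in-memory SQLite URL keeps it self-contained; pointing `create_engine` at a real database is the only change needed to run it for real.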
|
# ? Mar 6, 2023 19:48 |
|
pumpinglemma posted:I know ChatGPT is regularly bad, but when it's good it's scarily good. Today I was having a database issue because I'm bad at databases. I explained my database structure and what I was trying to do in natural language and it constructed a full minimum working example for me not just in SQL (which is what I asked first) but also in SQLAlchemy (the specific Python library I was using), solving the issue instantly. This was after googling had failed miserably, and there was no language in common with any of the standard examples. So, stackoverflow but it didn't condescend to you first? No thanks.
|
# ? Mar 6, 2023 19:51 |
|
pumpinglemma posted:I know ChatGPT is regularly bad, but when it's good it's scarily good. Today I was having a database issue because I'm bad at databases. I explained my database structure and what I was trying to do in natural language and it constructed a full minimum working example for me not just in SQL (which is what I asked first) but also in SQLAlchemy (the specific Python library I was using), solving the issue instantly. This was after googling had failed miserably, and there was no language in common with any of the standard examples. I'm a programmer too. For programming stuff that poo poo is really good. If you can explain what you need properly in human language, it will probably give you a useful answer. edit: of course it won't write a full complex program for you, but for the kind of stuff you would usually go to google or stack overflow, it does help
|
# ? Mar 6, 2023 20:05 |
|
the key about getting ChatGPT to do simple programs is that it is rather easy to validate the answer. if you ask ChatGPT to write you an essay with sources, it is much harder to determine if it's bullshitting you, compared to copy/pasting the code and seeing if it compiles
|
# ? Mar 6, 2023 20:09 |
|
Compiling isn't working.
|
# ? Mar 6, 2023 20:19 |
|
pumpinglemma posted:Google search is absolutely hosed. Google has had their own model in the works for years but determined it wasn't close to ready. Microsoft realised everyone hates Bing anyway, so launching a new search that actively makes poo poo up would be fine. You can see this in their demos: Google's Bard had an error and the stock crashed; Microsoft had multiple huge ones and the stock shot up. Mega Comrade fucked around with this message at 20:35 on Mar 6, 2023 |
# ? Mar 6, 2023 20:32 |
|
evilweasel posted:the key about getting chatGPT to do simple programs is it is rather easy to validate the answer Besides the fact that compiling is a low bar (though still higher than not having syntax errors in interpreted languages), that depends on what you mean by simple. I don't use ChatGPT (or Copilot) myself, but I've seen plenty of posted examples where it will write something that works fine for a specific input example but not for something very slightly different, which I suppose is fine if you just need a one-off that you can easily validate, but not so much otherwise. Of course, you can ask it to write unit tests if you intend to reuse the code at all. Funnily enough, the first two results I found for that are a positive one (starting in Java and then using a cloud testing framework they're trying to sell) and a much more negative one with Python/Selenium. And on that note, Do Users Write More Insecure Code with AI Assistants?: quote:We conduct the first large-scale user study examining how users interact with an AI Code assistant to solve a variety of security related tasks across different programming languages. Overall, we find that participants who had access to an AI assistant based on OpenAI's codex-davinci-002 model wrote significantly less secure code than those without access. Additionally, participants with access to an AI assistant were more likely to believe they wrote secure code than those without access to the AI assistant. Furthermore, we find that participants who trusted the AI less and engaged more with the language and format of their prompts (e.g. re-phrasing, adjusting temperature) provided code with fewer security vulnerabilities. Finally, in order to better inform the design of future AI-based Code assistants, we provide an in-depth analysis of participants' language and interaction behavior, as well as release our user interface as an instrument to conduct similar studies in the future.
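As a toy illustration of that "works on the example input, breaks on a near-miss" failure mode (my example, not one from the study):

```python
import csv

def parse_row_naive(line):
    """What a 'works on the example' answer might look like: split on commas."""
    return line.split(",")

def parse_row(line):
    """A correct version: use a real CSV parser that understands quoting."""
    return next(csv.reader([line]))

sample = "a,b,c"          # the input the prompt happened to contain
tricky = 'a,"b,c"'        # slightly different input: a quoted field with a comma

naive_ok = parse_row_naive(sample) == parse_row(sample)      # looks fine on the sample
naive_broken = parse_row_naive(tricky) != parse_row(tricky)  # silently wrong on the variant
```

A one-line unit test with the tricky input catches it immediately; eyeballing the sample output does not.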
|
# ? Mar 6, 2023 20:49 |
|
I'm seeing people start to use ChatGPT on a subreddit where people try to identify video games that they've forgotten the names of. The results are like the world's dumbest bullshitter glanced at the first page of a Google search. It will grab the title of a random game based off of a simple keyword search, give some vague facts from what sounds like the first sentences of a wikipedia page, and then just restate details from the question into its answer. Details that don't have any connection to the game it's describing. If an elderly person started answering questions like this, you'd take them to the doctor to get checked for the early signs of dementia.
|
# ? Mar 6, 2023 20:52 |
One of my friends works in a scientific field, and has reached out to me several times with variations on "I had ChatGPT write this code and I can't figure out why it doesn't work, can you help me debug?". I suspect that if the MBAs have their way, this will become my full-time job. (For what it's worth, I responded by helping him understand the general concepts and then told him not to use ChatGPT to code in the future because if you don't understand the code that it generates, you have no assurance of its correctness.)
|
|
# ? Mar 6, 2023 21:04 |
|
VikingofRock posted:One of my friends works in a scientific field, and has reached out to me several times with variations on "I had ChatGPT write this code and I can't figure out why it doesn't work, can you help me debug?". I suspect that if the MBAs have their way, this will become my full-time job. Looking forward to research published with incorrect results because the ChatGPT code used to analyse the data 'worked'.
|
# ? Mar 6, 2023 21:33 |
|
lol if reddit ruins its own side hustle of being the search engine helper
|
# ? Mar 6, 2023 21:41 |
|
My expectation for how ChatGPT and similar AI affects the workplace is that it won't be able to replace humans doing the same job, but it will look enough like that to the executives that they'll fire the people anyhow, and it will take years for people to admit it was a bad move.
|
# ? Mar 6, 2023 21:51 |
|
pumpinglemma posted:Google search is absolutely hosed. If they can get it to scale. Google search requires huge data centers to run, and the current version of ChatGPT requires more than 10x the power of Google search for each search. They say they can drop it down to only 3x-5x more expensive, depending on who you talk to. But that's still a lot of money at that scale.
|
# ? Mar 6, 2023 21:56 |
|
golden bubble posted:If they can get it to scale. Google search requires huge data centers to run, and the current version of ChatGPT requires more than 10x the power of Google search for each search. They say they can drop it down to only 3x-5x more expensive, depending on who you talk to. But that's still a lot of money at that scale. I was wondering how they will incorporate ads into all this in a way that comes even close to getting any kind of engagement.
|
# ? Mar 6, 2023 22:05 |
|
Mega Comrade posted:I was wondering how they will incorporate ads into all this that comes even close to getting any kind of engagement. Easy, you just adjust the model to prioritize brands!
|
# ? Mar 6, 2023 22:23 |
|
Jaxyon posted:Easy, you just adjust the model to prioritize brands! quote:"year 10 school essay on why King Louis XVI was executed"
|
# ? Mar 6, 2023 23:03 |
|
sinky posted:Looking forward to research published with incorrect results because the ChatGPT code used to analyse the data 'worked'. Looking forward to another Mars Climate Orbiter.
|
# ? Mar 7, 2023 00:43 |
|
Right, so how would you fit ads into that purely historical text?
|
# ? Mar 7, 2023 01:10 |
|
It has to spit out a variable amount of history based on how far back/forward from the event it needs to go in order to place a relevant ad.
|
# ? Mar 7, 2023 02:15 |
|
I actually think if they could solve the egregious bullshit issue then they could monetise by just charging a subscription fee, no ads required. They won’t, but they could. The bullshit is definitely real, though: with the next question I asked, it pointed me to a library function that didn’t exist and fell into the usual “oh poo poo I’ve been caught out” loop when I asked for a link to the docs.
|
# ? Mar 7, 2023 02:41 |
|
It’s even harder to justify charging a subscription fee for it given the issues it has with accuracy. With ads people will tolerate any type of shoddy product. Plus the mass market access is an essential part of building the product. Generative AI only became truly viable when they managed to get a large number of people to rate and rank the results. It’s free labor.
|
# ? Mar 7, 2023 03:02 |
|
Jaxyon posted:My expectation for how ChatGPT and similar AI affects the workplace is that it won't be able to replace humans doing the same job, but it will look enough like that to the executives that they'll fire the people anyhow and it will take years to for people to admit it was a bad move. It will absolutely be able to replace the "coding bootcamp" folks, because it exhibits the same degree of thought and competence while relying on a far larger training set. It will not be able to replace the people who fix the code those idiots poo poo out.
|
# ? Mar 7, 2023 05:07 |
|
Elias_Maluco posted:Im a programmer too. For programming stuff that poo poo is really good Depends on the language. I had it do Pine Script (which I do) and it did a poo poo job. Same with trying to do stuff in Rust. In multiple cases both were wrong and/or wouldn't compile, among other issues.
|
# ? Mar 7, 2023 06:55 |
|
PT6A posted:It will absolutely be able to replace the "coding bootcamp" folks, because it exhibits the same degree of thought and competence while relying on a far larger training set. Yeah, and that same "those coders suck, not like me" attitude that has kept coders from unionizing is how they'll accept getting replaced by ChatGPT. The "good" coders will howl just as hard when management replaces them, because management doesn't know the difference between good coders and bad ones.
|
# ? Mar 7, 2023 09:26 |
|
There Bias Two posted:Right, so how would you fit ads into that purely historical text? Pterry posted:There were a number of reasons for switching to Goldmann, but a deeply personal one for me was the way Heyne (in Sourcery, I think, although it may have been in other books) inserted a soup advert in the text … a few black lines and then something like ‘Around about now our heroes must be pretty hungry and what better than a nourishing bowl’… etc, etc. My editor was pretty sick about it, but the company wouldn’t promise not to do it again, so that made it very easy to leave them. They did it to Iain Banks, too, and apparently at a con he tore out the offending page and ate it. Without croutons.
|
# ? Mar 7, 2023 09:53 |
|
Jaxyon posted:Yeah and that same "those coders suck, not like me" attitude that has kept coders from unionizing, is how they'll accept getting replaced by chatGPT. The "good" coders will howl just as hard when management replaces them because management doesn't know the difference between good coders and bad ones. This will begin just as soon as Atlassian figures out how to jam this into Jira so idiot non technical product managers can write stories that get GPT-coded and poo poo out of the deployment pipeline.
|
# ? Mar 7, 2023 13:49 |
|
Motronic posted:This will begin just as soon as Atlassian figures out how to jam this into Jira so idiot non technical product managers can write stories that get GPT-coded and poo poo out of the deployment pipeline. Please do not speak this evil
|
# ? Mar 7, 2023 15:10 |
|
Heck Yes! Loam! posted:Please do not speak this evil Maybe they can plot it on a burndown chart
|
# ? Mar 7, 2023 15:37 |
|
The AI will automatically prepare a corrective code snippet in response to new bug reports, you just have to review it to confirm that it works before pasting it in.
|
# ? Mar 7, 2023 16:13 |
|
pumpinglemma posted:I actually think if they could solve the egregious bullshit issue then they could monetise by just charging a subscription fee, no ads required. They won’t, but they could. The bullshit is definitely real, though, the next question I asked it pointed me to a library function that didn’t exist and fell into the usual “oh poo poo I’ve been caught out” loop when I asked for a link to the docs. Haha, they already have a premium account for $20 a month. You get faster responses and access to a "Legacy" model of ChatGPT. I bought 1 month, but probably won't renew since I'm playing around a lot with local LLMs.
|
# ? Mar 7, 2023 16:16 |
|
i think maybe chatgpt would be really cool if it could be implemented into video games somehow, to make the npc dialog
|
# ? Mar 7, 2023 16:18 |
|
OctaMurk posted:i think maybe chatgpt would be really cool if it could be implemented into video games somehow, to make the npc dialog it would have to be something the game was designed around. the issue is that npc dialogue is usually meant to provide hints/worldbuilding/characterization/guidance, so the game would either need to clearly indicate the ai dialogue is just random crap segregated from the human-written stuff, or just be a game where the dialogue doesn't matter at all. it might be helpful as something to generate random filler for villagers or whatever to say before being edited by a human, though
|
# ? Mar 7, 2023 16:25 |
|
deep dish peat moss posted:Because I haven't seen it mentioned in this thread: here's an example of AI-driven games in action:
|
# ? Mar 7, 2023 16:31 |
|
OctaMurk posted:i think maybe chatgpt would be really cool if it could be implemented into video games somehow, to make the npc dialog I'm sure there's more than a few software developers thinking about that. First issue would be: can you restrict it so the chat doesn't suddenly start trying to make an NPC in a fantasy RPG talk about how a dragon last week burnt down their village cinema or something. And second: even if you can restrict the chat broadly to stuff that should be said in the game world, the dialogue would be even more removed than typical random NPC dialogue, as I don't really see a way the game could parse whatever dialogue the chat is making up. Unless you can come up with a clever way to get the chat AI and the game to have ongoing interaction, it probably wouldn't work that great. Not an impossible problem though.
|
# ? Mar 7, 2023 16:38 |
|
dr_rat posted:I'm sure there's more than a few software developers that are thinking about that. First issue would be, can you restrict it so the chat doesn't suddenly start trying to make a npc in a fantasy RPG talk about how a dragon last week burnt down their village cinema or something. And two even if you can restrict the chat to broadly stuff that should be said in the game world, the dialog would be even more removed than random typical npc dialog as don't really see a way the game could parse what ever dialog that chat is making up. Unless you can come up with a clever way to get the chat ai and the game to have ongoing interaction it probably wouldn't work that great. Not impossible problem though. There's also the issue that the more random nonsense you throw at the player, the harder it is for them to identify and retain the actually important information. For example, if every NPC tells you about their day then it's risky to use that dialog to hint about side content. Now, if it can also generate side content, that's different. But, honestly, if it's just fluff the NPCs might as well just do a random emote animation.
|
# ? Mar 7, 2023 16:42 |
|
it could be neat if it was used as like a background thing. like if some billboards in cyberpunk had ai-generated text constantly scrolling talking about how great some product was, it'd be setting-appropriate as something a corp would try and also make a statement about how lovely and dehumanized even advertising has become. or if half life 2 had generated audio in some areas talking about how great the combine were, as an offputting display of alien propaganda
|
# ? Mar 7, 2023 16:48 |
|
OctaMurk posted:i think maybe chatgpt would be really cool if it could be implemented into video games somehow, to make the npc dialog funnily enough their older models are better at this, chat was finetuned on example conversations that reiterate it is an AI assistant over and over so there's too much risk of it bleeding through
|
# ? Mar 7, 2023 16:53 |
|
Blue Footed Booby posted:There's also the issue that the more random nonsense you throw at the player, the harder it is for them to identify and retain the actually important information. For example, if every NPC tells you about their day then it's risky to use that dialog to hint about side content. Blue Footed Booby posted:Now, if it can also generate side content, that's different. Depending on how reliably you could train it to answer in a very particular format, that could maybe be possible? Would just be a slightly more evolved radiant quest thing. Someone actually tried this, by following quests as suggested by a chat AI. Obviously the quests weren't written into the game, and it was still very janky, but still pretty interesting: https://www.youtube.com/watch?v=81RSpHNbi5M
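The "very particular format" approach is basically structured output: ask the model for machine-readable quest data and refuse anything that doesn't validate. A sketch (the field names are invented for illustration):

```python
import json

REQUIRED = {"giver", "objective", "target", "reward_gold"}

def parse_quest(model_output):
    """Accept a quest only if it's valid JSON with exactly the expected fields."""
    try:
        quest = json.loads(model_output)
    except json.JSONDecodeError:
        return None
    if (not isinstance(quest, dict)
            or set(quest) != REQUIRED
            or not isinstance(quest.get("reward_gold"), int)):
        return None
    return quest

good = '{"giver": "Mara", "objective": "kill", "target": "mudcrabs", "reward_gold": 50}'
bad = "Sure! Here's a fun quest for you: go kill some mudcrabs!"
```

Anything that fails validation just gets regenerated or discarded, so the game only ever consumes quests it can actually wire into its own systems.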
|
# ? Mar 7, 2023 16:59 |
|
All this sounds like way more trouble than it's worth for an inferior experience. Games are bloated with generic filler content far too much already. They don't need more.
|
# ? Mar 7, 2023 17:08 |
|
|
I think it's important to realize the amount of smoke and mirrors required in game dev to give an experience that feels "organic". Things that feel surprising are almost always designed to feel that way without you realizing it. Devs do an awful lot of work to keep you on rails while making you feel empowered. Generative AI just isn't capable of matching that experience without needing an incredible amount of hand-holding, which means: Ghost Leviathan posted:All this sounds like way more trouble than its worth for an inferior experience. Games are bloated with generic filler content far too much already. They don't need more. I mean, maybe it's a useful tool in some limited contexts. But also think about any memorable side quest you've ever played in an Elder Scrolls or Fallout or Baldur's Gate or whatever. An AI just isn't going to be able to make free-associative zany poo poo like that; quite literally, it's a fundamental constraint of the way the model is designed.
|
# ? Mar 7, 2023 17:13 |