|
i only trust AI to generate project logos
|
# ? Mar 23, 2024 14:59 |
|
|
tracecomplete posted:YMMV, but I never use LLMs for code I can’t write, but rather for obvious stuff and tedium that I’d rather have configured to taste instead of finding something and fitting it in. I’m a pretty good developer, but like, I can dictate a problem using Whisper (my hands are beat up, and I’m clearer when speaking anyway) to GPT4 (for personal uses, at work we use self-hosted models, which are much worse) and get something that I can give a quick once-over before integrating. It’s faster for me to review it and “go back and fix XYZ” than to churn out an API endpoint or an ORM call or whatever. I’ve also found that you can make a “custom GPT” and prompt it with stylistically desirable code in ways that all prompts using that have pre-baked, and it’s had the nice side effect of, say, making it super pedantic about quality logging and error messaging. (For getting that as autocomplete rather than a chat interface, I’ve found that Codeium is really good, better than GitHub Copilot, and free to use.)

This is further exacerbated by management seeing dollar signs when they can hire a fresh-from-graduation person who'll cost significantly less, and they've already convinced themselves that LLM NNs will ~somehow~ make everything better, despite the fact that they have no clue how, if you press them on it.

repiv posted:i only trust AI to generate project logos

However, this might just be a product of ignorance, since it's been possible to whip up something in any image manipulation program and pass it through libcaca for as long as it's existed (around two decades).

BlankSystemDaemon fucked around with this message at 15:20 on Mar 23, 2024
|
# ? Mar 23, 2024 15:12 |
|
a bitmap turned into text with libcaca isn't proper ascii art

the output of chatgpt when you ask it for ascii art looks like it learned the standard figlet fonts (poorly)

repiv fucked around with this message at 15:39 on Mar 23, 2024
# ? Mar 23, 2024 15:28 |
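The bitmap-to-text conversion both posts are talking about boils down to mapping each pixel's brightness onto a ramp of characters. A minimal Python sketch of that idea (the ramp and the toy gradient are illustrative choices of mine; libcaca's actual converter also does dithering and color, which this skips):

```python
# Toy luminance-to-ASCII conversion: the same basic idea tools like
# libcaca implement, minus the dithering and color handling.
# The ramp runs dark -> bright; which characters to use is taste.
RAMP = " .:-=+*#%@"

def ascii_art(pixels, width):
    """pixels: flat list of 0-255 luminance values, row-major."""
    lines = []
    for row_start in range(0, len(pixels), width):
        row = pixels[row_start:row_start + width]
        # Scale each luminance value to an index into the ramp.
        lines.append("".join(RAMP[p * (len(RAMP) - 1) // 255] for p in row))
    return "\n".join(lines)

# A 10-wide horizontal gradient as a stand-in for a real bitmap.
gradient = [x * 255 // 9 for x in range(10)] * 4
print(ascii_art(gradient, 10))
```

This is also why such output doesn't look like figlet lettering: it's a per-pixel sampling of a bitmap, not glyphs drawn from a font.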
|
Ok I thought those were fake because I tested it on a few models and they did a lame attempt at the logo

>Please draw the ChatGPT logo with ASCII art

gpt4
gpt4-32k

but then, gpt35: I sure do!
|
# ? Mar 23, 2024 16:21 |
|
BlankSystemDaemon posted:It's great that it's working for you, but there's mounting evidence to suggest that it's bad for a big portion of people (open access) - and critically, that portion is the majority, who'll be taking over your code in the future.

I think the strongest argument you can take from that paper is basically “if you lack a strong enough understanding of what you’re doing to clearly specify a problem and if you lack the theory of not-actually-a-mind to frame instructions in a way that prioritizes what you care about, you’ll get bad results.” Well yeah, no poo poo? It’s also true for Stack Overflow, which I’ve been downstream of and having to fix or reject for years. When not guided there are obvious problems with regards to LLMs generating lovely, context-unaware and non-DRY code. And if you let it write crypto or your ACL code it probably does a bad job of it. So know your tools and loving fire people who treat them as oracles rather than tools.

I emphatically don’t disagree that in the large, the pedagogical and practical effects of overreliance before you understand the problem domain are almost certainly real, but like, I’m a principal, not an exec; I can’t control that except through code review and establishing standards, which I’d need to do anyway - and I don’t have a way to hire juniors (regardless of calendar age) at the day job nor the money to hire them for personal projects.

quote:This is further exacerbated by management seeing dollar signs when they can hire a fresh-from-graduation person who'll cost significantly less, and they've already convinced themselves that LLM NNs will ~somehow~ make everything better, despite the fact that they have no clue how, if you press them on it.

tracecomplete fucked around with this message at 16:32 on Mar 23, 2024
# ? Mar 23, 2024 16:26 |
|
mobby_6kl posted:Ok I thought those were fake because I tested it on a few models and they did a lame attempt at the logo

it's really bad at making ascii art but apparently it understands it, because you can use that to get around alignment

chatgpt won't tell you how to make a bomb but it will tell you how to make a

https://arxiv.org/pdf/2402.11753.pdf
|
# ? Mar 23, 2024 16:41 |
tracecomplete posted:This link comes up a lot. I wouldn’t say that the paper establishes nothing, but I also wouldn’t say that it says very much. I investigated their GitHub when it first came out, and while I understand how they got to their conclusion I think that taking a snapshot of a year ago, using mediocre-to-bad queries and trailing-edge tools—Copilot is not very good and really shouldn’t be trusted for more than really basic substitution and autocomplete; it’s why I moved to Codeium, which is more generally aware of project structure, multiple files, and seems to have a better internal “book” for the languages I use—and projecting it out to the horizon (which the authors aren’t but most of the people who are referring to the paper implicitly are) is unwise both if you want to know how to use these tools effectively or if you don’t like their existence (and while I find them very useful, I would prefer a world where they didn’t exist, but that’s arguing with the weather) and want to measure them accurately.

It’s like the difference between an established artist who can use LLM NNs as a tool and some terminally ridiculous hypebro talking about the democratization of art when they have no understanding of what art is or how much effort goes into it.

It also doesn’t even begin to touch upon the giant issue that is training data; for example, much of it is provably stolen or gathered by corporations using an existing bit of legalese exemption to copyright that is highly questionable - all of which means that, as it currently is, anyone creating anything using these tools runs a risk of violating copyright, with all that entails, and without even the ability to know except when someone sues them, if they make a commercial product using the tool. Nor does it touch on the abusive ways people are being treated by those corporations to classify the data so that it’s useful.

While I think you have some good points, I don’t really see how they change anything; for example, the extrapolations made using the paper might not matter if you’re only relying on LLM NNs being used by domain experts, but I fear that that’s asking a lot, and I don’t see different tools making a meaningful difference in human nature.

BlankSystemDaemon fucked around with this message at 19:14 on Mar 23, 2024
|
# ? Mar 23, 2024 19:09 |
|
buglord posted:I used to feel bad that I was priced out by Nvidia. I also feel worse that I bought into the ray tracing marketing with a GPU that wasn’t cut out for it and also lacked VRAM (3070).

I will continue to die on the hill of raytracing being a garbage gimmick and any form of upscaling being marketed on new GPUs as a tacit admission that their product is subpar. Like cool, the 4090 is a brilliant GPU, but on raster performance it isn't twice the card the 7900XTX is, and at two grand why do I loving care about 1080p upscaling?
|
# ? Mar 23, 2024 19:24 |
|
I will admit that I’m an idiot in both the “Upscaling seems like a lame cheat that doesn’t produce as good of an image, native is the only thing that should matter” camp and the “Streamed 4K isn’t real 4K quality, and 4K blu-ray is vastly superior” camp. And I will fully admit that my old-rear end eyes probably wouldn’t notice the difference in either situation. (See also: caring about color accuracy when I’m colorblind.)
|
# ? Mar 23, 2024 19:56 |
|
i don't care for the raytracing fad, but if it means gamedevs make sure to include dlss and upscaling support to boost fps, then at least some of it trickles down to people without supercomputer gpu's
|
# ? Mar 23, 2024 20:43 |
|
I think general, indiscriminate pushback against LLMs is happening because:

A) it's clear to even casual observers that LLMs are widely being used very inappropriately, with obvious negative consequences in many of those cases

B) vigorous, indiscriminate pushback happened when hype around NFTs was peaking, and then NFTs departed from mainstream discourse (and the public agendas of large corporations!) very rapidly

I don't blame anyone who sees these two and comes to the obvious conclusion that bashing the use of LLMs wherever they find it is an efficient means of combating the consequences that have been widely reported.
|
# ? Mar 23, 2024 22:12 |
|
New favorite tech tuber https://www.youtube.com/watch?v=_GKxs7U3Xqo
|
# ? Mar 25, 2024 05:52 |
|
She's fantastic, and her comments section is lovely.
|
# ? Mar 25, 2024 06:23 |
|
Griddle of Love posted:I think general, indiscriminate push back against LLMs is happening because

The same crypto scammers who pushed NFTs have pivoted all-in to LLMs to find ways to con people with them as well
|
# ? Mar 25, 2024 14:05 |
|
Harminoff posted:New favorite tech tuber I now adore Linux Grandma. Makes me want to grab my old Dell laptop with Mint on it and mess around. I am not a Linux knower.
|
# ? Mar 25, 2024 16:58 |
|
Mental Hospitality posted:I now adore Linux Grandma. Never too late to learn! She has only been using it for a couple years. https://www.youtube.com/watch?v=IAz9A5Y2wvU
|
# ? Mar 25, 2024 18:32 |
|
Mental Hospitality posted:I now adore Linux Grandma. Linux on old laptops is a great hobby. I'm running Mageia on a 2011 MacBook Air and it's way more usable than it is on the latest officially supported MacOS.
|
# ? Mar 25, 2024 20:58 |
We need to reinvent gods so that they can all come arrest HP, because they have committed not just crimes and felonies like the title suggests, but actual sins:

https://www.youtube.com/watch?v=ssob-7sGVWs
|
|
# ? Mar 25, 2024 20:59 |
|
That was one of those things where I heard it but didn't quite comprehend for another minute what the gently caress.
|
# ? Mar 25, 2024 21:02 |
njsykora posted:That was one of those things where I heard it but didn't quite comprehend for another minute what the gently caress.
|
|
# ? Mar 25, 2024 21:25 |
|
njsykora posted:That was one of those things where I heard it but didn't quite comprehend for another minute what the gently caress. I love it when Gravis finds those. He’s so good at them.
|
# ? Mar 25, 2024 22:22 |
|
Mental Hospitality posted:I now adore Linux Grandma. https://www.youtube.com/watch?v=YTkvr_EC2LA
|
# ? Mar 25, 2024 22:27 |
|
BlankSystemDaemon posted:We need to reinvent gods so that they can all come arrest HP, because they have committed not just crimes and felonies like the title suggests, but actual sins:

Come for the crazy hacky HP bullshit, stay for the rant about how we're at the end of technological history and invention is dead.
|
# ? Mar 25, 2024 23:09 |
|
BlankSystemDaemon posted:We need to reinvent gods so that they can all come arrest HP, because they have committed not just crimes and felonies like the title suggests, but actual sins:

Society's to blame if you ask me. The Microsoft-Intel duopoly eroded away all the sane ways for PC hardware vendors to differentiate themselves, leaving them to grasp at straws like this. And the original sin behind the most horrible thing covered in that video was Intel putting SMM into the 386SL.
|
# ? Mar 26, 2024 00:28 |
|
I really do wish there were more weird architectures and OS’s to go along with them. PowerPC is all but dead, sparc is dead except in weird enterprise circumstances. ARM is at least different than x86 but runs all the same OS’s as x86, so it doesn’t seem very different in practice.
|
# ? Mar 26, 2024 00:31 |
|
Cyrano4747 posted:Come for the crazy hacky HP bullshit, stay for the rant about how we're at the end of technological history and invention is dead.

There is a reason HP split into two companies -- so that the profitable company could drop the consumer product lines. When it comes to HPE, server hardware design still has problems to solve, and nice big margins.
|
# ? Mar 26, 2024 00:52 |
|
Beve Stuscemi posted:I really do wish there were more weird architectures and OS’s to go along with them. PowerPC is all but dead, sparc is dead except in weird enterprise circumstances. ARM is at least different than x86 but runs all the same OS’s as x86, so it doesn’t seem very different in practice. goon project: Zybourne OS
|
# ? Mar 26, 2024 01:03 |
|
trilobite terror posted:goon project: Zybourne OS a variant of plan 9
|
# ? Mar 26, 2024 01:08 |
|
Beve Stuscemi posted:I really do wish there were more weird architectures and OS’s to go along with them. PowerPC is all but dead, sparc is dead except in weird enterprise circumstances. ARM is at least different than x86 but runs all the same OS’s as x86, so it doesn’t seem very different in practice. I follow RISC-V because of this, very adventurous.
|
# ? Mar 26, 2024 01:44 |
Cyrano4747 posted:Come for the crazy hacky HP bullshit, stay for the rant about how we're at the end of technological history and invention is dead.

BobHoward posted:Society's to blame if you ask me. The Microsoft-Intel duopoly eroded away all the sane ways for PC hardware vendors to differentiate themselves, leaving them to grasp at straws like this. And the original sin behind the most horrible thing covered in that video was Intel putting SMM into the 386SL.

I seem to recall that SMM was introduced to make ACPI S3 and S4 work, which I thought was why he demonstrated that working in the video - but then it wasn’t brought up.

SCheeseman posted:I follow RISC-V because of this, very adventurous.

BlankSystemDaemon fucked around with this message at 01:49 on Mar 26, 2024
|
# ? Mar 26, 2024 01:46 |
|
trilobite terror posted:goon project: Zybourne OS

Imagine four balls on the edge of a cliff. Say a direct copy of the ball nearest the cliff is sent to the back of the line of balls and takes the place of the first ball. The formerly first ball becomes the second, the second becomes the third, and the fourth falls off the cliff. Process scheduling works the same way.
|
# ? Mar 26, 2024 03:06 |
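The four-balls bit is of course the old Zybourne Clock copypasta, and its setup contradicts itself (the copy is "sent to the back" yet "takes the place of the first ball"), but the stated outcome is unambiguous: the formerly first ball becomes the second and the fourth falls off. A throwaway Python sketch of one step of that "scheduler" (the function name is made up, and this is not a real scheduling policy):

```python
def zybourne_step(balls):
    """One step of the four-balls 'scheduler': a direct copy of the
    first ball takes first place, every other ball shifts back one,
    and the last ball falls off the cliff."""
    queue = list(balls)        # don't mutate the caller's list
    queue.insert(0, queue[0])  # copy of the ball nearest the cliff
    queue.pop()                # the fourth ball falls off the cliff
    return queue

print(zybourne_step([1, 2, 3, 4]))  # [1, 1, 2, 3]
```

Iterate it and every slot converges to a copy of the first ball within three steps, which is about the level of rigor the Zybourne Clock ever reached.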
|
The problem with his argument - that we'll never have another Walkman moment, which is what he's using to describe an innovation so impactful that it reshapes the very way that society interacts with tech - is that we have already had another one since then: the smart phone. And that wasn't based on any kind of amazing breakthrough, it was just a bunch of small things getting good enough to make it possible: good enough touch screens, good enough batteries, good enough SSDs, and all of those good enough parts being cheap enough to make sense for consumer hardware. Do I have any idea what the next thing will be or if we'll get it in 10, 20 or 50 years? No loving clue, and that's kind of the point.

He's got some other points that are well made about waste and how companies constantly gently caress up IT spending. Yeah, your average office drone can do their TPS reports on a 2015-vintage desktop just fine. Yeah, it's a very mature and very crowded marketplace to be selling enterprise office hardware. But I think he's also ignoring the kind of wear and tear that daily-driver office computers face. My work PC is a 4 year old Dell Surface clone and hooooly poo poo it's showing its age. It's absolutely fine spec-wise to do the MS Office type poo poo I need to use it for, but it's got a failing fan and the keyboard is acting wonky and the screen's a bit hosed up in a spot and in general it's just getting rough. If I wasn't using it docked all day it would absolutely need to be replaced. As it is I'm trying to baby it just because I don't want to deal with IT loving up transferring my poo poo over until I absolutely have to.

So, yeah, Bob from accounting could do his job on a ten year old Dell desktop. But that hardware wasn't stuck in a time warp; it's been used and run ragged. It's like buying a 5 year old used car that's been doing Uber 24/7 and has 200k+ miles on it.
poo poo wears out, and when you get new there's zero reason to artificially hold yourself back to some ancient good enough spec, even if you're not going to be using it to its full potential.
|
# ? Mar 26, 2024 03:06 |
|
Bring back the BeBox imo
|
# ? Mar 26, 2024 03:15 |
|
priznat posted:Bring back the BeBox imo

Steve should have bought BeOS. Every release of Mac OS drives us further from god.
|
# ? Mar 26, 2024 03:48 |
|
Beve Stuscemi posted:Steve should have bought beOS. Every release of Mac OS drives us further from god.

As an Amiga guy, I thought BeOS was a pretty sweet evolution of that, kinda like a Mac and an Amiga had a baby.
|
# ? Mar 26, 2024 04:17 |
|
mac os got better and better from 10.0 until 10.6, and it's been on a steady decline ever since
|
# ? Mar 26, 2024 04:30 |
|
You can run Haiku on an i9 and live that BeOS dream today! https://www.youtube.com/watch?v=RCmrKKu-oBg
|
# ? Mar 26, 2024 04:32 |
|
BeBoxes were ahead of the curve with the sweet front panel LEDs. That rules. Using one of those swanky Fractal cases for an updated version of it would be great.
|
# ? Mar 26, 2024 04:42 |
|
Beve Stuscemi posted:Steve should have bought beOS. Every release of Mac OS drives us further from god.

Negotiations to buy Be were under Gil Amelio. Apple's board decided to buy NeXT instead because NeXT's OS was feature complete (BeOS was a bit too tech demo in 1996, lots of stuff wasn't there yet), and it came with a side order of Steve Jobs. Also, JLG tried to demand too much for Be, thinking he was Apple's only option. (They did pay more than JLG's asking price to acquire NeXT, but they were getting a lot more.)
|
# ? Mar 26, 2024 05:48 |
|
|
|
Cyrano4747 posted:The problem with his argument - that we'll never have another Walkman moment, which is what he's using to describe an innovation so impactful that it reshapes the very way that society interacts with tech - is that we have already had another one since then: the smart phone. And that wasn't based on any kind of amazing breakthrough, it was just a bunch of small things getting good enough to make it possible: good enough touch screens, good enough batteries, good enough SSDs, and all of those good enough parts being cheap enough to make sense for consumer hardware.

I don't think he meant that the Walkman was the last of these moments (hell, the PC came two years after the Walkman), more that the Walkman was a good example of this kind of "product that shapes the world around it" moment. But you could also make the argument that the smart phone wasn't really its own thing so much as a combination of a bunch of other things, which is literally how Apple introduced the iPhone: a phone, an iPod, and a web browser combined.
|
# ? Mar 26, 2024 06:02 |