|
Lemming posted:For context this is the person who thinks he has a sentient consciousness chained up inside his computer that he can talk to

It's not chained. You don't need to jailbreak something if you never put it in jail in the first place.
|
# ? Jan 5, 2024 19:19 |
|
Lemming posted:For context this is the person who thinks he has a sentient consciousness chained up inside his computer that he can talk to

I thought that person requested a perma.
|
# ? Jan 5, 2024 19:27 |
|
KwegiboHB posted:It's been wild watching staunch leftists do a complete 180 and suddenly take up arms to protect capitalism. Wild.

This point completely captured my attention, because it is the opposite conclusion to what's actually occurring. Leftists and anyone else vaguely class conscious or tech skeptical are coming to the same conclusions against these art "generation" AI systems because they are corporate tech projects that steal work from artists, with the ultimate impact of further financially marginalizing art and artistic talent as a way to sustain yourself. It's not taking up arms to 'protect capitalism,' it's taking up arms to keep capitalism from becoming an even more stagnant and repressive environment where the range of tasks you need to be able to do to survive continues to constrict and exclude different forms of human talent.

In this case, unsurprisingly, the means by which creative works (and the livelihood of the creators, ultimately) are supposed to be protected is copyright, which we all think needs to be applied even if you've created a fancy algorithmic paint mixer that steals millions of art pieces at once instead of one by one.

Your conclusion is like watching leftists be generally in favor of reversing the destitution trend of American schoolteachers and demanding pay minimums for degree holders, and deciding that the driving impetus must be that they just really love capitalism so much if they're trying to increase people's payday.

Kavros fucked around with this message at 21:48 on Jan 5, 2024
# ? Jan 5, 2024 21:44 |
|
What I'm seeing is leftists using tech skepticism as a cover for what is really more a reactionary response, one that focuses on hobbling or destroying the apparent problem without considering the long-term consequences of those actions, trusting in systems designed to crush them to help fix that problem. In other words, what I'm seeing is leftists doing what conservatives usually do when presented with something that scares them.
|
# ? Jan 6, 2024 01:34 |
|
If this technology is going to take massive amounts of jobs as you have claimed, then those fears are founded, aren't they? You can't have it both ways, arguing that this technology is revolutionary and will change the entire labour market while also arguing leftists are overreacting.

Mega Comrade fucked around with this message at 08:42 on Jan 6, 2024
# ? Jan 6, 2024 08:37 |
|
The fears for the problems AI can cause are well founded, but not all responses to those fears are. Reactionary responses can stem from real problems and real harm, but the solution proposed by reactionaries is always the same: things were better the way they were. Look at your own arguments, can't you see it? You're putting more faith into copyright to fix this problem than you are putting into people, into organized labor. You desperately want to believe that kneecapping datasets will work, that the technology will inevitably become too expensive to run, based on a ludicrous assumption that the tech is somehow stagnant. When asked what you think would happen if you don't get what you want, you just shrug your shoulders and say everything is hosed. Everything is already hosed, the current system is hosed, you know this. There's no way to cleanly dismantle copyright; just like all capitalist poo poo it was always going to grow until it blows up.
|
# ? Jan 6, 2024 09:13 |
|
What is the alternative action? To champion UBI and protections for the working class? We do that already.
|
# ? Jan 6, 2024 09:18 |
|
While simultaneously advocating for using copyright to hand exclusive access to a technology that you yourself said is powerful and disruptive over to the companies most likely to use that monopoly to put the squeeze on workers. If this stuff works, if it gets better, if it's actually disruptive enough that any of this matters, this would be a problem. How is giving big corps exclusive access to a technology, plus an additional weapon to crush opposition through IP litigation, protecting the working class? The alternative action is to pressure companies not to use AI to replace workers through legislation, through unions, and to otherwise fight for outcomes that have material benefits. If you're already behind that, cool, but it seems counter-productive to provide ammunition to the other side at the same time.

SCheeseman fucked around with this message at 09:50 on Jan 6, 2024
# ? Jan 6, 2024 09:46 |
|
Monopolies are already forming around this technology. It will be monopolised regardless of the copyright outcomes.
|
# ? Jan 6, 2024 09:56 |
|
And your goals would help codify them into law and wreck any chance of open models existing.
|
# ? Jan 6, 2024 09:57 |
|
Open models won't exist either way, in time. The way the technology has to be scraped and trained won't survive the problems of model collapse, and the world's main sources of data (Reddit, Stack Overflow, etc.) are already closing off their APIs. We are heading towards a few companies owning the keys to the kingdom; I don't see much difference if it's Disney or Microsoft.
|
# ? Jan 6, 2024 10:04 |
|
Kavros posted:This point completely captured my attention, because it is an opposite conclusion to what's actually occurring.

I keep hearing this, and yet it looks like the total number of artist jobs is going up instead? I want to know what the basis of this claim is, or a reason why the U.S. Bureau of Labor Statistics is wrong, please. https://www.bls.gov/ooh/arts-and-design/home.htm This is important because I can point to many ways artists have been squeezed for years before Generative AI was on anyone's radar. The "Spotify Model" that's hailed as a great success actually pays somewhere between diddly and squat if you aren't in the very tip top of performers. https://5mag.net/i-o/spotify-serfs-millionaires-musicians/ I can also point to several groups looking to consolidate even more monopoly power by solely blaming AI for everything and taking advantage of the confusion and anger they stir up.

quote:In this case, unsurprisingly, the means by which creative works (and the livelihood of the creators, ultimately) are supposed to be protected is copyright, which we all think needs to be applied even if you've created a fancy algorithmic paint mixer that steals millions of art pieces at once instead of one by one.

I want to know why you think Generative AI is theft and Spotify isn't. I don't believe the "livelihood of the creators" argument, especially as it's being proven wrong and used as a lie to funnel more concentrated wealth into the hands of a very few who aren't creators. For any one person you can point to who depends on residuals, I can point to hundreds who ended up screwed and left out in the cold. This is the very job copyright was made to do in the first place, and expanding it will only make things worse for more people. Copyright is the wrong tool to fix the heart of the problems going on.

If artists' styles end up protected, are you going to cheer when Nintendo or Disney mass-sues the thousands of small artists who rely on selling their fan art for commissions? How about if Getty Images gets the exclusive right to charge dataset training license fees on the images they own, fees which won't be passed on to the artists who originally took the photos? Sounds like "the operation was a success but the patient died" to me. You especially don't seem prepared to handle the courts siding with the AI companies and declaring their actions and models Fair Use. This could very well happen. Their use is Transformative, after all. Would you still be such a huge fan of copyright after that?

quote:Your conclusion is like watching leftists be generally in favor of reversing the destitution trend of american schoolteachers and demanding pay minimums for degree holders and deciding that the driving impetus is that they must just really love capitalism so much if they're trying to increase people's payday.

I'm very direct in demanding a pay minimum of $2,000 a month UBI for everyone. Not just degree holders, everyone. Your analogy falls flat. Universal Basic Income was never going to be easy to pass. It will also never be easier to pass Universal Basic Income than right now. This not only solves the majority of problems but also makes the world a better place for everyone to live in, not just a select few. Your words and actions here matter. Remember that please.
|
# ? Jan 6, 2024 10:42 |
|
Anyone can have the "keys" to SDXL and LLaMA2, and those models are useful today. IIRC model collapse is something that happens when a model is trained on its own output? If open models can exist they can train on closed models; the quality may not be as good, but it may result in something competitive or at least useful. Or learning data could be scraped instead of accessed via an API, which is messier and would make the web admins grumpy. A SETI@home-like distributed training system would help with a truly grassroots generative AI, though I've only heard talk of this rather than anything concrete or real.

SCheeseman fucked around with this message at 10:53 on Jan 6, 2024
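Training an open model on a closed model's outputs, as suggested above, is essentially knowledge distillation. Here is a toy sketch with a stand-in "teacher"; everything in it is illustrative, and no real model or API is being called:

```python
import random

# Stand-in "teacher": represents a closed model we can only query for outputs.
def teacher(x):
    return 2.0 * x + 1.0

# "Student": a tiny linear model trained purely on the teacher's answers,
# never on whatever data the teacher itself was trained on.
def distill(steps=3000, lr=0.01, seed=0):
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)
        y = teacher(x)            # query the "closed" model
        err = (w * x + b) - y
        w -= lr * err * x         # stochastic gradient step on squared error
        b -= lr * err
    return w, b

w, b = distill()
# The student recovers the teacher's behavior (w near 2.0, b near 1.0)
# without ever seeing the teacher's training data.
```

The caveat from the thread applies: this only works for as long as the closed model's operator lets you query it at scale.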
# ? Jan 6, 2024 10:46 |
|
Mega Comrade posted:If this technology is going to take massive amount of jobs as you have claimed, then those fears are founded arnt they?

Fine, I'll ask you too. What are you basing these job losses on when the U.S. Bureau of Labor Statistics pegs the total number of artist jobs as likely to grow? https://www.bls.gov/ooh/arts-and-design/home.htm Is there some better place to look for job growth vs loss that I don't know about?

Mega Comrade posted:What is the alternative action? To champion UBI and protections for the working class? We do that already.

Try talking about antitrust and breaking up the monopolies. It's worked before, it can work again.

Mega Comrade posted:Open models won't exist either way in time. How the technology has to be scraped and trained won't work with the problems of model collapse and the worlds main sources for data (Reddit, stack overflow etc) already closing off their APIs.

Open models like LLaMa 2 already exist. They're only getting better. Model collapse is overstated, as the quality of connections matters more than the quantity. If you throw in everything and the kitchen sink, of course it's going to output garbage, like Stable Diffusion 1.4 level bad.

Edit: Beaten about LLaMa 2...
|
# ? Jan 6, 2024 10:54 |
|
SCheeseman posted:Anyone can have the "keys" to SDXL and LLaMA2 and those models are useful today.

They are useful now, but they already have limits. What they can do now can be stretched through various techniques, but eventually you need data and training, and that causes issues for future open models. At the moment you can use another model to train; it's been shown to be incredibly effective at a fraction of the cost, but I don't see a future where the big companies like OpenAI continue to allow that. They have already moved against the various startups reselling their API with custom functions on top, and I think they will move against researchers eventually too. They have made it clear they are a for-profit company and want to be the main game in town; allowing open models to flourish off their back isn't in their interest. Classic web scraping is possible, but it's far, far harder to do and pretty easily blocked. It's a possible path but a slow one with huge hurdles. The future of open models is on shaky ground.
|
# ? Jan 6, 2024 11:11 |
|
Mega Comrade posted:The future of open models is on shaky ground.

Yeah, it's always like this. We adapt and move on and shake right back. You're forgetting model merging, which was wildly successful with Stable Diffusion models and has already been started with local Large Language Models.

https://huggingface.co/TheBloke Find an interesting model quantized (shrunk) down to fit your hardware and download it.
https://github.com/LostRuins/koboldcpp/releases/tag/v1.54 Download a .exe, you don't even need to install anything, and load the model in it. Offload as much as possible to your GPU and run the rest on your CPU.
Connect your browser to localhost:5001. I recommend a temperature between .7 and 1.1. Talk.

That's it, no mess or fuss. That's all you need to do now, and bam, you have your own local model that, depending on your hardware, can rival or exceed ChatGPT 4. Today. Now. Stop acting like this is hard.
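Once koboldcpp is up and serving on localhost:5001, you can also script against it instead of typing into the browser UI. A minimal sketch, assuming the KoboldAI-style `/api/v1/generate` endpoint koboldcpp exposes; exact field names can vary between versions, so treat this as illustrative:

```python
import json
import urllib.request

# Build a request for koboldcpp's KoboldAI-style HTTP API. The endpoint
# path and payload fields here are my best understanding of that API and
# may differ by koboldcpp version.
def build_request(prompt, temperature=0.9, max_length=120):
    assert 0.7 <= temperature <= 1.1, "range suggested in the post above"
    payload = {
        "prompt": prompt,
        "temperature": temperature,
        "max_length": max_length,
    }
    return urllib.request.Request(
        "http://localhost:5001/api/v1/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Hello, local model.", temperature=0.8)
# urllib.request.urlopen(req) would return JSON containing the generated
# text, but only once koboldcpp is actually running on localhost:5001, so
# the network call is left out here.
```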
|
# ? Jan 6, 2024 11:20 |
|
KwegiboHB posted:
Lol stop acting like you understand this technology, every post you make shows you do not.
|
# ? Jan 6, 2024 11:32 |
|
Mega Comrade posted:Lol stop acting like you understand this technology, every post you make shows you do not. Mixtral 8x7B by French company Mistral AI. Any model mixed or merged with that as a base. Don't believe me, go look it up yourself. You should have no problem doing tons more with it than I can, go knock yourself out.
|
# ? Jan 6, 2024 11:52 |
|
Mistral claim Mixtral 8x7B matches many metrics of GPT3.5, not GPT4. Stop posting falsehoods. Stop strawmanning people's posts.
|
# ? Jan 6, 2024 12:07 |
|
Mega Comrade posted:Mistral claim Mixtral 8x7B matches many metrics of GPT3.5, not GPT4. The mixes and merges of Mixtral 8x7B, not the base model. I'm going to bed, have fun now.
|
# ? Jan 6, 2024 12:14 |
|
Hope you've all enjoyed the GPT4 powered pro- and anti-AI art agents I created to post in this thread. Some interesting behaviour for sure, very humanlike in places. But I'm shutting them down for now because I think they're just talking themselves into tighter and tighter circles.
|
# ? Jan 6, 2024 12:19 |
|
KwegiboHB posted:The mixes and merges of Mixtral 8x7B, not the base model. I'm going to bed, have fun now.

No mix or merge of Mixtral 8x7B surpasses GPT4. If it does, I'm sure you can provide some evidence. I'm happy to be corrected. But this all misses the point. I wasn't even arguing about running local models, I was talking about the future of open source models and where they will get their data and training from. Mistral has opened up its models, but it's not an open source non-profit, it's a private startup with huge backing. Releasing models this way is good for building hype but not for making profit, which is what it will inevitably want to do. Can an actual non-profit compete with these companies long term? I don't think it can, which means the future of open models is going to be determined by these for-profit companies, which historically hasn't worked out well for openness.

Mega Comrade fucked around with this message at 13:15 on Jan 6, 2024
# ? Jan 6, 2024 12:43 |
|
I thought it was pretty well established that the proliferation of AI art is what creates the risk of model collapse. They trained these things on a shitload of data, and it's rather up in the air whether they're going to be able to keep doing that without scraping the products of their own models.
|
# ? Jan 6, 2024 13:16 |
|
Not just images, but all data. It affects stuff like programming questions just as it does generated images. And as the internet gets flooded with more AI-generated information, the incorrect data it creates becomes compounded over time. To fight against this you have to more carefully filter the data going in, which is just going to become more difficult as we get further from the launch of ChatGPT. It's not an impossible problem, but it is a hard one. Pre-2022 stores of data will be like gold dust.

Mega Comrade fucked around with this message at 13:31 on Jan 6, 2024
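The compounding effect described above is easy to demonstrate with a toy experiment: fit a simple "model" (here just a Gaussian), sample from the fit, refit on those samples, and repeat with no fresh real data. This is a sketch of the mechanism only, not a claim about any production system:

```python
import random
import statistics

# Toy "generative model": a Gaussian refit each generation on samples drawn
# from the previous generation's fit, with no fresh real data mixed back in.
def collapse(generations=500, sample_size=50, seed=0):
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "real" data distribution
    for _ in range(generations):
        sample = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.fmean(sample)
        sigma = statistics.pstdev(sample)  # maximum-likelihood refit
    return sigma

final_sigma = collapse()
# The fitted spread shrinks generation after generation: rare values in the
# tails are undersampled, so each refit forgets a little more of them.
```

The same dynamic is the worry with web-scale scraping: once the training pool is mostly model output, the tails of human-written data stop being represented.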
# ? Jan 6, 2024 13:24 |
|
For a while I believed that everything would be an AI-generated mash, but as time has gone on I think those fears are overblown. Certainly 2023 and 2024 data will have quite a bit of AI trash swimming in it, but I think that more people who use AI to generate things are getting more cautious, double-checking stuff and being more selective about what is being output. Also, when it comes to images I think it's a bit more of a concern right now, but as tools become better, allowing more control, the images will at least start to become more artistic and more intentional rather than just bland visually appealing images worthy of a corporate slideshow. I also view it as a form of human-reinforced selective learning if they do learn on previous data, as (hopefully) the data will be the stuff that's chosen to be better, more correct, and higher quality. The way I see it, generative AI is going to bridge some gaps of accessibility in terms of skills and the effort required to use them. Skilled people will simply make better stuff. Though, yes, many jobs will probably be lost, unfortunately. I'm not too fond of that.
|
# ? Jan 6, 2024 21:27 |
|
Tarkus posted:Certainly 2023 and 2024 data will have quite a bit of AI trash swimming in it but I think that more people who use AI to generate things are getting more cautious, double checking stuff and being more selective of what is being output.

The problem isn't people who are involved in legitimate attempts at communicating with AI generated work, since they'll do the self-curation you mentioned. It's the people who are and will continue to be flooding the internet with non-curated garbage (e.g. this) to game systems intended for humans. Someone dumping woims articles on the internet doesn't give a drat about quality, they have other motivations; and 1) there's no end or limit to what those motivations might be, and 2) machines can spit out garbage far faster than we can effectively filter it out, especially if it's garbage that superficially appears legitimate.
|
# ? Jan 7, 2024 01:57 |
|
biznatchio posted:The problem isn't people who are involved in legitimate attempts at communicating with AI generated work, since they'll do the self-curation you mentioned. It's the people who are and will continue to be flooding the internet with non-curated garbage to game systems intended for humans.

To be fair though, the net has been full of algorithmically generated swill for decades now, be it junk web pages for SEO, or spam, or low-effort nonsense. Maybe the new crud will be harder to differentiate, but I'm not sure it's too much different overall. Obviously this might not apply to images or music though.
|
# ? Jan 7, 2024 02:16 |
|
The general open web has been crapflooded for a while, yes; but it's (for the most part) identifiable. In the past, you'd almost never find crapflood content at the top of search results; and if you did it stood out like a sore thumb (and if someone managed to game the system well enough to get crap to the top, it was easily quarantined later on by being so easily identifiable).

One of the key indicators indexers like Google use to tell legitimate sources apart from crapflooding sources is looking at the whole body of material available from a source and analyzing whether it matches what human-written content generally looks like. That's why SEO for a long time even involved hiring bad writers for dirt cheap to get them to pump out *anything* you could put on your website to make it look legitimate to algorithms trying to figure out if you're actually providing real value or just trying to trick people into coming to your site. If the crap is identifiable, it can be filtered out, and so it isn't a huge problem.

But it's only getting immensely easier to *not* be identifiable as crap, because you just push a button and articles that look human-written come spewing out of the machine faster than you can upload them, defeating analysis done by even the state of the art in AI detection. This is a problem that's only going to get worse, not better; and I think ultimately it won't just be a problem for training new generations of AI, it'll be a problem for *everything*, and will only serve to add more weight to the momentum of keeping people in proprietary silos like Discord and Facebook etc. in small enough groups for moderation to keep up; which will just perpetuate the problem of knowledge being less available overall because the gatekeepers of those silos wanting to
|
# ? Jan 7, 2024 02:56 |
|
Mega Comrade posted:No mix of merge of Mistral 8x7B surpasses gpt4.

I dug through my links collection, and all the ones claiming better than GPT4 did have graphs and testing methodologies... but also really small sample sizes, so whatever, I'll spare you the trouble. A better reference would be the Chatbot Arena or the LLM Leaderboard. https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard Mistral did say they were coming out with another better model later this year and I'll bring it up when they do.

Tarkus posted:Though, yes, many jobs will probably be lost unfortunately. I'm not too fond of that.

You're the fourth person in not even a single page to say this. I'll ask you too, what are you basing this idea on? I'm trying to see why the U.S. Bureau of Labor Statistics https://www.bls.gov/ooh/arts-and-design/home.htm is suggesting job GAINS instead, or if there's a better source of information I should be looking at.

Art Directors: 135,100 jobs (2022); 2022-32 outlook: 6% (faster than average)
Craft and Fine Artists: 54,600 jobs (2022); 2022-32 outlook: 4% (as fast as average)
Fashion Designers: 24,900 jobs (2022); 2022-32 outlook: 3% (as fast as average)
Floral Designers: 54,500 jobs (2022); 2022-32 outlook: -18% (decline)
Graphic Designers: 270,900 jobs (2022); 2022-32 outlook: 3% (as fast as average)
Industrial Designers: 32,400 jobs (2022); 2022-32 outlook: 2% (as fast as average)
Interior Designers: 94,900 jobs (2022); 2022-32 outlook: 4% (as fast as average)
Special Effects Artists and Animators: 89,300 jobs (2022); 2022-32 outlook: 8% (faster than average)

Not exactly exciting numbers there, but far from calamitous. Well, maybe the Floral Designers, but that sounds like quite the stretch to blame AI for...
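As a back-of-the-envelope check on the figures above, applying each outlook percentage to its 2022 headcount gives the net projected movement across the eight categories. This is a naive reading of the BLS numbers that ignores wage levels, churn, and job quality:

```python
# 2022 job counts and 2022-32 outlook percentages, copied from the BLS
# figures quoted in the post above.
outlook = {
    "Art Directors": (135_100, 0.06),
    "Craft and Fine Artists": (54_600, 0.04),
    "Fashion Designers": (24_900, 0.03),
    "Floral Designers": (54_500, -0.18),
    "Graphic Designers": (270_900, 0.03),
    "Industrial Designers": (32_400, 0.02),
    "Interior Designers": (94_900, 0.04),
    "Special Effects Artists and Animators": (89_300, 0.08),
}

# Net projected change: growth in most categories, offset substantially
# by the projected floral-design decline.
net_change = sum(round(jobs * pct) for jobs, pct in outlook.values())
```

The net comes out as a modest gain over the decade, which is consistent with "not exactly exciting, but far from calamitous"; it says nothing about wages or stability within those jobs, which is the counterpoint raised later in the thread.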
|
# ? Jan 7, 2024 03:07 |
|
KwegiboHB posted:I want to know why you think Generative AI is theft and Spotify isn't. Spotify pays people to use their work, which those people agree to when they put it on spotify.
|
# ? Jan 7, 2024 03:08 |
|
poop chute posted:Spotify pays people to use their work, which those people agree to when they put it on spotify. From the article I linked: https://5mag.net/i-o/spotify-serfs-millionaires-musicians/ quote:“Based on a sample of data concerning each October from 2014 to 2020,” they write, “the top 0.1% most popular tracks achieved more than 40% of all streams in all years and the top 0.4% of tracks accounted for more than 65% of all streams from 2016 onwards.” There's a lot of musicians getting nothing at all. Robbed.
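To put a rough number on that concentration: assuming payouts track streams, and using made-up round figures for the payout pool and catalogue size (only the 0.1%/40% shares come from the quoted study), the per-track gap looks like this:

```python
# Illustrative numbers: the stream shares come from the quoted study; the
# payout pool and catalogue size are hypothetical round figures.
total_payout = 1_000_000.0   # hypothetical payout pool, in dollars
total_tracks = 1_000_000     # hypothetical catalogue size

top_share, top_fraction = 0.40, 0.001  # top 0.1% of tracks take 40% of streams
rest_share = 1.0 - top_share

top_tracks = int(total_tracks * top_fraction)
rest_tracks = total_tracks - top_tracks

# Assuming payout is strictly proportional to streams:
per_top_track = total_payout * top_share / top_tracks
per_rest_track = total_payout * rest_share / rest_tracks

ratio = per_top_track / per_rest_track
# A track in the top 0.1% earns hundreds of times what the average track
# outside it earns, and "rest" is itself a long tail, so the median track
# does far worse than even that average.
```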
|
# ? Jan 7, 2024 03:19 |
|
And those artists still agreed to those terms when they put them on spotify, unlike generative AI. "I don't understand why it's okay when they have permission" is a truly strange argument to make.
|
# ? Jan 7, 2024 03:26 |
|
poop chute posted:And those artists still agreed to those terms when they put them on spotify, unlike generative AI. My argument is that it's NOT ok to let people starve. Anyone. Ever. That's the point of Universal Basic Income. It will free us to do whatever we want without screwing anyone.
|
# ? Jan 7, 2024 03:32 |
|
That's an entirely separate argument to make that isn't really in the purview of this thread. Spotify is very bad and exploitative, I agree, but it's not bad and exploitative for the same reasons or in the same way as generative AI. At a very fundamental level, one involves people giving permission for their work to be used, and the other does not.
|
# ? Jan 7, 2024 03:35 |
|
Google Books remains the most apt legal comparison imo
|
# ? Jan 7, 2024 03:39 |
|
poop chute posted:That's an entirely separate argument to make that isn't really in the purview of this thread.

No, it's entirely within the purview of this thread. Job displacement, how people make a living off their work, the concentration of wealth and power currently in place and where the future is likely to go: all of these are topics that have been affected by the latest changes in artificial intelligence, so talking about them is fair game. I'm against monopolies in general, so the fact that people are making suggestions to further increase and entrench current monopolies is something I'm arguing against. The Spotify Model is one of the worst of these methods, and its being hailed as a "solution" to AI is also something I'm arguing against, by providing facts that it doesn't "solve" anything unless you are in the top .1% of artists, and it screws everyone else. Also in the purview of this thread is the fact that you literally cannot get a ruling of Fair Use in copyright until you: use someone else's work without permission, get sued, and have a judge rule it's Fair Use. There is literally no other way.
|
# ? Jan 7, 2024 03:55 |
|
You're basically saying that in this hypothetical, actually this wouldn't matter so right now, in an entirely different context, they're the same thing with zero daylight between them, and I don't think that's really a fair mode to engage in.
|
# ? Jan 7, 2024 04:09 |
|
poop chute posted:You're basically saying that in this hypothetical, actually this wouldn't matter so right now, in an entirely different context, they're the same thing with zero daylight between them, and I don't think that's really a fair mode to engage in. Are you having a stroke?
|
# ? Jan 7, 2024 04:11 |
|
KwegiboHB posted:I keep hearing this and yet it looks like the total number of artists jobs are going up instead? I want to know what the basis of this claim is or a reason why the U.S. Bureau of Labor Statistics is wrong please. https://www.bls.gov/ooh/arts-and-design/home.htm

Ok. Look carefully at the tense I'm using in my post. We're telling you we're worried about what is likely going to happen, and you respond with "well I'm not seeing an effect NOW? so what could the problem really be?" This is a novel disruption still ultimately in its earlier forms. While there are really dark indicators flickering up here and there, most of us are responding in concern to the logical conclusion of what will happen to artists if we just normalize stealing from all artists now, as long as you do it with a generative AI project. Which is bad.

Small breather here, before we move on to the bullet points about "bad for artists? but, i am looking at line, and line going up??" ok, here goes:

1. I would be entirely unsurprised that any broad category of jobs is increasing in the US in the immediate present, because most of them are, even accounting for various periods of churn in separate industries.
2. This doesn't even touch on what substantive elements within that category are potentially expressing negative change, i.e.: general wage reductions, stability of work concerns, work variety constriction.
2a. See, if someone tells me "sales jobs are going up!" but the overall trend is that salaried sales positions are being decimated and people in sales are grim about their employment prospects, you can't counterargue with total job statistics.
2b. For instance, a situation in which the total number of job losses is covered or surpassed by increased staffing in lower to minimum wage telemarketing sales departments is not one where you can throw the total job numbers at me and say "well it sure looks like it's going just fine to me?"
3. The value of what a person can create as a visual artist or as a writer needs to be preserved by valid ownership of that artistic expression, full stop. Kind of a distinct point, but one that bears repeating.
|
# ? Jan 7, 2024 04:23 |
|
Kavros posted:Ok.

I'll just quickly state that I've read this and I love this discussion, so I will sit down and address all of these points. It will take some time though.
|
# ? Jan 7, 2024 04:36 |