KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

She will not make it in all likelihood.

You are exhausting, did you know that?


KillHour
Oct 28, 2007


StratGoatCom posted:

You do realize one of the things that can be done is making you cough up your drafts, right? They're not stupid.

The US Copyright Office is not going to audit the cover of your young adult novel. They do not care.

StratGoatCom posted:

She will not make it in all likelihood.

Something something toxx clause something.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

You do realize one of the things that can be done is making you cough up your drafts, right? They're not stupid.

Or require DRM with a chain of trust, something Adobe would absolutely loving love.

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

StratGoatCom posted:

You do realize one of the things that can be done is making you cough up your drafts, right? They're not stupid.

This is one of those proposals that would end terribly for independent artists, so thank God that's not how this works.

Anyway, care to answer my questions now?

BrainDance
May 8, 2007

Disco all night long!

KillHour posted:

Something something toxx clause something.

I'm not confident she'll get it either, because from what I've seen it could really go both ways and I don't think anyone has any real good support for it going either way.

I really think that's the only sane expectation to have for this. No one knows.

But that's really different from just saying "what I want to happen will happen because I said so."

SCheeseman
Apr 23, 2003

It's a shame so many artists have ended up getting redpilled by the copyright lobby, regurgitating their corporate propaganda, playing their game and positioning the "tech bro" or big tech in general as the enemy when, christ, they're the same loving people at the top. The only losers are the public, who will have a means of production taken from them and locked away behind a paywall, with a copyright system strengthened to protect the rights of IP holders, which aren't necessarily (and often aren't) artists. What kind of fool wants there to be a requirement to show a chain of work? That's a loving nightmare scenario!

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SCheeseman posted:

It's a shame so many artists have ended up getting redpilled by the copyright lobby, regurgitating their corporate propaganda, playing their game and positioning the "tech bro" or big tech in general as the enemy when, christ, they're the same loving people at the top. The only losers are the public, who will have a means of production taken from them and locked away behind a paywall, with a copyright system strengthened to protect the rights of IP holders, which aren't necessarily (and often aren't) artists. What kind of fool wants there to be a requirement to show a chain of work? That's a loving nightmare scenario!

No, we've just realized that the copyleft and other anti-copyright idiots were peddling bullshit pseudolaw that isn't worth the electrons to render; the AI freaks are odious enough that they've made the copyright mafia a lesser evil in months. This isn't strengthening, this is just enforcement of the laws on the books, and we recognize an attempt to take away any recourse for theft of our livelihoods for what it is.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

No, we've just realized that the copyleft and other anti-copyright idiots were peddling bullshit pseudolaw that isn't worth the electrons to render; you AI weirdos are odious enough that you've made the copyright mafia a lesser evil in months. This isn't strengthening, this is just enforcement of the laws on the books, and we recognize an attempt to take away any recourse for theft of our livelihoods for what it is.

You're going to get hosed by the copyright lobby, just like you always were. It's just now you're begging for it.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

I want you to know that every time you type something like this or "techbros" or whatever immature insult, I'm going to go out of my way to write up another user guide, workflow, or personally hand-hold someone's install of Stable Diffusion, all to make it easier for someone to start.
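
For anyone curious what generation settings like the ones in my sig actually translate to, here's a rough sketch using the diffusers library (illustrative only: the local checkpoint path is made up, and the model would need to be converted to diffusers format first):

code:
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

# Hypothetical local path; assumes the checkpoint is in diffusers format.
pipe = StableDiffusionPipeline.from_pretrained(
    "./seekArtMEGA_mega20-diffusers", torch_dtype=torch.float16
).to("cuda")

# "DPM++ 2M Karras" in webui terms maps to the multistep DPM-Solver scheduler
# with Karras sigmas enabled.
pipe.scheduler = DPMSolverMultistepScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)

generator = torch.Generator("cuda").manual_seed(520244594)
image = pipe(
    prompt="nonconformist art brut",
    negative_prompt="amenable, compliant, docile, law-abiding, lawful, legal, "
                    "legitimate, obedient, orderly, submissive, tractable",
    num_inference_steps=32,
    guidance_scale=11,
    width=512,
    height=512,
    generator=generator,
).images[0]
image.save("out.png")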

(USER WAS PUT ON PROBATION FOR THIS POST)

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.
Techbros are being real dumb about AI though. Sounds like they moved on from crypto to evangelizing Steroid Clippy

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SCheeseman posted:

You're going to get hosed by the copyright lobby, just like you always were. It's just now you're begging for it.

It's possible to avoid getting hosed by them by staying out from underfoot, as it always was. The AIbros hunt us and want to destroy us; given the choice of two demons, I will choose the one that doesn't seek me out specifically to end me and that I can work to get money out of.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

It's possible to avoid getting hosed by them by staying out from underfoot, as it always was. The AIbros hunt us and want to destroy us; given the choice of two demons, I will choose the one that doesn't seek me out specifically to end me and that I can work to get money out of.

lol, if I could move my arm again I'd be making money hand over 'fist' offering up finger touch-ups on AI work.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

It's possible to avoid getting hosed by them by staying out from underfoot, as it always was. The AIbros hunt us and want to destroy us; given the choice of two demons, I will choose the one that doesn't seek me out specifically to end me and that I can work to get money out of.

The only reason why big corps don't sue individuals is because it's difficult and even in spite of that, they tried for a while. By strengthening DRM in art creation pipelines (through chain of trust systems and watermarking) it becomes a lot easier to find and prosecute creators.

You're giving them everything, exclusive use of the tools that you claim will destroy you, the means to track and litigate. You're making them stronger.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SCheeseman posted:

The only reason why big corps don't sue individuals is because it's difficult and even in spite of that, they tried for a while. By strengthening DRM in art creation pipelines (through chain of trust systems and watermarking) it becomes a lot easier to find and prosecute creators.

You're giving them everything, exclusive use of the tools that you claim will destroy you, the means to track and litigate. You're making them stronger.



Oh please, you always have to be clear in chain of title, you dork. The AI weirdos made this problem, not us. And yes, we will watermark, yes we will do things to choke your models, because you did not ask permission for usage and theft invites retaliation.
https://twitter.com/ceeoreo_/status/1660674844302749698

StratGoatCom fucked around with this message at 05:15 on May 27, 2023

SCheeseman
Apr 23, 2003

StratGoatCom posted:

Oh please, you always have to be clear in chain of title, you dork.
https://twitter.com/ceeoreo_/status/1660674844302749698

Most smaller creators (i.e. the ones dodging the boot) have no chain-of-title systems in place and few mechanisms to verify that works are authored end-to-end by the creators, outside of undo history. This is a solvable problem, with a DRM system that uses cryptography to track any changes. Like some kind of blockchain thing? Oh no!

As for the non-sequitur twitter post, I'm pretty sure Adobe has already demonstrated bitmap-to-vector conversions using Firefly? Generators aren't inherently limited to bitmaps. What is this even supposed to prove?

e: Watermarks in the traditional sense aren't a problem for AI models; if they're identifiable (and they have to be) they are removable. The problem is the potential for cryptographic watermarking to become a requirement for works to be copyrightable in the first place, as a means of DRM, to ensure works didn't come from, or weren't touched by, an unauthorized AI generation tool.
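
To make the chain-of-trust idea above concrete, here's a minimal sketch (purely illustrative, not any real vendor's scheme) of a cryptographically tracked edit history: each saved state commits to the previous one, so the log can be verified end-to-end but can't be quietly rewritten after the fact.

code:
import hashlib
import json
import time

def _hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def append_edit(log: list, image_bytes: bytes, note: str) -> list:
    # Each entry records the previous entry's hash, chaining the history together.
    prev = log[-1]["entry_hash"] if log else "genesis"
    entry = {
        "prev": prev,
        "image_hash": _hash(image_bytes),
        "note": note,
        "timestamp": time.time(),
    }
    entry["entry_hash"] = _hash(json.dumps(entry, sort_keys=True).encode())
    return log + [entry]

def verify(log: list) -> bool:
    # Walk the chain and recompute every hash; any tampering breaks a link.
    prev = "genesis"
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        if body["prev"] != prev:
            return False
        if _hash(json.dumps(body, sort_keys=True).encode()) != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True

log = []
log = append_edit(log, b"rough sketch layer", "initial sketch")
log = append_edit(log, b"inked layer", "inking pass")
print(verify(log))  # True; altering any recorded field makes this False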

SCheeseman fucked around with this message at 05:43 on May 27, 2023

BrainDance
May 8, 2007

Disco all night long!

This is so weird and dramatic, the spooky AI guys are out to get you, the tech bros are hunting you down, they thirst for the poor proletariat artist blood.

In reality though, it's just a bunch of nerds experimenting with a new medium, and in the end the introduction of a new medium is always a good thing. There are companies trying to be lovely and make money with it and lock it down, like everything ever, and then there are the open source guys who wanna get LLMs running on a toaster who are starting to blow everybody away.

A lot of these very poorly thought out plans to restrict AI would absolutely destroy traditional artists along with AI, because things have consequences you can figure out if you think about it for 5 minutes. But they also end up absolutely not stopping OpenAI or any other corporate model, while probably stopping the open source projects working to give everybody access to their own AI they can train.

And then AI actually does become a bad thing when only large corporations and governments have real access to it.

StratGoatCom posted:

And yes, we will watermark, yes we will do things to choke your models, because you did not ask permission for usage and theft invites retaliation.

All that poo poo doesn't work lol. It all gets broken super fast. There's this one guy I see online who keeps posting his "methods", which only work because he doesn't really know how to train a model to test them.

I wouldn't train a model on anyone's art who doesn't want that to happen. But still, congratulations you made your art slightly worse for no benefit.

BrainDance fucked around with this message at 06:04 on May 27, 2023

Tree Reformat
Apr 2, 2022

by Fluffdaddy
The music and video game industries have been trying to build unbreakable DRM systems for decades at this point. Not only have they always inevitably failed, the ones that endured best tended also to be the ones that pissed legitimate customers off the most.

Most likely for artists, this is the end of the social-media-birthed "share your literal entire life online for the entire world to see and consume!" culture (good) and probably a full embrace of the old-school patronage systems that Patreon started.

the products of which will likely just end up cracked on torrents anyway, like they are now

Hashy
Nov 20, 2005

KwegiboHB posted:

I want you to know that every time you type something like this or "techbros" or whatever immature insult, I'm going to go out of my way to write up another user guide, workflow, or personally hand-hold someone's install of Stable Diffusion, all to make it easier for someone to start.

this is so loving epic. kudos to you good sir! art losers should have learned computers like us

PenguinKnight
Apr 6, 2009

so, how is an artist who never wants to touch AI programs supposed to live, assuming we will never have anything like UBI implemented (in America, at least)? What is an artist supposed to do if their art is swept up in whatever gets used to train the models, and now anyone can freely take a style that took decades to perfect? I'm lost and honestly as a small time artist struggling to get anything out, I'm feeling pretty kicked while I'm down.

SCheeseman
Apr 23, 2003

Tree Reformat posted:

The music and video game industries have been trying to build unbreakable DRM systems for decades at this point. Not only have they always inevitably failed, the ones that endured best tended also to be the ones that pissed legitimate customers off the most.

Most likely for artists, this is the end of the social-media-birthed "share your literal entire life online for the entire world to see and consume!" culture (good) and probably a full embrace of the old-school patronage systems that Patreon started.

the products of which will likely just end up cracked on torrents anyway, like they are now

Xbox Series consoles remain unbroken, so I don't think this remains as true as it was. Other cert systems like SSL are still mostly trustworthy. DRM for encrypting data for the purposes of protection is often broken, but DRM models that are used to establish a chain of trust still work.

SCheeseman fucked around with this message at 07:22 on May 27, 2023

KillHour
Oct 28, 2007


PenguinKnight posted:

so, how is an artist who never wants to touch AI programs supposed to live, assuming we will never have anything like UBI implemented (in America, at least)? What is an artist supposed to do if their art is swept up in whatever gets used to train the models, and now anyone can freely take a style that took decades to perfect? I'm lost and honestly as a small time artist struggling to get anything out, I'm feeling pretty kicked while I'm down.

Replace "artist" with "coal miner" and you will have the answer to both what will happen and why trying to blindly hold on to the way things are now is doomed to fail and will ultimately be destructive in its own ways.

Here's an honest question though: Are you a digital artist? Because part of being a digital artist is always needing to adapt to new tech. Not wanting to use any tools that use AI at all is like not wanting to use layers or the undo button. Nobody is going to hold a gun to your head and make you use them, but it really seems arbitrary when a huge aspect of digital art is the computer assistance.

KillHour fucked around with this message at 07:01 on May 27, 2023

reignonyourparade
Nov 15, 2012

PenguinKnight posted:

so, how is an artist who never wants to touch AI programs supposed to live, assuming we will never have anything like UBI implemented (in America, at least)? What is an artist supposed to do if their art is swept up in whatever gets used to train the models, and now anyone can freely take a style that took decades to perfect? I'm lost and honestly as a small time artist struggling to get anything out, I'm feeling pretty kicked while I'm down.

An important thing to note here is that "art getting swept up in training the model" is ultimately completely tangential. There are already models out there where all the training data was fully licensed. Maximalist restrictions on the copyright decisions here still won't actually put any brakes on this train.

For the other question, well, the answer is ultimately not too different to the same answer for an artist who never wanted to touch any of these digital art programs and continue working in nothing but watercolors/oils or something. Luck into getting popular enough that you still make it even with the more involved process for each work. This particular problem is not exactly a NEW problem even if the tech creating it is new.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

Hashy posted:

this is so loving epic. kudos to you good sir!

Thanks. :)

PenguinKnight posted:

so, how is an artist who never wants to touch AI programs supposed to live, assuming we will never have anything like UBI implemented (in America, at least)? What is an artist supposed to do if their art is swept up in whatever gets used to train the models, and now anyone can freely take a style that took decades to perfect? I'm lost and honestly as a small time artist struggling to get anything out, I'm feeling pretty kicked while I'm down.

If you want to go completely non-digital I'd have to know more about your situation to be able to offer any sort of advice. You're probably looking for some kind of patron like the artists of old. National Endowment for the Arts grant? It's never been easy before. Hell, I almost worked myself to death with 120-hour work weeks before I broke down.
If you're ok with digital and just don't want to use AI, there's an incredible demand for fixing up bad AI artwork that people made and fell in love with; charge people to fix their bad hand pics. There's a huge sudden increase in people who would love to learn how to actually draw but feel the same as you and don't think they can even start; personal lessons are one way in. I've heard tutorials are popular to make and sell? I don't know about patreon and can't offer any help there, just that it is also an option.

That said, if you give up on thinking UBI is possible to achieve before you even try, well then you're right. It'll be a fight that's for sure, but here's what it'd look like.
Tax the ever loving poo poo out of the rich, and then tax them some more. Bring back the 90% marginal rate, as a start, apply it to capital gains, go after off-shore havens, outlaw stock buybacks. Tax per AI image generated or line written in commercial use. Call it 'The Promise Of Automation'. We could already have done $2,000 a month UBI since the start of covid, still can, backdated for missed payments.
If you think this isn't possible, why the hell do you think sane copyright laws can be passed?

As for depression, it's a good thing us artists have always been known for our stability.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

SCheeseman posted:

It's a shame so many artists have ended up getting redpilled by the copyright lobby, regurgitating their corporate propaganda, playing their game and positioning the "tech bro" or big tech in general as the enemy when, christ, they're the same loving people at the top. The only losers are the public, who will have a means of production taken from them and locked away behind a paywall, with a copyright system strengthened to protect the rights of IP holders, which aren't necessarily (and often aren't) artists. What kind of fool wants there to be a requirement to show a chain of work? That's a loving nightmare scenario!

I could change about 5 words in this and make it a pro crypto spiel.

SCheeseman
Apr 23, 2003

Mega Comrade posted:

I could change about 5 words in this and make it a pro crypto spiel.

Sure, by changing the context of it being a bad thing into a good thing. Some real insight you got there.

BrainDance
May 8, 2007

Disco all night long!

Mega Comrade posted:

I could change about 5 words in this and make it a pro crypto spiel.

Yeah but it's not.

You could change a few words and make it a Linux thing, or make it an "all the powerful people are reptiles" thing. But it's not arguing for the same conclusion, so it's not really the same argument.


Is there anything from it specifically you disagree with?

Reveilled
Apr 19, 2007

Take up your rifles

reignonyourparade posted:

An important thing to note here is that "art getting swept up in training the model" is ultimately completely tangential. There are already models out there where all the training data was fully licensed. Maximalist restrictions on the copyright decisions here still won't actually put any brakes on this train.

For the other question, well, the answer is ultimately not too different to the same answer for an artist who never wanted to touch any of these digital art programs and continue working in nothing but watercolors/oils or something. Luck into getting popular enough that you still make it even with the more involved process for each work. This particular problem is not exactly a NEW problem even if the tech creating it is new.

Is it tangential though? It seems like there’s polemicists on both sides of the issue talking about some nebulous dystopian future where everyone is under the corporate thumb of either Disney or Google, and in between there’s artists who have the very current and real objection to the use of their copyrighted art in the most widespread and popular models.

Models which don’t use copyrighted data exist, but merely asserting a solution exists and actually implementing that solution are very different things. Reassuring artists that their own work won’t put them out of a job, and reassuring AI developers that there’s a way to build their models which won’t fall foul of some future regulation might go a long way to bridging the divide and divert people on both sides away from extreme positions.

reignonyourparade
Nov 15, 2012

Reveilled posted:

Is it tangential though? It seems like there’s polemicists on both sides of the issue talking about some nebulous dystopian future where everyone is under the corporate thumb of either Disney or Google, and in between there’s artists who have the very current and real objection to the use of their copyrighted art in the most widespread and popular models.

Models which don’t use copyrighted data exist, but merely asserting a solution exists and actually implementing that solution are very different things. Reassuring artists that their own work won’t put them out of a job, and reassuring AI developers that there’s a way to build their models which won’t fall foul of some future regulation might go a long way to bridging the divide and divert people on both sides away from extreme positions.

It is tangential in the sense that if people are going to be put out of jobs, they are going to be put out of jobs regardless of whether their own work is being used or not. The "models are using copyright art" and "artists may be put out of jobs" problems are functionally completely divorced from each other and addressing one doesn't do anything to address the other. That's what I mean when I say it's tangential.

Reveilled
Apr 19, 2007

Take up your rifles

reignonyourparade posted:

It is tangential in the sense that if people are going to be put out of jobs, they are going to be put out of jobs regardless of whether their own work is being used or not. The "models are using copyright art" and "artists may be put out of jobs" problems are functionally completely divorced from each other and addressing one doesn't do anything to address the other. That's what I mean when I say it's tangential.

But is the best way to deal with the fact that people are conflating the two to just assert over and over that they’re unrelated, or would it be to actually fix the one we can fix so that it’s not an issue any more?

Gentleman Baller
Oct 13, 2013

Reveilled posted:

Models which don’t use copyrighted data exist, but merely asserting a solution exists and actually implementing that solution are very different things. Reassuring artists that their own work won’t put them out of a job, and reassuring AI developers that there’s a way to build their models which won’t fall foul of some future regulation might go a long way to bridging the divide and divert people on both sides away from extreme positions.

I think my problem is that imo you're actually describing the more extreme position. Artists put out of work anyway, but only companies like Adobe and Disney will have access to the very models theoretically good enough to end their careers. Why would I want to encourage people towards that nightmare solution over one that obviously stings more, but allows more normal humans access to these theoretical future amazing art tools?

Reveilled
Apr 19, 2007

Take up your rifles

Gentleman Baller posted:

I think my problem is that imo you're actually describing the more extreme position. Artists put out of work anyway, but only companies like Adobe and Disney will have access to the very models theoretically good enough to end their careers. Why would I want to encourage people towards that nightmare solution over one that obviously stings more, but allows more normal humans access to these theoretical future amazing art tools?

Is it the case then that these models trained on non-copyrighted content are uniformly worse than the ones trained on the copyrighted content? If that’s so it does seem to imply that the copyrighted content does provide direct commercial benefit to the models which use them, in which case it seems very reasonable to at least discuss whether these models should pay to license them.

Gentleman Baller
Oct 13, 2013

Reveilled posted:

Is it the case then that these models trained on non-copyrighted content are uniformly worse than the ones trained on the copyrighted content? If that’s so it does seem to imply that the copyrighted content does provide direct commercial benefit to the models which use them, in which case it seems very reasonable to at least discuss whether these models should pay to license them.

Not uniformly, but from playing around with it when I could, Adobe's non-"fair use" AI has many things that it does very noticeably worse than openAI based models. It did a lot better than I expected though, but to be clear, this was Adobe using their massive pile of stock images. This isn't a model that you or I could train in the world you're asking for.

The thing is, once these big companies pay starving artists to create a plethora of images to plug their shortcomings, that is gone. A modest investment (to them) to decimate their ongoing costs. And their competition won't even be the widespread, openAI based models, that anyone currently could download and use, as those would now violate copyright.

Reveilled
Apr 19, 2007

Take up your rifles

Gentleman Baller posted:

Not uniformly, but from playing around with it when I could, Adobe's non-"fair use" AI has many things that it does very noticeably worse than openAI based models. It did a lot better than I expected though, but to be clear, this was Adobe using their massive pile of stock images. This isn't a model that you or I could train in the world you're asking for.

The thing is, once these big companies pay starving artists to create a plethora of images to plug their shortcomings, that is gone. A modest investment (to them) to decimate their ongoing costs. And their competition won't even be the widespread, openAI based models, that anyone currently could download and use, as those would now violate copyright.

Isn’t openAI a company with billionaire investors? Why could adobe pay but they can’t?

Gentleman Baller
Oct 13, 2013

Reveilled posted:

Isn’t openAI a company with billionaire investors? Why could adobe pay but they can’t?

Adobe paid for and owned those images well before the new AI stuff came out as part of their stock images collection, and is a company with a market cap of 190 billion dollars. I have no idea if openAI could pay or not, but if they had to pay for it I'm sure the model wouldn't be available to people like you and me.

reignonyourparade
Nov 15, 2012

Reveilled posted:

But is the best way to deal with the fact that people are conflating the two to just assert over and over that they’re unrelated, or would it be to actually fix the one we can fix so that it’s not an issue any more?

Well, that's also the one where you've got the greatest argument going on about whether it is, in fact, an issue. Someone who doesn't think it's an issue does not, in fact, WANT to "fix" it.

Tei
Feb 19, 2011

Tree Reformat posted:

People have been concerned about TTS systems displacing audiobooks for a while now, AI-enhanced TTS models just kind of accelerates that trend.

This is one of the biggest bones of contention, and probably what the current court cases are going to hinge around the ultimate answer to. People against AI assert that both the scraping of copyrighted material to collect the training data in the first place without the explicit approval of the copyright holders constitutes copyright infringement in and of itself (which would mean every single webcrawler for search engines is copyright infringing), and that the models are themselves full of copyrighted material. Effectively, they assert AI researchers have developed the most efficient (if extremely lossy and resource intensive) data compression method in human history (several billion images vs about 4 gb model file).

If your poo poo uses my work and makes derivative work from it, and I can prove it, the problem is not that your servers hold a copy of my work, but that your work is derivative of mine. You need me to license you to make derivative stuff from my work, or it is illegal*


*illegal for everyone except VC money and Silicon Valley barons, who can exfiltrate with impunity because the upper side of our society sniffs neoliberalism glue
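
As a side note, the "most efficient data compression method in human history" framing in the quoted post is easy to sanity-check with rough numbers. A quick back-of-the-envelope sketch, using only the illustrative figures from the quote ("several billion" images versus a ~4 GB model file):

code:
# Rough sanity check of the "compression" framing: how many bytes of model
# weights exist per training image? Figures are illustrative, taken from the
# quote above ("several billion images vs about 4 gb model file").
model_bytes = 4 * 1024**3        # ~4 GB checkpoint
training_images = 2_000_000_000  # "several billion", taken here as ~2e9

bytes_per_image = model_bytes / training_images
print(f"{bytes_per_image:.2f} bytes of weights per training image")
# ~2 bytes per image: far too little to store the images themselves, which is
# exactly why "is it compression or is it learning?" is the disputed question.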

Reveilled
Apr 19, 2007

Take up your rifles

Gentleman Baller posted:

Adobe paid for and owned those images well before the new AI stuff came out as part of their stock images collection, and is a company with a market cap of 190 billion dollars. I have no idea if openAI could pay or not, but if they had to pay for it I'm sure the model wouldn't be available to people like you and me.

Fair enough.

That doesn’t mean there are no other solutions, though. Right now it seems the only options being offered are this one, or banning AI image generation (either literally or in effect through some mechanism that makes them unusable for most purposes), or just telling artists who are going to lose their jobs “yeah, you will”. And if the only option AI advocates are willing to put forward is the last one, is there any reason for artists not to line up behind larger copyright holders and do everything in their power to spite you and bring you down with them?

I mean, look at the solution the EU is proposing, is that the nightmare scenario? If so, how should we prevent that solution becoming the one adopted worldwide? Telling artists to just deal with it doesn’t seem to have brought them onside.

Reveilled
Apr 19, 2007

Take up your rifles

reignonyourparade posted:

Well, that's also the one where you've got the greatest argument going on about whether it is, in fact, an issue. Someone who doesn't think it's an issue does not, in fact, WANT to "fix" it.

If that’s so, it seems even more important to focus on, given that it's the question that's most likely to be legislated and litigated on in the near future!

Gentleman Baller
Oct 13, 2013

Reveilled posted:

Fair enough.

That doesn’t mean there are no other solutions, though. Right now it seems the only options being offered are this one, or banning AI image generation (either literally or in effect through some mechanism that makes them unusable for most purposes), or just telling artists who are going to lose their jobs “yeah, you will”. And if the only option AI advocates are willing to put forward is the last one, is there any reason for artists not to line up behind larger copyright holders and do everything in their power to spite you and bring you down with them?

I mean, look at the solution the EU is proposing, is that the nightmare scenario? If so, how should we prevent that solution becoming the one adopted worldwide? Telling artists to just deal with it doesn’t seem to have brought them onside.

Sure. I'm absolutely in favour of other solutions and deeply hope people smarter than me think them up. I, of course, don't want artists to feel so abandoned they're compelled by spite to hurt future artists who would like to compete with companies using this potential AI so powerful it ends most current artists' jobs.

I don't know enough about training a new model from scratch to say whether the proposed EU law is a nightmare scenario. I suspect it isn't, but it also doesn't actually seem like a solution. Keeping a list of copyrighted images doesn't seem like it would save a single artist's job, and it would still come down to whether it's fair use and anyone can train an AI, or it isn't and only big companies can do it.


reignonyourparade
Nov 15, 2012

Reveilled posted:

If that’s so, it seems even more important to focus on, given that it's the question that's most likely to be legislated and litigated on in the near future!

To be honest I don't think "focusing on it" will actually accomplish much meaningful. The courts will decide what the courts decide and the above-board models will work with whatever the new rules are, the ones that were just jumping around :filez: websites in the first place will continue to do that, and the big corporate ones will probably toss a bunch of money at brute forcing enough content under the new legal standard to overcome any quality consequences. None of that will make any difference to whether anyone loses jobs.
