Sourdough Sam
May 2, 2010

:dukedog:
https://twitter.com/billyperrigo/status/1615682180201447425?t=Of2xkkmurvbHJkKwUjKq_w&s=19

Here's AI empowering the commoners!


Moongrave
Jun 19, 2004

Finally Living Rent Free
openAI, the least open one of the lot, the one funded by elon musk, is bad????????

Sir Mat of Dickie
Jul 19, 2012

"There is no solitude greater than that of the samurai unless it be that of a tiger in the jungle... perhaps..."
I can't imagine it would be that hard for commercial AI platforms to put in a "copyrighted material" check when they output something, much the way YouTube et al. do it. "How similar is this to a copyrighted image in our training set?" surely can't be that hard to train a model on. That won't suffice if they decide you can't use copyrighted images in training, though (IMO, that would be completely absurd).
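A "does this output look like something in the training set?" check is basically near-duplicate image detection, which platforms usually do with perceptual hashing rather than training anything. A toy sketch of the idea only (average-hash over raw pixel grids; real systems like pHash use frequency transforms, and none of this is what any actual platform runs):

```python
def average_hash(pixels, size=8):
    # Downsample a grayscale image (list of rows of ints) to a size x size
    # grid, then emit one bit per cell: brighter than the grid mean or not.
    h, w = len(pixels), len(pixels[0])
    grid = [pixels[i * h // size][j * w // size]
            for i in range(size) for j in range(size)]
    mean = sum(grid) / len(grid)
    return [1 if p > mean else 0 for p in grid]

def hamming(a, b):
    # Bits that differ between two hashes; small distance = "looks similar".
    return sum(x != y for x, y in zip(a, b))

# A gradient image, a slightly brightened copy, and an unrelated flat image.
img = [[x + y for x in range(64)] for y in range(64)]
near_dupe = [[x + y + 1 for x in range(64)] for y in range(64)]
unrelated = [[50 for x in range(64)] for y in range(64)]

print(hamming(average_hash(img), average_hash(near_dupe)))  # 0: same hash
print(hamming(average_hash(img), average_hash(unrelated)))  # 28: far apart
```

In practice you'd hash every training image once, index the hashes, and flag any output whose distance to an indexed hash falls under a threshold, so there's no extra model to train at all.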

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Sir Mat of Dickie posted:

I can't imagine it would be that hard for commercial AI platforms to put in a "copyrighted material" check when they output something, much the way YouTube et al. do it. "How similar is this to a copyrighted image in our training set?" surely can't be that hard to train a model on. That won't suffice if they decide you can't use copyrighted images in training, though (IMO, that would be completely absurd).

then don't provide the training images, keep building on them, and say you destroyed them after training, so even you aren't sure what it's trained on.

Sedgr
Sep 16, 2007

Neat!

I too am shocked a corporation outsources AI tech work to the same places corporations outsource other tech work to.

They'll keep doing it until they have an AI good enough to not need the outsourcing anymore.

Same as always.

Fitzy Fitz
May 14, 2005





Not saying that this stuff isn't worth investigating and reporting, but this is the same as in literally every industry.

Doctor Zero
Sep 21, 2002

Would you like a jelly baby?
It's been in my pocket through 4 regenerations,
but it's still good.

deep dish peat moss posted:

Presumably only one of those people went through the copyright registration process for it. You don't have to register to own a copyright to your work, but if you don't and then someone else creates an identical work and registers it and you can't prove that it's derivative of yours, they get the copyright.

You don’t register Copyright. You’re thinking of Trademark. Anything you produce is automatically copyrighted the moment you produce it.

deep dish peat moss
Jul 27, 2006

You do in fact register copyright:
https://www.copyright.gov/registration/


You don't have to, but doing so provides extra protections and makes it easier to pursue claims and prove yourself as the original creator.

e: from much later: Essentially, if you do not register a copyright and it gets infringed, you must prove in court that you are the original creator and own the copyright, which you may or may not be able to do depending on the circumstances. If you do register and it then gets infringed, you are already on record as the owner of the copyright and you do not need to fight (as much of) a legal battle in court to prove yourself. You own the copyright either way, but registering it is like doing all of the court legwork ahead of time: it's insurance that if you take someone to court over copyright infringement, you will win unless they can conclusively prove that their work is not derivative. And if you and another artist create an identical work at the same time, only one of you can be the officially registered copyright holder, so if you can afford the registration fee it is best to register any copyright that you feel may be infringed.


I don't know if this is "the rule" or whatever but the precedent I've seen in reading about lots of copyright cases:
Unregistered copyright = onus is on copyright holder to prove ownership
Registered copyright = onus is on the infringing party to prove ownership.

(again, I am not a lawyer but I've been reading about this stuff since I was 13 because I got into editing megaman sprites back then)

deep dish peat moss fucked around with this message at 00:09 on Jan 19, 2023

Tunicate
May 15, 2012

Lol google tried building a big database of human-curated sentiment analysis of English text to train their robot on. They outsourced the tagging to lowest-bidder nonnative speakers, with the result that 70% of their human tagged data was wrong.

Sourdough Sam
May 2, 2010

:dukedog:

Fitzy Fitz posted:

Not saying that this stuff isn't worth investigating and reporting, but this is the same as in literally every industry.

The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again!

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Sourdough Sam posted:

The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again!

Yeah capitalism is a real bitch, what exactly are you trying to say that is new?

Moongrave
Jun 19, 2004

Finally Living Rent Free

KakerMix posted:

Yeah capitalism is a real bitch, what exactly are you trying to say that is new?

Art was supposed to be safe from the evil AI, so artists could keep mocking people in other jobs as they lost theirs to the machines

Sourdough Sam
May 2, 2010

:dukedog:
Posting will be the last true art, and it's not looking good for us either.

ThisIsJohnWayne
Feb 23, 2007
Ooo! Look at me! NO DON'T LOOK AT ME!



Sourdough Sam posted:

Posting will be the last true art, and it's not looking good for us either.

lol where did you go to posting school and who was your professor. You should ask them for your tuition back, clearly

deep dish peat moss
Jul 27, 2006

Sourdough Sam posted:

The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again!

Anyone with a true creative vision will still have a true creative vision that persists in the age of AI. There is always going to be a person driving the AI machine and how that person steers is always going to affect the output. There is always going to be a clear, discernible difference in the creative output of an auteur and your "average artist", whether it's an AI or a human pushing the pencil. If you think that some kind of "art AI" that creates a full-fledged, distinct, original, minimum viable consistent product on its own without trained human guidance is realistic within our lifetimes, you don't have much experience with AI image tools.

I understand that it sucks for the working-class artists who are not widely recognized as the authors of their own works and that it is going to affect their livelihoods, but pretending it's about artistic integrity is absurd. What you're frustrated with is the fact that the worker is a replaceable cog in a capitalist system, not some kind of siren song of artistic integrity. Artistic integrity isn't going anywhere - corporate technical artist jobs are what AI is taking away from us. And it's going to happen to everyone, not just artists.

I listened to the Greg Rutkowski podcast, and literally the only reason he has a problem with any of this is that "by greg rutkowski" is such a common prompt. He's concerned that e.g. images show up with his signature on them, which only happens because people put "by Greg Rutkowski" in the prompt, and a signature on an image is exactly that: a way of saying "By Greg Rutkowski". If he just talked to MJ and had them ban his name from prompts he would have no tangible complaint (ok ok, he would still have a tangible complaint against SD). But instead of taking the time to understand the technology and realize that the images have his signature because people are literally asking for it in the prompt (which tbf is not his fault, but it's because most people using these tools are still like 6 months behind on prompt-writing technique), he champions the disney IP lawyer anti-AI gofundme :(


e: Then again I have never had any interest in creating derivative works with AI and literally from the day I first started poking at it in 2019 I've been focused on creating distinct unique styles that can feel like they are "mine", and I get that the vast majority of its current audience are there specifically because they want to create derivative works, so :shrug: I just, as an artist, fundamentally cannot see the problem with tools that allow people to create derivative works, because just off the top of my head here are some other tools that allow people to create derivative works: pencils, pens, MSpaint, photoshop, mosaic tiles, crayons, markers, Google Image Search, laserjet printers, drawing with their fingers in the sand, collecting and repurposing old soda cans, spraypaint, etc.


Anyway as an aside here's some things I made today:


(made with text prompt alone, no artist names or existing media mentioned, etc)


(same, but image prompted it with my own art)

deep dish peat moss fucked around with this message at 00:47 on Jan 19, 2023

Prolonged Panorama
Dec 21, 2007
Holy hookrat Sally smoking crack in the alley!



The AIs could have just been trained on public domain, CC0, or licensed/commissioned images. That would be ethical, and there'd be no legal issues either. Nobody's work being used without consent, compensation, or credit. That's what artists are asking for. Those systems would still be hugely powerful and interesting.

I'm not sure what to make of people against the "ethical art AI" idea. It seems to boil down to "but those ethical AIs would be worse, I already have the version with millions (billions?) of pieces of work scraped against their creators' will, I want to keep benefiting from their work/time/labor/craft. I am entitled to it."

steckles
Jan 14, 2006

deep dish peat moss posted:

If he just talked to MJ and had them ban his name from prompts he would have no tangible complaint, but instead of taking the time to understand the technology and realize that the images have his signature because people are literally asking for his signature on the images in the prompt (which tbf is not his fault but it's because most people using these tools are still like 6 months behind on prompt-writing technique) he champions the disney IP lawyer anti-AI gofundme :(
That's not really how these things work. If you could define somebody's style precisely enough, which based on some of the ridiculously long and specific prompts I've seen seems plausible to me, the network might go ahead and throw the signature in there anyway. They still exist in the latent space and could be reached by other means. This seems like it'd be more of a problem for styles where there are few artists represented in the training set, not the generic digital painting style a lot of prompters seem to be aiming for, but there's no technical reason why any given prompt couldn't include a real artist's signature. Just banning a list of names from the prompts probably isn't going to solve anything.

deep dish peat moss posted:

because just off the top of my head here are some other tools that allow people to create derivative works: pencils, pens, MSpaint, photoshop, mosaic tiles, crayons, markers, Google Image Search, laserjet printers, drawing with their fingers in the sand, collecting and repurposing old soda cans, spraypaint, etc.
Well, isn't that what the core philosophical question is? The most reductive point of view would be that because the latent space is defined by the training set, and prompts can only ever pick points within that latent space, then all their output must be derivative. I don't know that I agree with that view, but it's what is being grappled with at the moment. It's interesting to look at the music industry and all of the court cases about plagiarism and what constitutes a derivative work there. It's no surprise why the companies out there trying to make Stable Diffusion for music are going to such lengths to ensure that their training sets are of unimpeachable provenance.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

deep dish peat moss posted:

Then again I have never had any interest in creating derivative works

Yeah, like, my whole deal with these tools is creating faux photos. I like seeing 'real' things: cars, machines, buildings. Lots of others want to make anime real, which is cool, but me specifically, I'm invoking 'feelings'; I don't invoke artists or specific works, just objects, angles, film types, color gamuts.

Prolonged Panorama posted:

The AIs could have just been trained on public domain, CC0, or licensed/commissioned images. That would be ethical, and there'd be no legal issues either. Nobody's work being used without consent, compensation, or credit. That's what artists are asking for. Those systems would still be hugely powerful and interesting.

I'm not sure what to make of people against the "ethical art AI" idea. It seems to boil down to "but those ethical AIs would be worse, I already have the version with millions (billions?) of pieces of work scraped against their creators' will, I want to keep benefiting from their work/time/labor/craft. I am entitled to it."

If I look at something, and take the time to draw my own thing based on the style of someone else, while never trying to recreate specific works or claiming my works are theirs, that's ok, right? Shameless, tasteless, sure, but legal and clearly A-OK with how the status quo currently is. If the robot does it, though, why is that different? This is where your argument breaks down for me, because you are assuming these AI models are somehow unethical, and I don't think they are, because of how they functionally work. You will never get a copy of someone's work out of these machines without overtraining a model on that specific work. Remember, "style" is not protected, just specific works.
I also know that lots of people complaining about these AI systems fundamentally have zero idea how they work, and most of the time go in the opposite direction and think they work in ways they don't actually. You go ahead and make an 'ethical' dataset and people will still be screaming about it because they do not understand how the systems work. It's a big whole angry twitter shitshow, that's all.

Finally, the text ones are far, far more terrifying, how come nobody is screaming about those?

Ruffian Price
Sep 17, 2016

All news worldwide has been screaming about ChatGPT for over a month now

Tunicate posted:

Lol google tried building a big database of human-curated sentiment analysis of English text to train their robot on. They outsourced the tagging to lowest-bidder nonnative speakers, with the result that 70% of their human tagged data was wrong.
Never have the resources to do it right, always have the resources to do it twice.

Moongrave
Jun 19, 2004

Finally Living Rent Free

Prolonged Panorama posted:

The AIs could have just been trained on public domain, CC0, or licensed/commissioned images.

it was

people need to read the ToS of the places that hosted their images for free for decades.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Ruffian Price posted:

All news worldwide has been screaming about ChatGPT for over a month now


Yeah but where is the PASSION

Here is a thing that I guess is unethical because it's obviously ____________'s work



I've actually been drawing (lol) more than I ever have, thanks to this drawing tablet. Being able to zoom in and regenerate sections is really good for stuff like the controller, which was generated on top of this image using the Krita plugin and inpainting; it lets you render 'full res' in smaller spaces. The whole image of the box was generated, then futzed with just a bit, then I masked off a section down there, MS-Painted a controller in real rough fashion, and generated on top of that little section. This works really well for faces as well.

EDIT
not happy with those vents on the side :argh:

Ornamental Dingbat
Feb 26, 2007

Tree Reformat posted:

We're ultimately grappling with the societal and legal implications of the CSI "Enhance Image" button being made real.

https://www.youtube.com/watch?v=ISDQg899qVY&t=724s

To think as a kid, I thought this was an absurd sequence that could never actually happen in real life.

Pixel phones and Google Photos already offer options to upscale photos and edit or remove people from them. The next step is inserting advertising: if you have a Coca-Cola can in the background of your photo, it'll be easy enough to ensure the logo is always visible. If you take a picture at a public event and share it online, all the billboards will show products that align with each of your relatives' AdSense interests; your niece will see ads for Roblox while your grandmother will see ads for Big Buck's Buttplug Bonanza Outlet

BrainDance
May 8, 2007

Disco all night long!

I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up.

Like, there's a way to give/take permission from crawlers, robots.txt, it doesn't have to be obeyed and it's not perfect but in this case it literally was.

But no one knows what robots.txt is (they should, if they're gonna post poo poo on the internet) and this gets a little too close to me going on a rant about how we should all start using gopher again and stuff.
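For anyone who's never actually looked at one: robots.txt is just a plain-text file of per-crawler allow/disallow rules, and Python's standard library can evaluate it. A minimal sketch; CCBot is Common Crawl's real user-agent name, but the rules below are invented for the example:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt that blocks Common Crawl's crawler ("CCBot")
# from an /art/ directory while allowing every other crawler.
rules = """\
User-agent: CCBot
Disallow: /art/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("CCBot", "https://example.com/art/piece.png"))        # False
print(rp.can_fetch("SomeOtherBot", "https://example.com/art/piece.png"))  # True
```

A site that self-hosted its art and served rules like these would have been skipped by Common Crawl; the whole argument is that almost nobody did.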

Moongrave
Jun 19, 2004

Finally Living Rent Free
capitalism is the problem, they're also all too scared to fight the closed AI companies because they're filled with scary lawyers and will instead go after stability because they gave it out for free to everyone instead of hoarding it like Real Capitalists

Prolonged Panorama
Dec 21, 2007
Holy hookrat Sally smoking crack in the alley!



KakerMix posted:

Yeah like my whole deal with these tools are creating faux photos, I like seeing 'real' things, cars, machines, buildings. Lots of others want to make anime real which is cool but with me specifically I'm invoking 'feelings', I don't invoke artists or specific works, just objects, angles, film types, color gamuts.

Sounds like you're at the lower end of this scale then:



I found myself dropping lower and lower the more I used midjourney, the more I heard from artists, and the more I learned about the training datasets.

KakerMix posted:

If I look at, and take the time to draw something, based on the style of someone else, while never trying to recreate specific works or claiming my works are theirs, that's ok, right? Shameless, tasteless, sure, but legal and clear A-OK with how the status quo currently is. If the robot does it though why is that different? This is where your argument breaks down with me because you are assuming these AI models are somehow unethical, where I don't think they are because of functionally how they work. You will never get a copy of someone's work out of these machines without overtraining a model on a specific work.

The statistical methods used to train the models aren't themselves unethical, I'm with you on that. (Though I'd argue a human doesn't learn the way these systems appear to, not even close.) The unethical part is the non-consensual training set. People's work has been incorporated into a commercial product without their knowledge, and (now that they know about it) against their wishes. With how these models work, you can't "opt out" once the training is complete. Once trained, the AIs don't learn, and they can't forget, either.

We can argue about how much any one training image impacts the future outputs of the model. But I don't see how you can make the argument that if the model were re-trained on an opt-in and public domain only data set, it would not behave differently. That difference in the function (and let's be honest, quality) of the tool is the impact copyrighted works have had on it. So even when you're not invoking specific artists, their work is shaping the output. They helped define the latent space you're exploring.

BARONS CYBER SKULL posted:

people need to read the ToS of the places that hosted their images for free for decades.

So if it's on pinterest it's public domain? This is news to me.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

BrainDance posted:

I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up.

Like, there's a way to give/take permission from crawlers, robots.txt, it doesn't have to be obeyed and it's not perfect but in this case it literally was.

But no one knows what robots.txt is (they should, if they're gonna post poo poo on the internet) and this gets a little too close to me going on a rant about how we should all start using gopher again and stuff.

robots.txt doesn't have anything to do with how something is licensed

Ornamental Dingbat
Feb 26, 2007

It'll be interesting to see media companies go apeshit when this stuff evolves to the point where someone can say something like "Gossip Girl vs Xenomorphs directed by Alfred Hitchcock" and get a watchable feature-length movie.

steckles
Jan 14, 2006

BrainDance posted:

I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up.
Not saying "I don't consent to having my site crawled" is not the same as saying "I consent to have the content of my site used for arbitrary purposes". Like, if a particularly prescient creator included text saying "The contents of this site can not be used to train a neural network" but still let their site be scraped, then including their work in a training set would be a clear violation of the license granted to the visitors of that site.

BrainDance
May 8, 2007

Disco all night long!

taqueso posted:

robots.txt doesn't have anything to do with how something is licensed

It doesn't, though there's a good chance that, in most western countries, no license would make anything Common Crawl, LAION, or SD did a violation of it. Otherwise every license would include "one weird trick: you can't use this for anything, ever" and no one could ever make anything disneyesque.

But it does give or take consent to crawlers, and was obeyed in this situation.

steckles posted:

if a particularly prescient creator included text saying "The contents of this site can not be used to train a neural network" but still let their site be scraped, then including their work in a training set would be a clear violation of the license granted to the visitors of that site.

Common Crawl identifies itself as ccbot, and I'm really skeptical over whether a specific "no neural networks" clause would be in any way enforceable, and not just because it isn't, but because it shouldn't be. In the same way "fair use does not apply to this content" wouldn't and shouldn't be enforceable.

BrainDance fucked around with this message at 03:20 on Jan 19, 2023

Humbug Scoolbus
Apr 25, 2008

The scarlet letter was her passport into regions where other women dared not tread. Shame, Despair, Solitude! These had been her teachers, stern and wild ones, and they had made her strong, but taught her much amiss.
Clapping Larry

Prolonged Panorama posted:

Sounds like you're at the lower end of this scale then:



I found myself dropping lower and lower the more I used midjourney, the more I heard from artists, and the more I learned about the training datasets.


I'm at a solid 2.5 on that one

Sedgr
Sep 16, 2007

Neat!

The model doesn't contain anyone's art. It doesn't have art in it at all. It outputs data that looks like art.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Prolonged Panorama posted:

Sounds like you're at the lower end of this scale then:



I found myself dropping lower and lower the more I used midjourney, the more I heard from artists, and the more I learned about the training datasets.

The statistical methods used to train the models aren't themselves unethical, I'm with you on that. (Though I'd argue a human doesn't learn the way these systems appear to, not even close.) The unethical part is the non-consensual training set. People's work has been incorporated into a commercial product without their knowledge, and (now that they know about it) against their wishes. With how these models work, you can't "opt out" once the training is complete. Once trained, the AIs don't learn, and they can't forget, either.

We can argue about how much any one training image impacts the future outputs of the model. But I don't see how you can make the argument that if the model were re-trained on an opt-in and public domain only data set, it would not behave differently. That difference in the function (and let's be honest, quality) of the tool is the impact copyrighted works have had on it. So even when you're not invoking specific artists, their work is shaping the output. They helped define the latent space you're exploring.

So if it's on pinterest it's public domain? This is news to me.


I'm not going to claim I am "an artist", even though I sure spend money on art-related things like one, went to an art college for art things, made lots of things that could be considered art, and, not that it matters, have lots of real professional artist friends. They can have that title. That chart, by existing at all, pits "artists" against "not artists", because it assumes there is a negative to using AI generation at all, and I flat out disagree with that assessment. I wasn't stating "I don't even use artists so I'm one of the good ones"; I was stating that how I use the tool goes against what the Gregs or Sarahs would claim.
I do not think that an artist whose image was used in the training data has the right to ask my permission before I use these AI tools, in the same way that I can look at their images and use my own hands to try to learn to draw in their style without asking their permission. Specifically with Stable Diffusion, it's over; it isn't going back in, and people move so, so fast. There are models that people make for fun, for free, on literally anything you can dream of, and they do it as jokes: when that anti-AI image swept through twitter a couple weeks back, someone made a model that just generated versions of it. You could completely kill Stable Diffusion, dissolve the company, and nothing at all would change. It's out, it's open source, it's in people's hands, and they know how to make things like this. It is way, way too late.

I don't think people were claiming that their works are in the public domain, just that it really strengthens the tech companies' argument that they didn't break any rules or laws, since they were the ones hosting and holding the files these models were trained on. The ToSes say the files can be used for whatever the company wants, including training. Go check out the Adobe license agreement people were freaking out about recently; turns out you gave them permission to use your anonymized data for training purposes, ~whoops~.

Essentially, the art isn't in the files, the product. I cannot extract anyone's art from anything at all in the Stable Diffusion files, no matter how many 2-8 GB models I download, no matter how many latent diffusion things I shove in; I have to generate images. They aren't there until the black box goes boop and spits something out. This is why I don't think these artists have a leg to stand on: their art was looked at by a machine's "eyes", but it wasn't ingested.


Sedgr posted:

The model doesn't contain anyone's art. It doesn't have art in it at all. It outputs data that looks like art.

Ironic, since if what these AIs spit out isn't art, then how can they replace the human artists that make real art? :v:

steckles
Jan 14, 2006

BrainDance posted:

I'm really skeptical over whether a specific "no neural networks" clause would be in any way enforceable, and not just because it isn't, but because it shouldn't be.
That hinges entirely on whether or not training a neural network for commercial purposes counts as fair use, and I don't think that's clear cut at all. Training a totally open-source network, licensed in such a way as to preclude using its output for commercial purposes, could be fair use. When someone's work goes into training something that a for-profit company is hoping to monetize, the question of whether or not to respect the artist's wishes should be answered.

One of the reasons these AIs have been released so widely is that there isn't a big, litigious industry group, like the RIAA or MPAA, claiming to represent millions of online artists, who are interested in exploring the limits of fair use through expensive legal action. I'm not saying the RIAA or MPAA are at all noble or good entities, or that we need something like that for artists, but the lack thereof has been an enabling factor in the forgiveness-not-permission attitude that the companies behind these AIs have taken.

steckles fucked around with this message at 04:05 on Jan 19, 2023

lunar detritus
May 6, 2009


Prolonged Panorama posted:

Sounds like you're at the lower end of this scale then:




"trains the model with living artists" is probably like 12 in that scale but at least I barely post my results. I just find it so interesting to download a couple of images, throw them into the mix and ta-dah, magic happens.

ymgve
Jan 2, 2004


:dukedog:
Offensive Clock

Cabbages and Kings posted:

Why is getty suing stability for internet scraping that LAION did, in some cases years ago?

One obvious answer is "stability has money and LAION doesn't by design".

I'm not convinced anyone bringing these lawsuits understands the technology at all, but I am super sure that no juror ever will, so I look forward to some really harebrained verdicts that might have disastrous side effects; meanwhile Hasbro is absolutely going to be replacing commercial artists with AI-using commercial artists left and right.

As more and more models emerge, good loving luck proving that thing X was ever even included in "whatever model was used to generate this image".

We're so comically bad at litigating new technology. On the one hand, the way the chemical industry code works is that the FDA has to prove something is toxic, after it's already out in the world causing cancer, to stand a prayer of getting it banned, in which case it will immediately be replaced by something else that's just as bad or worse (but 10 years short of having a track record showing that). On the other hand, we'll just throw sand in the gears of completely inevitable poo poo like large sets of metadata from which inferences can be drawn.

Someone ELI5 what LAION does that google crawling and caching doesn't

LAION is a dataset of URLs and textual descriptions of the images found at those URLs. It does not contain the actual pictures. So if someone is training an AI with the LAION dataset, they have to scrape all the images themselves and feed them into the AI. This means the people training the AI are closer to potential copyright infringement than the people who made the LAION dataset originally.
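To make the distinction concrete, here's a minimal sketch of what a LAION-style record looks like and what a trainer actually has to do with it. Field names and URLs are illustrative, not the exact column names in the released files:

```python
# A LAION-style dataset row holds only a pointer and a caption — no pixels.
# (Illustrative field names; the real release ships as parquet files.)
laion_rows = [
    {"url": "https://example.com/cat.jpg", "caption": "a cat sitting on a windowsill"},
    {"url": "https://example.com/dog.png", "caption": "a dog playing fetch"},
]

def rows_to_download_jobs(rows):
    """Turn metadata rows into (url, caption) fetch jobs.

    Training on LAION means performing this fetch yourself: the dataset
    supplies the pointer and the text pair, and the trainer's own scraper
    makes the HTTP request for the actual image bytes.
    """
    return [(row["url"], row["caption"]) for row in rows]

jobs = rows_to_download_jobs(laion_rows)
```

So the party that ends up holding copies of the images is whoever runs the download step, not whoever compiled the list — which is the point being made about who sits closer to any infringement.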

Sedgr
Sep 16, 2007

Neat!

KakerMix posted:

Ironic, since if what these AIs spit out isn't art, then how can they replace the human artists that make real art? :v:

I think it's art. The AI manipulates the data, and the person wielding the text prompt is the impetus. If the end result breaks copyright, it's the human that is to blame.

IShallRiseAgain
Sep 12, 2008

Well ain't that precious?

steckles posted:

That hinges entirely upon whether or not training a neural network for commercial purposes counts as fair use and I don't think that's clear cut at all. Training a totally open source network that was licensed in such a way to preclude using its output for commercial purposes could be fair use. When someone's work goes into training something that a for-profit company is hoping to monetize, the question of whether or not to respect the artist's desires should be answered.

One of the reasons these AIs have been released so widely is that there isn't a big, litigious industry group, like the RIAA or MPAA, claiming to represent millions of online artists and interested in exploring the limits of fair use through expensive legal action. I'm not saying the RIAA or MPAA are at all noble or good entities, or that we need something like that for artists, but the lack thereof has been an enabling factor in the forgiveness-not-permission attitude that the companies behind these AIs have taken.

The reality is that, no matter what, the artists aren't winning this fight. Even if they by some miracle manage to completely purge Midjourney/DALL-E/Stable Diffusion from the internet, it's not going to stop Disney or some other corporation from training on all of the copyrighted images they have the rights to and creating their own AI. The only thing they can destroy is the general public's easy access to AI, and realistically they are just going to slow it down.

Also, process is not copyrightable. The only thing that matters is the end result as far as fair use is concerned.

BrainDance
May 8, 2007

Disco all night long!

steckles posted:

One of the reasons these AIs have been released so widely is that there isn't a big, litigious industry group, like the RIAA or MPAA, claiming to represent millions of online artists, who are interested in exploring the limits of fair use through expensive legal action.

About 15 years ago mashups went from being a niche genre to huge when Girl Talk blew up. Even with the RIAA, he avoided getting sued, even though his work was built entirely off sampling; every part of every track on Feed The Animals was built from other tracks, this many tracks



A physical CD got released, for profit, and he did shows. He didn't get sued, from what I remember. He said in interviews it was all because it was fair use. Part of the reason why was that sampling had been tested in court and was common, but there's definitely a "this is different" vibe to mashups, and I remember there being some legal issues with people not wanting to host his stuff, but that was more out of fear that it would turn into a legal thing, if I remember right. The RIAA couldn't get Girl Talk, even being the RIAA. And mashups are far more directly derived from their source material than anything an AI does.

I guess AI will have to be tested in court, too. But, barring some big change in copyright law, I am optimistic it would turn out in AI's favor. And even elsewhere, I don't know of a single country where "this clause supersedes local law" is enforceable in a contract.

BrainDance fucked around with this message at 04:32 on Jan 19, 2023

Moongrave
Jun 19, 2004

Finally Living Rent Free
It will come down to tech conglomerates vs. Disney, and no matter what, the outcome will be worse than if everyone had just looked away and let it quietly happen


Sedgr
Sep 16, 2007

Neat!

Thread needs more art. Cool art.




