Liquid Communism
Mar 9, 2004

коммунизм хранится в яичках ("communism is stored in the testicles")
https://twitter.com/robertkneschke/status/1662034837618786304

Gee, I wonder why they'd be concerned about liability if forced to disclose their training data. :v:


Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly

Bar Ran Dun posted:

Sorry to keep posting in successive posts, but historically the origins of these expert systems, and of bureaucracy itself, are colonialism. Not that it's like colonialism: the administration of colonies is practically how it's done. The other place it starts is vessels. These systems are how you have a vessel on the other side of the world being operated in the standardized manner Capital wants it operated in.

These are tools created as systems for colonialism and capitalism.

I love how every couple of pages a Communist (?) checks in to inform us that, in fact, everything is somehow rooted in capital- and/or colonialism. :lol:

Like guys did you know that before Colonies and Capital people didn't care about making robust and repeatable decisions :agesilaus:. Code of Hammurabi be damned. This is what is so amazing/scary about Communism, the way it insists on being THE lens for seeing everything in the world. Or at least that's how it is often treated.

LionArcher
Mar 29, 2010


So I'm curious about something, and reading the last dozen pages makes it clear a few of you understand the current batch of tools (I guess we can call them AI) on a more technical level than I do. I'm just a layman, a creative who sees an upside in using these tools for my business (indie author).

I see a lot of yelling on Twitter about how Stable Diffusion and all of the LLMs are based on copyrighted work and that it's "stealing" and theft to use them.

However, here are my use cases. I have three.

One: the AIs do a far better job of reading aloud, in solid voices, work I've copied and pasted in. In other words, it's great for listening back to a chapter and picking up on sentences and tone I don't like. Same goes for editing: ChatGPT in private mode has a solid editing function that can highlight grammar issues Grammarly misses. Great for cleaning up drafts.

I don't see any ethical issues with using AIs for these functions, despite everyone on Twitter yelling about how this makes for lazy writers who don't want to write.

Use case two (this is the one I have the most questions about): making covers with Stable Diffusion. It's easy to use prompts, it's cheap, and it lets me make covers that look the way I want quickly. The real question is the training here. It's trained on other artwork, but I was never going to hire most of those cover artists anyway, especially considering how slow they've been at turning around work, or how often they don't give me what I want.

My understanding is that it's trained on the data but doesn't retain those images? And if I'm downloading models from Civitai whose builders are fine with folks using them for commercial work, and I'm also taking the output into Photoshop, tweaking it, and putting a real model's face on the body/image (using stock photos I've legally purchased), I don't see how this is "stealing" from those artists rather than me heavily modifying it and making my own art. I'm also not putting people's names in the prompts.

Use case three. We're getting close to having language models like ChatGPT that run locally and generate text. The online ones (Sudowrite, NovelAI, ChatGPT) are currently better at this, but the ethics of those are far more iffy. However, my understanding is we're not far off from being able to download a basic one that's trained to be okay at producing a few hundred words at a time from a clear prompt, and then train it on your own work. Having it generate a few hundred words based on my own writing to speed up writing, say, erotica is again something I see no ethical issues with.

Is all of that about right? In terms of my thinking with how this stuff works?

Bar Ran Dun
Jan 22, 2006




Pvt. Parts posted:

I love how every couple of pages a Communist (?) checks in to inform us that, in fact, everything is somehow rooted in capital- and/or colonialism. :lol:

Like guys did you know that before Colonies and Capital people didn't care about making robust and repeatable decisions :agesilaus:. Code of Hammurabi be damned. This is what is so amazing/scary about Communism, the way it insists on being THE lens for seeing everything in the world. Or at least that's how it is often treated.

I am a religious socialist, not a communist. :colbert: My analysis here is also from a systems-and-cybernetics perspective. That's the actual historical origin of these types of systems, particularly the SMS (safety management system) expert systems on vessels. Those have their historical origin in the lead-up to the creation of the SOLAS treaty, and their current structure is governed by that treaty. The vessel usage of these systems spread first to airlines and later to medical systems and businesses in general.

I know a great deal about the history because I have done work evaluating the function of these SMS systems on vessels in the field on behalf of flag governments.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


LionArcher posted:

So I'm curious about something, and reading the last dozen pages makes it clear a few of you understand the current batch of tools (I guess we can call them AI) on a more technically level than I do. I'm just a layman, who is a creative and sees an upside in using these tools for my business (indie author).

However, here's my use cases. I have three.

The issue with all of these is that Stability, to provide this, has commercially displaced the humans who made the training material.

Tree Reformat
Apr 2, 2022

by Fluffdaddy

LionArcher posted:

So I'm curious about something, and reading the last dozen pages makes it clear a few of you understand the current batch of tools (I guess we can call them AI) on a more technically level than I do. I'm just a layman, who is a creative and sees an upside in using these tools for my business (indie author).

I see a lot of yelling on Twitter about how Stable Diffusion and all of the LLM'S are based on copyrighted work and it's "stealing" and theft to use them.

However, here's my use cases. I have three.

One, the AI's do a far better job of reading out loud works I've copy and pasted in solid voices. In other words, it's great at listening back to a chapter and picking up on sentences and tone I don't like. Same goes for editing, Chat GPT in private mode has a solid editing function, that can highlight grammar issues that grammerly misses. Great for cleaning up drafts.

I don't see any ethical issues with using AI's for these functions, despite everyone on twitter yelling about how this makes for lazy writers who don't want to write.

People have been concerned about TTS systems displacing audiobooks for a while now; AI-enhanced TTS models just kind of accelerate that trend.

quote:

Use Case two. (This is the one I have the most questions about.) Making covers with Stable diffusion. It's easier to use prompts, cheap, and allows me to make covers that look the way I want quickly. The issue question is the training here. It's trained on other art work, but I was never going to hire a lot of those cover artists anyways, especially considering how slow they've been at turning around work or not giving me what I want.

My understanding is that it's trained on the data, but then not retaining those images? and if I'm downloading models to base the AI on from Civitai and those model builders are fine with folks using the models for commercial work, and then I"m also taking it into photoshop and tweaking throwing a real model's face on the body/image (using stock photos I've legally purchased) I don't see how this is "Stealing" from those artists and not me heavily modifying making my own art. I'm also not putting in people's names in the prompts.

This is one of the biggest bones of contention, and probably what the current court cases are going to hinge on. People against AI assert both that the scraping of copyrighted material to collect the training data in the first place, without the explicit approval of the copyright holders, constitutes copyright infringement in and of itself (which would mean every single webcrawler for search engines is infringing), and that the models themselves are full of copyrighted material. Effectively, they assert AI researchers have developed the most efficient (if extremely lossy and resource-intensive) data compression method in human history (several billion images vs. an ~4 GB model file).

Those more ambivalent state that the models only contain statistical weights that, while tuned from the training data, don't actually retain that information. Who is correct here will, again, likely be up to the courts and lawmakers in the end.
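The "most efficient compression method in history" framing is easy to sanity-check with rough arithmetic. A minimal sketch, using the ballpark figures cited above (~5 billion training images, ~4 GB checkpoint — approximations, not exact numbers):

```python
# Back-of-envelope check of the "model as lossy compression" claim.
# Assumed ballpark figures: ~5 billion training images (LAION-5B scale)
# and a ~4 GB Stable Diffusion checkpoint.
model_bytes = 4 * 1024**3          # ~4 GiB model file
num_images = 5_000_000_000         # ~LAION-5B image count

bytes_per_image = model_bytes / num_images
print(f"~{bytes_per_image:.2f} bytes per image")  # ~0.86 bytes per image
```

Under one byte per image is far below what any conventional codec needs for a recognizable photo, which is why the "weights are compressed copies" claim is contested: whatever the weights store, it cannot be the full training set in general, though researchers have reported extracting near-copies of some heavily duplicated training images.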

quote:

Use Case three. We're getting close to having language models like ChatGPT run locally that can generate text. The online ones like Sudowrite/ Novelai chat GPT are currently better at this, but the ethics of those are far more iffy. However, my understanding is we're not far off from downloading a basic one that's trained to be okay at producing a few hundred words at a time based on a clear prompt, and then you can train it on your own work. Having it generate a few hundred words based on my own writing to speed up writing say "erotica" is again something I see no ethics issues with.

Is all of that about right? In terms of my thinking with how this stuff works?

We're there now. If you can run Stable Diffusion locally, you can download and run KoboldAI on your machine as well. It's less powerful (the models are trained on a couple of orders of magnitude less data than GPT-3/4), but it's absolutely a thing you can do right now if you want.
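For the curious, the "train it on your own work" loop can be sketched in miniature with nothing but the standard library. This toy word-level Markov chain is nowhere near an LLM (no neural network at all, and the corpus here is a made-up one-liner), but it shows the shape of fitting a generator to your own corpus and sampling from it:

```python
import random
from collections import defaultdict

# Toy stand-in for "train a generator on your own writing": a word-level
# Markov chain. Real local models (KoboldAI etc.) are far more capable,
# but the fit-on-corpus-then-sample workflow has the same outline.
def train(corpus: str) -> dict:
    """Map each word to the list of words observed after it."""
    words = corpus.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def generate(table: dict, start: str, n: int, seed: int = 0) -> str:
    """Walk the chain from `start`, sampling up to n continuation words."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        nxt = table.get(out[-1])
        if not nxt:  # dead end: no observed successor
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

table = train("the ship sailed and the ship sank and the crew swam")
print(generate(table, "the", 5))
```

A real local model swaps the lookup table for a trained network, but the interface is the same: feed in a corpus, then sample a continuation from a prompt.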

StratGoatCom
Aug 6, 2019



Tree Reformat posted:

People have been concerned about TTS systems displacing audiobooks for a while now, AI-enhanced TTS models just kind of accelerates that trend.

This is one of the biggest bones of contention, and probably what the current court cases are going to hinge around the ultimate answer to. People against AI assert that both the scraping of copyrighted material to collect the training data in the first place without the explicit approval of the copyright holders constitutes copyright infringement in and of itself (which would mean every single webcrawler for search engines is copyright infringing), and that the models are themselves full of copyrighted material. Effectively, they assert AI researchers have developed the most efficient (if extremely lossy and resource intensive) data compression method in human history (several billion images vs about 4 gb model file).

Those more ambivalent state the models only contain statistical weights that, while tuned from the training data, don't actually retain such information. Who is correct here, will, again, likely be up to the courts and lawmakers in the end.

We're there now. If you can run Stable Diffusion locally, you can download and run KoboldAI on your machine as well. Now, it's less powerful (the models are trained on about a couple orders of magnitude less than what GPT-3/4 are), but its absolutely a thing you can do right now if you want.

:wrongful:

So many errors.

The problem is commercial displacement. Scraping by a search engine is kosher because it is for directing sales to the owner. The training is not, because you used people's stuff for a machine to displace those very people.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.

Liquid Communism posted:

https://twitter.com/robertkneschke/status/1662034837618786304

Gee, I wonder why they'd be concerned about liability if forced to disclose their training data. :v:

Yeah I love that "OpenAI" is not open at all, that's some solid capitalism right there.

KillHour
Oct 28, 2007


Reducing the ethics down to "anything that displaces workers is unethical" makes pretty much any technology unethical. Computers themselves are named after the job they completely eliminated.

StratGoatCom
Aug 6, 2019



KillHour posted:

Reducing the ethics down to "anything that displaces workers is unethical" makes pretty much any technology unethical. Computers themselves are named after the job they completely eliminated.

It's fair competition for IP, not mechanization. If you allow that excuse, you don't have an IP system.

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

StratGoatCom posted:

It's fair competition for IP, not mechanization. If you allow that excuse, you don't have a IP system.

What are you trying to say here? I don't understand.

Bar Ran Dun
Jan 22, 2006




KillHour posted:

Reducing the ethics down to "anything that displaces workers is unethical" makes pretty much any technology unethical. Computers themselves are named after the job they completely eliminated.

Expert systems are unique because they represent the removal of a human element from a system by increased systemization of the whole system.

Look at it this way. There are things where removing the human element is a mostly straightforward good, e.g. rote manufacturing-line tasks.

The removal of humanity from the whole system is a different thing. It's like the optimization of rent-price setting by algorithms. That seems to have arrived at jacking up prices after the first year in all cases, which is pretty bad for society. We encounter this removal as a lessening of flexibility and a reduction in individuals' ability to use judgement.

Recently I had to take my child into an urgent care on the weekend for pink eye. They gave us a gel and refused to deviate from their standard-of-care system when we requested drops instead. My son is a young child with special needs. The gel didn't work and was a huge struggle all weekend. But we were informed they could not deviate in any way from their standards of care, codified in their company's expert system.

On Monday we made an appointment at our pediatrician, who prescribed drops and advised that the pediatric standard of care was drops, not the gel.

The removal of the human element can create situations like this. This is an essential characteristic of systematizing. All systems are models and all models fall short of reality, and they work improperly when that happens.

As we remove humans from more complex systems we need to be very careful not to remove humanity from those complex systems. If we do so we create the Luddite reaction or, worse, existential revolt. Historically, systematic social and economic reforms have caused political reactions when they fail to retain humanity in the changes made to systems.

It very much needs to be considered when automating complex things. Recently an eating-disorder hotline switched to a fully automated chat system and fired all the human therapists (because they unionized).

Such inhuman choices are extremely stupid and the potential for making bad choices like that in an unknown way needs to be kept in mind with any serious automation or systematizing.

StratGoatCom
Aug 6, 2019



GlyphGryph posted:

What are you trying to say here? I don't understand.

I am saying that if you let someone shovel everything and everything into a machine commercially as fair use, you may as well not have copyright because if you can shove it into a machine like that and get away with it, there's no protection against infringement.

LionArcher
Mar 29, 2010


Bar Ran Dun posted:

Expert systems are unique because they represent the removal of a human element from a system by increased systemization of the whole system.


I don't disagree with any of this, but it's why I'm frustrated as an "artist" that there's so much backlash to using these tools as an artist. If I can give customers more products that are better, at a faster speed, it's rewarding to me both financially and artistically. These tools reduce friction for me. It's still me doing it all. I'm guiding the cover, the edit, the actual writing, and so on. It just allows me to do it faster.

My issue when folks say “if you use these tools you’re lazy and not a real artist” is to go, no, I have a ton of ideas and these tools allow me to get to more of them and produce more polished works.

As long as those tools are not just copy pasting the art/ plagiarizing fan fiction from websites.

StratGoatCom
Aug 6, 2019



LionArcher posted:

I don’t disagree with any of this, but it’s why I’m frustrated as an “artist” that there’s so much backlash to using these tools as an artist. If I can give customers more products that are better at a faster speed, it’s both rewarding to me finically, and artistically. These tools reduce friction for me. It’s still me doing it all. Im guiding the cover, the edit, the actual writing, and so on. It just allows me to do it faster.

My issue when folks say “if you use these tools you’re lazy and not a real artist” is to go, no, I have a ton of ideas and these tools allow me to get to more of them and produce more polished works.

As long as those tools are not just copy pasting the art/ plagiarizing fan fiction from websites.

The issue is that you are stealing from other artists.

Bar Ran Dun
Jan 22, 2006




“Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different. The good poet welds his theft into a whole of feeling which is unique, utterly different from that from which it was torn; the bad poet throws it into something which has no cohesion. A good poet will usually borrow from authors remote in time, or alien in language, or diverse in interest.”

T.S. Eliot

SubG
Aug 19, 2004

It's a hard world for little things.

Bar Ran Dun posted:

These are tools created as systems for colonialism and capitalism.
Is this intended to be an argument, or are you just using it as a rhetorical zinger? That is, is having been created by an exploitative system in and of itself an argument against a technology, or an argument for its restriction or banning?

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

The issue is that you are stealing from other artists.

That's right, I stole all of the art. All of it. It's all mine now, there's no more art left for anyone else.

Bar Ran Dun
Jan 22, 2006




An example of “utterly different from that from which it was torn”:

https://youtu.be/JGBXnw86Mgc

https://youtu.be/AIOAlaACuv4

StratGoatCom
Aug 6, 2019



KwegiboHB posted:

That's right, I stole all of the art. All of it. It's all mine now, there's no more art left for anyone else.
It is called 'displacing competition', and the Warhol decision has made it more important again, or is likely to.

KwegiboHB
Feb 2, 2004


StratGoatCom posted:

It is called 'displacing competition', and Warhol has made it more important again, or is likely to.

What is the difference between 'displacing competition' and just 'competition'?

StratGoatCom
Aug 6, 2019



KwegiboHB posted:

What is the difference between 'displacing competition' and just 'competition'?

Using other folks' stuff, which, as Stable Diffusion was trained on LAION, it provably does.

KwegiboHB
Feb 2, 2004


StratGoatCom posted:

Using other folk's stuff, which, as Diffusion was trained on LAION, it provably does.

So transformative fair use just doesn't exist?

KillHour
Oct 28, 2007


Bar Ran Dun posted:

Such inhuman choices are extremely stupid and the potential for making bad choices like that in an unknown way needs to be kept in mind with any serious automation or systematizing.

I'm merely responding to StratGoatCom's assertion that a thing is de facto bad if it displaces labor. You're reading way too much into my reply.

StratGoatCom posted:

The issue with all of these is that Stability, to provide this, has commercially displaced the humans who made the training material.

I'm pointing out that this alone is a poor measure. I think you would be hard pressed to argue that maintaining jobs for people to hand-solve math equations was more important than everything computers have done for us in the last century. But when this was happening, all computers could do was replace those jobs; they didn't originally provide anything new, just a more efficient means of doing what we could already do, and with less flexibility. From that point in time, an argument against computers à la StratGoatCom's would have seemed appealing.

I'm not coming from the other absolute position where everything should be automated or made into an algorithm. I'm saying that the argument of "automating jobs away = bad" is far too simplistic.

StratGoatCom posted:

It's fair competition for IP, not mechanization. If you allow that excuse, you don't have a IP system.

What makes an art job any more meaningful than a labor job? These systems don't impact your ability to make art, just your ability to monetize it. Saying you have a right to assert a monopoly on art via a specific skill is like saying you have the right to assert a monopoly on making screws via the skill of being a machinist instead of having a robot do it. You can make that argument, but I reject that you have more of a claim to being able to monetize your skills than a laborer just because you categorize it as art.

Bar Ran Dun
Jan 22, 2006




SubG posted:

Is this intended to be an argument

It's not intended to be an argument. That statement is the same thing as my description of the pneumatic control valves and feedback relating to first- and second-order controls and systems. That (colonialism and ships) is the actual place the line of modern thinking about expert systems starts.

You should remember I don't consider tools to be infused with the meaning of what they were created for. I do not agree with Audre Lorde's thesis about the master's tools. But I do think we should be informed by, and should consider, history and the abuse of intellectual tools when we think about how to use them!

StratGoatCom
Aug 6, 2019



KwegiboHB posted:

So transformative fair use just doesn't exist?

No, but it has to be weighed against the interest of the owner. By that metric, Stability fails spectacularly.

quote:

What makes an art job any more meaningful than a labor job?


Because it pays for skill in art you can use elsewhere, and is better for economic development than a flat machine because it makes more IP. And if a machine can ingest whatever and be okay, then if you want someone's IP, you feed it into the machine, and then you can derive from it however you like with the owner having no recourse. Completely nonviable where ownership matters.

StratGoatCom fucked around with this message at 22:39 on May 26, 2023

Serotoning
Sep 14, 2010


StratGoatCom posted:

I am saying that if you let someone shovel everything and everything into a machine commercially as fair use, you may as well not have copyright because if you can shove it into a machine like that and get away with it, there's no protection against infringement.

StratGoatCom posted:

The issue is that you are stealing from other artists.

I mean, this really is the heart of the matter right here: to what extent is throwing art into a big mixer and letting it spit out new art "fair use" vs. some kind of infringement? I don't think it's as cut and dried as you portray here, and I think most people would agree with that, because we have 23 pages of actual discussion, not just poster after poster going "yeah that's stealing and it's bad" as you seem intent on doing. Art is inherently iterative, so you have a long, long history of people being heavily inspired by other works when creating their own, and most people are comfortable recognizing the artistic merit of derivative works while at the same time respecting that they have lineage that exists outside of just their author's mind. Assuming that all an AI model is doing is iterating (sampling, rearranging, and recomposing) existing works into new ones, where is the infringement? Is it because it is not a human doing it themselves by their own effort? An AI is just a tool, and a tool cannot infringe on anyone's rights; only a person can. It can therefore be treated as an extension of a human's efforts (just as a gun cannot take someone's life; only a person using a gun can).

I'm no artist; the closest thing I have is board game design, which I have only dabbled in. But to me, if someone were to chop my board game into a million little pieces and then reconstruct them into new games, I don't think my first instinct would be feeling ripped off. It seems like a kind of lack of artistic self-confidence to fear that a machine will replace my ability to speak meaningfully to other humans through my product. Maybe if those derivative games were to start cutting into my ability to market my own games my tune would change, but again, existing IP laws *should* catch anything that comes too close.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


Pvt. Parts posted:

I mean this really is the heart of the matter right here: to what extent is throwing art into a big mixer and letting it spit out new art "fair use" vs. some kind of infringement? I don't think it's as cut and dried as you portray here, and I think most people would agree with that, because we have 23 pages of actual discussion, not just poster after poster going "yeah that's stealing and it's bad" as you seem intent on doing. Art is inherently iterative, so you have a long, long history of people being heavily inspired by other works when creating their own, and most people are comfortable recognizing the artistic merit of derivative works while at the same time respecting that they have lineage that exists outside of just their authors' minds. Assuming that all an AI model is doing is iterating on (sampling, rearranging, and recomposing) existing works into new ones, where is the infringement? Is it because it is not a human doing it themselves by their own effort? An AI is just a tool, and a tool cannot infringe on anyone's rights; only a person can, and it can therefore just be treated as the extension of a human's efforts (just as a gun cannot take someone's life; only a person using a gun can).

I'm no artist; the closest thing I have is board game design, which I've only dabbled in. But to me, if someone were to chop my board game into a million little pieces and then reconstruct them into new games, I don't think my first instinct would be feeling ripped off. It seems like a kind of lack of artistic self-confidence to fear that a machine will replace my ability to speak meaningfully to other humans through my product. Maybe if those derivative games were to start cutting into my ability to market my own games my tune would change, but again, existing IP laws *should* catch anything that comes too close.

It is pretty much so: all rights reserved, no commercial use without permission outside of incidental things, which this isn't.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.

Pvt. Parts posted:

I'm no artist; the closest thing I have is board game design, which I've only dabbled in. But to me, if someone were to chop my board game into a million little pieces and then reconstruct them into new games, I don't think my first instinct would be feeling ripped off. It seems like a kind of lack of artistic self-confidence to fear that a machine will replace my ability to speak meaningfully to other humans through my product. Maybe if those derivative games were to start cutting into my ability to market my own games my tune would change, but again, existing IP laws *should* catch anything that comes too close.

So what if a company took your core gameplay, and ruleset, and spit out essentially that gameplay and ruleset but themed it for Harry Potter or Marvel, and made money off of it?

And it acknowledges it copied your game.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

No, but it has to be weighed against the interest of the owner. By that metric, Stability fails spectacularly.

Because it pays for skill in art you can use elsewhere, and is better for econ dev than a flat machine because it makes more IP. And, if a machine can ingest whatever and be okay, then if you want an IP, you feed it into it, and then you can derive from it however you like with the owner having no recourse. Completely nonviable where ownership matters.

I think I see now. The part you're missing is that if Stability.ai were to close up shop today, absolutely nothing would change. The tools they released open-source cannot be taken back. Those tools include the ability to make a model that txt2img runs on. It's a shame I can't finish building my next computer until next month, but I fully plan on starting a diffusion model from scratch using only public domain images. What argument are you going to have against this?
I want to stress again: the model file, of which there are tens of thousands now, is completely separate from the image generation process.
You seem to want to ban the phonograph because you don't like the record album, the tape player because you don't like the cassette, the VCR because you don't like the VHS tape, the CD player because you don't like the CD.

It is difficult to get a man to understand something when his salary depends upon his not understanding it. - Upton Sinclair

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


KwegiboHB posted:

I think I see now. The part you're missing is that if Stability.ai were to close up shop today, absolutely nothing would change. The tools they released open-source cannot be taken back. Those tools include the ability to make a model that txt2img runs on. It's a shame I can't finish building my next computer until next month, but I fully plan on starting a diffusion model from scratch using only public domain images. What argument are you going to have against this?
I want to stress again: the model file, of which there are tens of thousands now, is completely separate from the image generation process.
You seem to want to ban the phonograph because you don't like the record album, the tape player because you don't like the cassette, the VCR because you don't like the VHS tape, the CD player because you don't like the CD.

It is difficult to get a man to understand something when his salary depends upon his not understanding it. - Upton Sinclair

Open source does not exempt you from copyright or from otherwise following the law, and if they fold, their dev stops, because fresh training runs (as opposed to fine-tunes) end with them.

SubG
Aug 19, 2004

It's a hard world for little things.

Bar Ran Dun posted:

You should remember I don’t consider tools to be infused with the meaning of what they were created for. I do not agree with the thesis of Audre Lorde about the master's tools. But I do think we should be informed by and should consider history and the abuse of intellectual tools when we think about how to use them!

Cool. I wouldn't say I disagree with any of this, but I really don't see how you get to the position you're advocating without using "developed for colonialism" as shorthand for "bad".

Like say you're in 11th Century England, and the three-field system is in the process of being introduced. We can use this (I hope) as a shorthand for the multiple refinements to agricultural technology in mediaeval Europe which were produced by what we now might call feudal power structures, and which we can now observe were absolutely essential for the expansion and stability of those power structures. Okay, if we posit that, how do we apply the precautions you're describing to this technology in this situation? Do we oppose the use of crop rotation (or the modern plow design, or, say, the wine press) because they have the effect of eliminating human labour? Because they have the effect of strengthening vassalage? Do we accept them in this situation but reject them in others (because minimising the amount of required human labour has a different meaning outside of market capitalism)?

Again, I understand the point you're making that in the abstract it's worthwhile to understand the history of things. I'm just not seeing the actual argument you're apparently trying to make predicated on this understanding.

Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly

Jaxyon posted:

So what if a company took your core gameplay, and ruleset, and spit out essentially that gameplay and ruleset but themed it for Harry Potter or Marvel, and made money off of it?

And it acknowledges it copied your game.

That's not a millionth of a tiny piece, that's a big chunk. This is where the board game and visual art analogy kind of breaks down, because of differences in discreteness and ease of remixing. I guess you could use a ChatGPT-style story analogy, where a large swath of your writing style or story tropes is used.

But in the above example, I would have no recourse whatsoever, because mechanics and rulesets are not copyrightable. If we want to use my story analogy instead, where let's say the core conceit of my story is used with a similar writing style, etc., then maybe I would start to feel imitated, but not necessarily ripped off. Once I release the cake, the exact recipe isn't known, but a wise human (or well-trained machine) can parcel out what it thinks are the essential characteristics of my cake and try to make a similar cake. And it's not like I have any lease on butter, eggs, and flour.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

Open source does not exempt you from copyright or from otherwise following the law, and if they fold, their dev stops, because fresh training runs (as opposed to fine-tunes) end with them.

Their dev stops. The rest of the world using it does not. The open source community has shown itself to be way more powerful than any one company, based on small inputs from many, many individuals.
The current metrics for 'from scratch' Stable Diffusion 2.0 models run at roughly one week and $50k to train on billions of images. I plan on taking much, much longer than one week and starting with only millions of images.
You will have your 'ethical set' if I have to personally shove it down your throat. What then?

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


KwegiboHB posted:

Their dev stops. The rest of the world using it does not. The open source community has shown itself to be way more powerful than any one company, based on small inputs from many, many individuals.
The current metrics for 'from scratch' Stable Diffusion 2.0 models run at roughly one week and $50k to train on billions of images. I plan on taking much, much longer than one week and starting with only millions of images.
You will have your 'ethical set' if I have to personally shove it down your throat. What then?

Yes, and open source communities will not survive well being driven off the trustworthy websites by deserved DMCA strikes, and you underestimate the core work. AI freaks will be recurring infestations that occasionally cause problems for real artists, like the Ehentai weirdos, but they'll soon learn to heed the takedowns like they do.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

Yes, and open source communities will not survive well being driven off the trustworthy websites by deserved DMCA strikes, and you underestimate the core work. AI freaks will be recurring infestations that occasionally cause problems for real artists, like the Ehentai weirdos, but they'll soon learn to heed the takedowns like they do.

You are going to DMCA strike public domain works? Good luck with that.

Bar Ran Dun
Jan 22, 2006




SubG posted:

using "developed for colonialism" as shorthand for "bad".

Tools are amoral. One can use a hammer to nail together a deck. One can use a hammer to hit another person in the head. Neither is intrinsic to the hammer. But if we think about possible uses of hammers we can talk about good and bad uses of the tool.

“Nothing is intrinsically good but good-will, nothing is intrinsically evil but ill-will” - Moral Man and Immoral Society

We recognize that it will get used for good and bad ends and we incorporate this into the suite of ideas about how we use the tool.

Bar Ran Dun fucked around with this message at 23:19 on May 26, 2023

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


KwegiboHB posted:

You are going to DMCA strike public domain works? Good luck with that.

The models aren't; they're infringement machines, and as soon as the law catches on, they're for the high jump.

SubG
Aug 19, 2004

It's a hard world for little things.

Bar Ran Dun posted:

Tools are amoral. One can use a hammer to nail together a deck. One can use a hammer to hit another person in the head. Neither is intrinsic to the hammer. But if we think about possible uses of hammers we can talk about good and bad uses of the tool.

“Nothing is intrinsically good but good-will, nothing is intrinsically evil but ill-will” - Moral Man and Immoral Society

We recognize that it will get used for good and bad ends and we incorporate this into the suite of ideas about how we use the tool.

I understand that's what you're saying now. So using "These are tools created as systems for colonialism and capitalism" as the mic drop at the end of a post was just... rhetorical flourish? Non sequitur?

I mean either way that's cool. It just seemed (to me at any rate) like it had something to do with where you immediately took discussion of expert systems, making the point that "they represent the removal of a human element", with the implication that this is undesirable.

Just trying to engage with your argument.


KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

The models aren't; they're infringement machines, and as soon as the law catches on, they're for the high jump.

https://vaisual.com/

The big difference is that I don't plan on charging for the model I end up making.
You're going to find yourself without an argument sooner than you think.
