Tei
Feb 19, 2011

What is art? Art is dying of hunger.


AI presents some risks, and as of now it is driven by the worst type of people our society has produced: the libertarian cryptobro and the libertarian CEO. These people are basically sociopathic.

The way things like AI art have been created and presented is to assassinate (replace) artists, by stealing artists' work.

Of course artists are annoyed and scared. It's not only the dying-of-hunger thing, that one is old; it's the disrespect, and stripping them of the only thing they have: their authorship.


All the "democratizing the process" talk is a lie. The average person will not have the means (hardware) or the knowledge (how to install and configure it) to create their own AI art thing. Only those with the tools can do it. We are not seeing the art of Random Person, we are seeing the art created by Corporation X. Anyone can download GIMP and paint something, but it is much more complicated to have your own Midjourney that you trained yourself on your own dataset of images and configured with your own ideas. It is very likely that AI art will become a service controlled by huge corporations the size of Microsoft or Google. Democratization my rear end. Something that was free before (art) will be controlled by Benevolent GODS, like Microsoft, who will dictate what is possible and what is not. If they choose to ban the word "jew" or "cross" from prompts, you will be unable to make stuff with them.


KillHour
Oct 28, 2007


Tei posted:

What is art? Art is dying of hunger.


AI presents some risks, and as of now it is driven by the worst type of people our society has produced: the libertarian cryptobro and the libertarian CEO. These people are basically sociopathic.

The way things like AI art have been created and presented is to assassinate (replace) artists, by stealing artists' work.

Of course artists are annoyed and scared. It's not only the dying-of-hunger thing, that one is old; it's the disrespect, and stripping them of the only thing they have: their authorship.


All the "democratizing the process" talk is a lie. The average person will not have the means (hardware) or the knowledge (how to install and configure it) to create their own AI art thing. Only those with the tools can do it. We are not seeing the art of Random Person, we are seeing the art created by Corporation X. Anyone can download GIMP and paint something, but it is much more complicated to have your own Midjourney that you trained yourself on your own dataset of images and configured with your own ideas. It is very likely that AI art will become a service controlled by huge corporations the size of Microsoft or Google. Democratization my rear end. Something that was free before (art) will be controlled by Benevolent GODS, like Microsoft, who will dictate what is possible and what is not. If they choose to ban the word "jew" or "cross" from prompts, you will be unable to make stuff with them.

This sounds like a compelling argument, but it doesn't reflect the reality of the situation at all. Stable Diffusion is open source and there are open source UIs that make setting it up not much more difficult than GIMP.

https://github.com/AUTOMATIC1111/stable-diffusion-webui

You don't need to train an entire model to get your own style. LoRA is very effective at fine tuning the model to get certain outputs, and they are easy enough to make that there are literally thousands and thousands of them made by individuals.

https://huggingface.co/blog/lora

Here's a marketplace for LoRA (warning :nws: - a lot of these are porn because of course they are): https://civitai.com/

The idea that customizing a diffusion model without being beholden to a big company is out of reach of an individual artist is just not true.
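
For what it's worth, using one of those community LoRAs locally is only a few lines with the diffusers library. This is just a rough sketch - the LoRA folder and file name are placeholders, and the webui makes it even simpler (buttons, no code):

code:
# Rough sketch: load Stable Diffusion locally, then layer a downloaded LoRA on top.
# Assumes the diffusers library; "./loras/my_style_lora.safetensors" is a placeholder path.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# A LoRA is just a small set of extra weights applied on top of the base model.
pipe.load_lora_weights("./loras", weight_name="my_style_lora.safetensors")

image = pipe("a lighthouse at dusk, oil painting").images[0]
image.save("lighthouse.png")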

Tei
Feb 19, 2011

KillHour posted:

This sounds like a compelling argument, but it doesn't reflect the reality of the situation at all. Stable Diffusion is open source and there are open source UIs that make setting it up not much more difficult than GIMP.

https://github.com/AUTOMATIC1111/stable-diffusion-webui

You don't need to train an entire model to get your own style. LoRA is very effective at fine tuning the model to get certain outputs, and they are easy enough to make that there are literally thousands and thousands of them made by individuals.

https://huggingface.co/blog/lora

Here's a marketplace for LoRA (warning :nws: - a lot of these are porn because of course they are): https://civitai.com/

The idea that customizing a diffusion model without being beholden to a big company is out of reach of an individual artist is just not true.

No, you are wrong, dear KillHour, my friend.

poo poo like this:



code:
export MODEL_NAME="runwayml/stable-diffusion-v1-5"
export OUTPUT_DIR="/sddata/finetune/lora/pokemon"
export HUB_MODEL_ID="pokemon-lora"
export DATASET_NAME="lambdalabs/pokemon-blip-captions"

accelerate launch --mixed_precision="fp16"  train_text_to_image_lora.py \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --dataset_name=$DATASET_NAME \
  --dataloader_num_workers=8 \
  --resolution=512 --center_crop --random_flip \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --max_train_steps=15000 \
  --learning_rate=1e-04 \
  --max_grad_norm=1 \
  --lr_scheduler="cosine" --lr_warmup_steps=0 \
  --output_dir=${OUTPUT_DIR} \
  --push_to_hub \
  --hub_model_id=${HUB_MODEL_ID} \
  --report_to=wandb \
  --checkpointing_steps=500 \
  --validation_prompt="Totoro" \
  --seed=1337
This is going to live in the cloud, managed by programmers and system administrators, not artists. It will be a service, with a corporation setting the rules.
Even if, right now, somebody can download that poo poo.
And even if, right now, somebody can download and understand this poo poo well enough to run it, it is going to be some guy with a technical side, not some random person.

Edit:
wow, the links you provided are pretty cool, thanks!

Tei fucked around with this message at 17:56 on May 18, 2023

KillHour
Oct 28, 2007


I run it just fine on my local PC. I don't need anyone's permission or a big company to do so. As for technical competency - you need to learn any tool. A command prompt is not some scary, impossible barrier. Using GIMP is a technical skill just as much as installing Python is. Moving a paintbrush takes training and practice to do well, just as creating a LoRA does. If you want to make cutting-edge digital art, you need to know how to use a computer.

That image you posted is absolutely irrelevant. You don't need to understand all of that to train or use a model, any more than you need to understand linear algebra to use a Sobel filter in GIMP.

https://en.wikipedia.org/wiki/Sobel_operator#Technical_details

It's included in that link because that is a technical explanation of what LoRA is and how it works. Understanding it is cool if you're into that, but so is a deep dive on the chemistry of paint mixing, and most artists absolutely do not need to fully understand it.
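
To make the Sobel point concrete, this is about all the "math" a user actually touches when they apply one through an off-the-shelf library (a quick sketch with numpy/scipy/imageio; "photo.png" is a placeholder filename):

code:
# Quick sketch: apply a Sobel edge filter without touching the linear algebra behind it.
# Assumes numpy, scipy and imageio are installed; "photo.png" is a placeholder filename.
import numpy as np
import imageio.v3 as iio
from scipy import ndimage

img = iio.imread("photo.png").astype(float)
if img.ndim == 3:                # collapse RGB to grayscale
    img = img.mean(axis=2)

gx = ndimage.sobel(img, axis=1)  # horizontal gradient
gy = ndimage.sobel(img, axis=0)  # vertical gradient
edges = np.hypot(gx, gy)         # gradient magnitude = the "edges"
iio.imwrite("edges.png", (edges / edges.max() * 255).astype(np.uint8))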

Edit

Tei posted:

Edit:
wow, the links you provided are pretty cool, thanks!

You're welcome. Seriously, check out the AI Art thread in GBS to at least see what individuals are doing with it today. There's lots of talk there about experiments they are doing and individuals absolutely are making a direct impact and discovering new things. It's very cool.

https://forums.somethingawful.com/showthread.php?threadid=4000251

You don't have to agree that it's good, but please at least take some time to understand the reality of how individual artists are really pushing the tools.

Double Edit: I should point out that the initial implementation of LoRA into Stable Diffusion was something done by a single person in the community. Obviously, not everyone is going to be able to just hack new features into SD like that; but in the same way, not everyone is able to make a plugin for GIMP. The point is that it's individuals who are driving a lot of this in an open source way, and that's really cool.

https://github.com/cloneofsimo/lora

KillHour fucked around with this message at 18:22 on May 18, 2023

Less Fat Luke
May 23, 2003

Exciting Lemon

Tei posted:

No, you are wrong, dear KillHour, my friend.

poo poo like this:
This is going to live in the cloud, managed by programmers and system administrators, not artists. It will be a service, with a corporation setting the rules.
Even if, right now, somebody can download that poo poo.
And even if, right now, somebody can download and understand this poo poo well enough to run it, it is going to be some guy with a technical side, not some random person.

Edit:
wow, the links you provided are pretty cool, thanks!
There are installers for the "Automatic1111" Stable Diffusion package that have everything self-contained in an easy web interface, including the training, and there are already zillions of guides on how to do it. There will definitely be paid and free services offering this, but a lot of nerds have worked quite hard to make it very accessible. It's honestly really cool to see how fast the community builds things on it.

Edit: The pace of the local LLM work is really crazy too. I was using a local model named Vicuna in a slick web interface which didn't quite fit into my video card's memory, so I grabbed some regular RAM to play with it, and by the time the memory arrived people had figured out how to quantize the model so that it did fit in the 24GB of GPU space (like 4 days after its release or something).
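
For the curious, the "quantize it so it fits" part is mostly a config flag these days. A rough sketch assuming the transformers + bitsandbytes stack (the model id is just an example, and the exact options change fast):

code:
# Sketch: load an LLM in 4-bit so it fits in consumer GPU memory.
# Assumes transformers, accelerate and bitsandbytes are installed; model id is an example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "lmsys/vicuna-13b-v1.5"  # example model
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb, device_map="auto"
)

prompt = "Explain LoRA in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(out[0], skip_special_tokens=True))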

Less Fat Luke fucked around with this message at 18:25 on May 18, 2023

KillHour
Oct 28, 2007


Less Fat Luke posted:

a lot of nerds have worked quite hard to make it very accessible.

You can tell a lot about a group of nerds based on how much effort they put into making something accessible to non-nerds. The SD community has made an insane amount of progress in a very short amount of time.

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

Tei posted:

This is going to live in the cloud, managed by programmers and system administrators, not artist. It will be a service, with a corporation setting the rules.
Even if just now somebody can download that poo poo.
And even if just now somebody can download and understand this poo poo enough to run it, is going to be some guy with some technical side, not some random person.

Edit:
wow, the links you provided are pretty cool, thanks!

Lots of programmers and system administrators are artists, and a surprising number of them spend their time building tools to help other artists who aren't as technically capable just to do it. Most artists are also perfectly capable of learning some programming and technical skills - a great many of the artists I know have done just that specifically to realize some artistic vision or other. Are they experienced developers? Of course not. But they know enough to bash their way through to what they want... or they collaborate with those who do. Because every successful artist I know is also part of a community - none of them are operating in isolation.

I'm not going to argue AI is going to be a net positive for artists' careers - I think it will hurt them quite badly - I just think a lot of the anger is misdirected, and it's efforts like the above that will be hurt in favour of tighter copyrights and corporate control.

BrainDance
May 8, 2007

Disco all night long!

Tei posted:


poo poo like this:

You do not need to do any of this to fine-tune a model, and almost no one does. You can - it's how I do it (I use the JoePenna repo for dreambooth because I have a 4090, dammit, and I'm gonna use it) - but the vast majority of people making dreambooth models now don't write the little scripts, which actually aren't even hard because you're just copying and pasting and then changing a couple of things.


Most dreambooth training now is done in a little GUI in auto1111, though there are a bunch of others, some standalone, which involve just pressing some buttons and are no harder than any other graphic design tool, and much easier than learning how to use Photoshop.

If you want to talk confidently about AI you really should go learn more about where things are actually at in the AI world and how this stuff actually works.

I even wrote some guides on all this stuff - some are on my website and some I posted on SA - that are just long "mostly copy and paste what I do, I'm holding your hand through the complicated stuff" walkthroughs to get people training their own language models (which has become infinitely easier since I wrote my guide, actually, but I still think it's important to understand the nuts and bolts if you can).

And there are tons of other people doing the same. All the crazy nerd stuff? The nerds are doing it for you, because we all recognized how massively important accessibility is for the open source models.

BrainDance fucked around with this message at 23:51 on May 18, 2023

America Inc.
Nov 22, 2013

I plan to live forever, of course, but barring that I'd settle for a couple thousand years. Even 500 would be pretty nice.
Small diversion in topic:
I find it funny that people need to be reminded that ChatGPT doesn't always provide factual information, given that growing up with the Internet you knew that you shouldn't trust everything you find on Google and Wikipedia.

In fact, given that most people don't click through more than the first two pages of a search when these queries have thousands or millions of results, search engines were already mostly returning garbage with a few actually "good" results on top provided by companies who are good at SEO. Search engines have always been more of an advertising tool than a knowledge tool. A search engine provides you with a shelf of different items of information that compete and whose value is not in truth but in utility. There is only one answer to the question "what is 1+1?" but there's lots of different ways to make pasta. For this reason I predict Google Bard is going to be a flop.

E: Google was apparently aware of this already, which is why they were more conservative in launching in the first place.

Wikipedia isn't perfect but it's curated by people who are usually experts on a given topic.

Which leads into a question: what would be the AI equivalent of Wikipedia? That is to say, how can you provide a curated repository of knowledge that also provides answers in a conversational format?

America Inc. fucked around with this message at 10:46 on May 19, 2023

Clarste
Apr 15, 2013

Just how many mistakes have you suffered on the way here?

An uncountable number, to be sure.
I feel like the hard part would be training it to understand what a "fact" is as opposed to just mixing random parts of wikipedia together into equally unfactual statements. Like, the problem isn't that it's been fed lies or is uncurated, it's that it fundamentally has no idea what truth even is. Even if you somehow magically fed it a list of curated "facts," it's not clear how its natural language function is supposed to even process that input.

America Inc.
Nov 22, 2013

I plan to live forever, of course, but barring that I'd settle for a couple thousand years. Even 500 would be pretty nice.

Clarste posted:

I feel like the hard part would be training it to understand what a "fact" is as opposed to just mixing random parts of wikipedia together into equally unfactual statements. Like, the problem isn't that it's been fed lies or is uncurated, it's that it fundamentally has no idea what truth even is. Even if you somehow magically fed it a list of curated "facts," it's not clear how its natural language function is supposed to even process that input.

I agree, it's not possible to have someone verify every possible question that users could ask, and at best you'll have a system that warns "this may not be true" and then links to a Wikipedia article. There's not much value add.

Ultimately it's all about whether AI can help people be more creative - or not.

E: and it's interesting because what do you need for creativity? 1. knowledge of prior works, 2. good tools, and 3. experience and taste. The internet provides the first and computers the second, and AI is just combining both for humans who have the third.

America Inc. fucked around with this message at 11:02 on May 19, 2023

Blut
Sep 11, 2009

if someone is in the bottom 10%~ of a guillotine

America Inc. posted:


Wikipedia isn't perfect but it's curated by people who are usually experts on a given topic.

Which leads into a question: what would be the AI equivalent of Wikipedia? That is to say, how can you provide a curated repository of knowledge that also provides answers in a conversational format?

This is unfortunately a very optimistic assessment of wikipedia. There are far too many articles on it written by people who aren't remotely expert, they just have too much free time and a fetish for wikipedia edit wars.

https://www.theguardian.com/uk-news/2020/aug/26/shock-an-aw-us-teenager-wrote-huge-slice-of-scots-wikipedia

It's a lot closer to AI in its "confidently incorrect" style in more places than you'd think.

SaTaMaS
Apr 18, 2003

America Inc. posted:

Small diversion in topic:
I find it funny that people need to be reminded that ChatGPT doesn't always provide factual information, given that growing up with the Internet you knew that you shouldn't trust everything you find on Google and Wikipedia.

In fact, given that most people don't click through more than the first two pages of a search when these queries have thousands or millions of results, search engines were already mostly returning garbage with a few actually "good" results on top provided by companies who are good at SEO. Search engines have always been more of an advertising tool than a knowledge tool. A search engine provides you with a shelf of different items of information that compete and whose value is not in truth but in utility. There is only one answer to the question "what is 1+1?" but there's lots of different ways to make pasta. For this reason I predict Google Bard is going to be a flop.

E: Google was apparently aware of this already, which is why they were more conservative in launching in the first place.

Wikipedia isn't perfect but it's curated by people who are usually experts on a given topic.

Which leads into a question: what would be the AI equivalent of Wikipedia? That is to say, how can you provide a curated repository of knowledge that also provides answers in a conversational format?

As far as LLMs go, this is controlled by "temperature". A lower temperature setting results in more predictable, conservative responses, while a higher temperature setting results in more varied, "creative" responses. Though even a low temperature setting can still produce incorrect responses if the model is poorly trained.
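
In practice it's literally just a knob you pass at generation time. A small sketch with the transformers library ("gpt2" is only used here because it's tiny):

code:
# Sketch: sample the same prompt at low vs. high temperature.
# Assumes the transformers library; "gpt2" is just a small example model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tok("The capital of France is", return_tensors="pt")

for temp in (0.2, 1.5):  # low = conservative and repetitive, high = more random
    out = model.generate(**inputs, do_sample=True, temperature=temp, max_new_tokens=20)
    print(temp, tok.decode(out[0], skip_special_tokens=True))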

Count Roland
Oct 6, 2013

I believe LLMs are poor at logic. Dealing with facts requires the AI to state that things are true or false. Such statements can be logically manipulated, i.e. if x is true then y. A model that is guessing the next symbols in a phrase will sometimes pull this off but can't itself be reliable. The AI needs to do logical operations, which I assume is possible, given how logic-based computing is.

cat botherer
Jan 6, 2022

I am interested in most phases of data processing.

Count Roland posted:

I believe LLMs are poor at logic. Dealing with facts requires the AI to state that things are true or false. Such statements can be logically manipulated, i.e. if x is true then y. A model that is guessing the next symbols in a phrase will sometimes pull this off but can't itself be reliable. The AI needs to do logical operations, which I assume is possible, given how logic-based computing is.
You're kind of touching on symbolic AI or expert systems, which was really the first way of doing AI before statistical methods took over. They're really useful in some problem domains. I think figuring out how to integrate the two approaches will be a big deal in the future, given that they tend to be good at complementary things.

SaTaMaS
Apr 18, 2003

cat botherer posted:

You're kind of touching on symbolic AI or expert systems, which was really the first way of doing AI before statistical methods took over. They're really useful in some problem domains. I think figuring out how to integrate the two approaches will be a big deal in the future, given that they tend to be good at complementary things.

DeepMind actually found a way to do this for AlphaFold: they figured out how to get the neural network to treat atomic physics as an immutable ground truth when figuring out how proteins fold.

parthenocarpy
Dec 18, 2003

Wolfram has also put a lot of work into creating and maintaining properties of matter that can be treated as factual information by whatever system accesses them. You can have a loop that goes [User input] -> [Creative response draft] -> [Wolfram fact checking] -> [Final response].
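
In pseudocode that loop is about this simple. Everything below is a stand-in - there's no real wolfram_check() or llm_draft() API, it's just the shape of the pipeline:

code:
# Hypothetical sketch of the [User input] -> [Creative draft] -> [Wolfram fact check] -> [Final] loop.
# The three helper functions are stand-ins, not real APIs.

def llm_draft(user_input: str) -> str:
    return f"Draft answer to: {user_input}"      # stand-in for a creative LLM call

def wolfram_check(draft: str) -> list[str]:
    return []                                    # stand-in: return any claims the fact source disputes

def llm_revise(draft: str, problems: list[str]) -> str:
    return draft + " (revised)"                  # stand-in for an LLM rewrite with corrections

def answer(user_input: str, max_rounds: int = 3) -> str:
    draft = llm_draft(user_input)                # creative response draft
    for _ in range(max_rounds):
        problems = wolfram_check(draft)          # fact-check the draft
        if not problems:
            break                                # nothing flagged: accept the draft
        draft = llm_revise(draft, problems)      # rewrite with corrections and check again
    return draft                                 # final response

print(answer("How far away is the Moon?"))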

Tei
Feb 19, 2011

I remember studying a 4GL language with reasoning qualities at university, one that could be used to query information. LOGOL or something, I forget what it was called.

Smiling Demon
Jun 16, 2013

Tei posted:

I remember studying a 4GL language with reasoning qualities at university, one that could be used to query information. LOGOL or something, I forget what it was called.

Do you mean Prolog?

I've always hated Prolog.

Hashy
Nov 20, 2005

SCheeseman posted:

. A technology that uses open source software based on publicly available research that runs on general computing hardware available to consumers has no chance of being stuffed back into a box.
There are a lot of things that are easy to do and reproduce, even some that make a lot of money, that we regulate against and shame people for.

SCheeseman
Apr 23, 2003

Hashy posted:

There are a lot of things that are easy to do and reproduce, even some that make a lot of money, that we regulate against and shame people for.
I don't think it's practical to heavily regulate machine learning, given its already pervasive use in a bunch of benign applications (and less benign but desirable to government and business). The greater point was that anti-AI advocates' solutions that use copyright as a fix are a long-term self-own that inhibits the general public's open access to the technology, which they might not care about because they don't want to use it, but probably will care once they're forced to use it to keep their jobs and have to pay rent for access to it.

Has there ever been a form of automation technology, anything truly useful, that was inhibited long term by political and social pressure? Not a rhetorical question.

SaTaMaS
Apr 18, 2003

SCheeseman posted:

I don't think it's practical to heavily regulate machine learning, given its already pervasive use in a bunch of benign applications (and less benign but desirable to government and business). The greater point was that anti-AI advocates' solutions that use copyright as a fix are a long-term self-own that inhibits the general public's open access to the technology, which they might not care about because they don't want to use it, but probably will care once they're forced to use it to keep their jobs and have to pay rent for access to it.

Has there ever been a form of automation technology, anything truly useful, that was inhibited long term by political and social pressure? Not a rhetorical question.

In the UK the Red Flag Act was passed in 1865, which required a man to walk in front of every automobile waving a red flag to warn pedestrians and horse-drawn vehicles. It wasn't repealed until 1896.

Count Roland
Oct 6, 2013

As this predates general use of the internal combustion engine, I looked around to see what this law actually regulated.

Steam Powered Road Locomotives

Private Speech
Mar 30, 2011

I HAVE EVEN MORE WORTHLESS BEANIE BABIES IN MY COLLECTION THAN I HAVE WORTHLESS POSTS IN THE BEANIE BABY THREAD YET I STILL HAVE THE TEMERITY TO CRITICIZE OTHERS' COLLECTIONS

IF YOU SEE ME TALKING ABOUT BEANIE BABIES, PLEASE TELL ME TO

EAT. SHIT.


Not even unique to the UK:

Independent posted:

In 1896, a law was proposed in Pennsylvania requiring all drivers of horseless carriages, "upon chance encounters with cattle or livestock to (1) immediately stop the vehicle, (2) immediately and as rapidly as possible disassemble the automobile, and (3) conceal the various components out of sight behind nearby bushes until equestrian or livestock is sufficiently pacified."

More recently, drones got heavily regulated in many places to stop recreational use (only a few of the new regulations apply to commercial use, where overflights and filming are a necessity given what they're used for).

Then there are things like Prohibition, abortion, weed regulation, etc. Or even modern copyright extensions, for that matter.

Regulations are not inherently good.

e: Some other examples are the US banning one of the better sweeteners based on an old, flawed study, and regulations requiring the use of wire nuts in electrical wiring, which have a significantly higher failure rate due to installation errors than the clamp connectors used in Europe. There are also regulations which limit passenger rail in the US in favour of commercial freight.

There's also regulatory capture where regulations are effectively written by big business to protect their interests, which very much seems like it would be the case here.

Private Speech fucked around with this message at 03:14 on May 22, 2023

SCheeseman
Apr 23, 2003

SaTaMaS posted:

In the UK the Red Flag Act was passed in 1865, which required a man to walk in front of every automobile waving a red flag to warn pedestrians and horse-drawn vehicles. It wasn't repealed until 1896.

It went from this to "get off the loving road and make room for cars or you'll get a fine" as soon as vehicles became accessible and useful.

Private Speech
Mar 30, 2011

I HAVE EVEN MORE WORTHLESS BEANIE BABIES IN MY COLLECTION THAN I HAVE WORTHLESS POSTS IN THE BEANIE BABY THREAD YET I STILL HAVE THE TEMERITY TO CRITICIZE OTHERS' COLLECTIONS

IF YOU SEE ME TALKING ABOUT BEANIE BABIES, PLEASE TELL ME TO

EAT. SHIT.


SCheeseman posted:

It went from this to "get off the loving road and make room for cars or you'll get a fine" as soon as vehicles became accessible and useful.

Only in the US; even now you can walk along the roads in the UK.

It might not be the best idea, but the US especially went really hard on "jaywalking".

Also, it took a long time for the regulations to go away, not really "as soon as".

Private Speech fucked around with this message at 03:15 on May 22, 2023

SCheeseman
Apr 23, 2003

Private Speech posted:

Only in the US, even now you can walk along the roads in the UK.

It might not be the best idea, but US especially went really hard on "jaywalking".

Also it took a long time for the regulations to go away, not really as soon as.

Jaywalking is a thing in Australia too, it wasn't a US-only phenomenon.

Goa Tse-tung
Feb 11, 2008

;3

Yams Fan

Private Speech posted:

Regulations are not inherently good.
Neither is not regulating.

My proposal is to make entities that seek to profit from using others' works pay. If, say, DC wants to create a Powergirl cover in the style of Jen Bartel, you simply gotta pay her. Make a law that requires training data to be public and force AI generated content to point to that data. You use = you pay, simple as that.

Liquid Communism
Mar 9, 2004

коммунизм хранится в яичках

Count Roland posted:

I believe LLMs are poor at logic. Dealing with facts requires the AI to state things are true or false. Such statements can be logically modified ie if x is true then y. A model that is guessing the next symbols in a phrase will sometimes pull this off but can't itself be reliable. The AI needs to do logical operations. Which I assume is possible, given how logic-based computing is.

They're not so much poor at it as incapable of it. There's no logical processing going on, just a query of what is most likely to be the right response based on what is related in their training data.

It's the same thing for fiction writing: it's honestly more work to make anything beyond a very short LLM response readable than to write your own rough draft, because the model is incapable of context and confidently wrong often enough that you're going to lose a ton of time editing for continuity, factuality, and catching plagiarism.

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

Goa Tse-tung posted:

Neither is not regulating.

My proposal is to make entities that seek to profit from using others' works pay. If, say, DC wants to create a Powergirl cover in the style of Jen Bartel, you simply gotta pay her. Make a law that requires training data to be public and force AI generated content to point to that data. You use = you pay, simple as that.

If you decide to cover artistic styles with copyright, you are loving over an absolutely massive number of artists forevermore. Just something worth keeping in mind. Giving an individual a nearly indefinite monopoly over not just the pieces of art they created but an entire style of art is exactly the sort of bullshit I was talking about people pushing for earlier.

Also, I have no idea what you mean by "force AI generated content to point to that data" and I'm pretty sure you don't either.

Clarste
Apr 15, 2013

Just how many mistakes have you suffered on the way here?

An uncountable number, to be sure.
They mean force the program to spit out which pieces of training data it is referencing, something like a bibliography. I'm not sure that's practical if it's using like millions of things at a time, but I don't think the concept is hard to understand. As for "copyrighting styles," you seem to be assuming that we would have to apply it to humans too, who of course often base their styles on others or do parody works. But that's not the case if we assume that the AI isn't actually generating anything but simply putting existing pictures into a blender in some complicated manner. If a piece of art was "copied" by the AI, which would be clear if it published a bibliography of its references, then the original artists would be owed money. That was the idea.

Edit: This isn't even a philosophical question like "what is art" or whatever, it's a legal question about which laws we want to apply to AIs, and which we don't. There's no reason to be "fair" about it.

Clarste fucked around with this message at 12:58 on May 22, 2023

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

Clarste posted:

They mean force the program to spit out which pieces of training data it is referencing, something like a bibliography.

They don't reference things, though? Once the AI is created and trained it doesn't have any of the original artwork to reference, and conceptually it doesn't make any sense anyhow. That's not how these things work.

Clarste
Apr 15, 2013

Just how many mistakes have you suffered on the way here?

An uncountable number, to be sure.
Then force it to keep track of that? Rewrite the programs so they have to. If they can't, then it's illegal. Boom, solved.

Like, humans can and do read a bunch of stuff and synthesize it without keeping track of where each individual idea came from, but we still require them to cite their sources to avoid cries of plagiarism. Just because they don't do it now doesn't mean they shouldn't.

Clarste fucked around with this message at 13:11 on May 22, 2023

Tree Reformat
Apr 2, 2022

by Fluffdaddy
Yeah, let's be clear, such a system in practice would have to spit out its entire training set every time it generated literally anything.

Much like how government-mandated backdoors effectively criminalize consumer-facing encryption, this proposal would effectively criminalize all consumer-facing ai that relies on human-created data for training. It's a ban on the technology (at least at the scale we're seeing), which is the proponents' actual goal.

Clarste
Apr 15, 2013

Just how many mistakes have you suffered on the way here?

An uncountable number, to be sure.
I mean, I don't think that's a secret? It's not like some ulterior motive. It's just a straightforward "if AI cannot meet this minimum standard of copyright protection, then it should be illegal." Just like "if this industry cannot function without killing 10 children a day, it should be illegal." There is no, like, legal or moral obligation for these things to exist.

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

Clarste posted:

Then force it to keep track of that? Rewrite the programs so they have to. If they can't, then it's illegal. Boom, solved.

Like, humans can and do read a bunch of stuff and synthesize it without keeping track of where each individual idea came from, but we still require them to cite their sources to avoid cries of plagiarism. Just because they don't do it now doesn't mean they shouldn't.

If your goal is to ban the tech, just ban the tech; don't wrap it up in a dressing that could potentially have serious knock-on effects and hurt real artists.

Clarste posted:

Like, humans can and do read a bunch of stuff and synthesize it without keeping track of where each individual idea came from, but we still require them to cite their sources to avoid cries of plagiarism. Just because they don't do it now doesn't mean they shouldn't.

We do not require humans to cite their stylistic sources.

Clarste posted:

I mean, I don't think that's a secret? It's not like some ulterior motive. It's just a straightforward "if AI cannot meet this minimum standard of copyright protection, then it should be illegal." Just like "if this industry cannot function without killing 10 children a day, it should be illegal." There is no, like, legal or moral obligation for these things to exist.

You specifically said there is no copyright element involved in this proposal - that was how it would avoid hurting the careers of existing artists. That it would be an additional, AI-only regulation unrelated to copyright, and that it would serve as a de facto ban because it's custom-tailored to be impossible. This is why I'm sure any such proposal would hurt artists - if you can't keep it from becoming a copyright issue within the span of a few minutes, how could we trust legislators to?

Rogue AI Goddess
May 10, 2012

I enjoy the sight of humans on their knees.
That was a joke... unless..?
Banning all new technologies by default unless there is a moral imperative for them to exist and unless they have no impact on the status quo is certainly a stance, but not a realistically enforceable one.

Regulation aside, I would not be surprised if "AI-free" works became the creative equivalent of organic, GMO-free, no pesticides/antibiotics food, with the extra cost of hiring humans passed on to the consumers.

Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly
So I was pretty firmly in the "AI models synthesize their training data into new images/outputs in much the same way a human artist pulls inspiration from various sources, in a manner we're already cool with" mindset, but I have been reckoning more with this "creative merit added" aspect recently. A helpful framework for me has been likening it to music, where we have a pretty established tradition of remixes and mashups with clear (and sometimes not so clear) lineages. How are remixes and (perhaps the more applicable analogy) mashups of songs treated by copyright law? Very cursory Googling and a bit of common sense reveal discussions of the "artistic merit" of a remix or mashup apart from its source material, which allows it to be legal under fair use.

What is the artistic merit of what an AI model does to its training data, which in most cases at the moment is basically a random diffusion process whereby noise is iteratively "improved" to the point of resembling something the model has seen before?
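
(For reference, the "noise is iteratively improved" loop really is about this short. A rough sketch with the diffusers library; the model id is just an example of a small unconditional model:)

code:
# Rough sketch of reverse diffusion: start from pure noise and let the model
# iteratively "improve" it. Assumes the diffusers library; model id is an example.
import torch
from diffusers import DDPMScheduler, UNet2DModel

repo = "google/ddpm-cat-256"                  # example pretrained denoiser
model = UNet2DModel.from_pretrained(repo)
scheduler = DDPMScheduler.from_pretrained(repo)
scheduler.set_timesteps(50)

sample = torch.randn(1, 3, 256, 256)          # start: pure Gaussian noise
for t in scheduler.timesteps:                 # walk the noise schedule backwards
    with torch.no_grad():
        noise_pred = model(sample, t).sample  # model predicts the noise still in the image
    sample = scheduler.step(noise_pred, t, sample).prev_sample  # remove a little of it
# after the last step, `sample` resembles something the model was trained on (here, cats)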

Monglo
Mar 19, 2015
Can't we just say it has "artistic merit" without defining it? Feels like what we do with all art.


GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.
AI works that mimic existing pieces closely enough to be considered the visual equivalent of a remix or a mashup seem like a question more likely to be handled on the distribution side, the same way it is right now. Basically, using an AI to do the heavy lifting of producing a work too similar to one that already exists won't serve as any sort of defense.

With the recent Warhol case going the way it did, and many of the recent music cases, "artistic merit" is basically irrelevant so long as it's commercial work; all that will likely matter is similarity and whether it competes with the original artist.

There are already systems built into the AI tools that make such things less likely - everyone involved already has an incentive not to produce works that are too similar to existing pieces under the existing rules.
