|
https://twitter.com/billyperrigo/status/1615682180201447425?t=Of2xkkmurvbHJkKwUjKq_w&s=19 Here's AI empowering the commoners!
|
# ? Jan 18, 2023 22:02 |
|
|
# ? May 30, 2024 01:54 |
|
OpenAI, the least open one of the lot, the one funded by Elon Musk, is bad????????
|
# ? Jan 18, 2023 22:22 |
|
I can't imagine it would be that hard for commercial AI platforms to put in a "copyrighted material" check when they output something, much the way YouTube et al. do it. "How similar is this to a copyrighted image in our training set?" surely can't be that hard to train a model on. That won't suffice if they decide you can't use copyrighted images in training, though (IMO, that would be completely absurd).
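For what it's worth, the cheap version of that check already exists: perceptual hashing. Here's a minimal sketch in pure Python (all names are made up for illustration; a real platform would use a proper library and an index over billions of hashes):

```python
# Sketch of an "average hash" (aHash) similarity check, the kind of
# cheap test a platform could run between an output image and its
# training set. Works on 2D lists of grayscale values (0-255), e.g.
# an 8x8 downscale of the image. Illustrative only, not any real
# platform's API.

def average_hash(pixels):
    """pixels: 2D list of grayscale values, e.g. an 8x8 downscale."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image mean.
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def looks_like_training_image(output_pixels, training_hashes, threshold=5):
    """True if the output is perceptually close to any training hash."""
    h = average_hash(output_pixels)
    return any(hamming(h, t) <= threshold for t in training_hashes)
```

The catch is that diffusion models rarely emit near-duplicates of training images, so a hash match only catches the most blatant cases; anything subtler would need embedding-based similarity instead.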
|
# ? Jan 18, 2023 22:23 |
|
Sir Mat of Dickie posted:I can't imagine it would be that hard for commercial AI platforms to put in a "copyrighted material" check when they output something, much the way YouTube et al. do it. "How similar is this to a copyrighted image in our training set?" surely can't be that hard to train a model on. That won't suffice if they decide you can't use copyrighted images in training, though (IMO, that would be completely absurd).

Then don't provide the training images: keep building on them, and say you destroyed them after training, so even you aren't sure what it was trained on.
|
# ? Jan 18, 2023 22:24 |
I too am shocked a corporation outsources AI tech work to the same places corporations outsource other tech work to. They'll keep doing it until they have an AI good enough to not need the outsourcing anymore. Same as always.
|
|
# ? Jan 18, 2023 22:26 |
|
Sourdough Sam posted:https://twitter.com/billyperrigo/status/1615682180201447425?t=Of2xkkmurvbHJkKwUjKq_w&s=19 Not saying that this stuff isn't worth investigating and reporting, but this is the same as in literally every industry.
|
# ? Jan 18, 2023 22:43 |
|
deep dish peat moss posted:Presumably only one of those people went through the copyright registration process for it. You don't have to register to own a copyright to your work, but if you don't and then someone else creates an identical work and registers it and you can't prove that it's derivative of yours, they get the copyright. You don’t register Copyright. You’re thinking of Trademark. Anything you produce is automatically copyrighted the moment you produce it.
|
# ? Jan 18, 2023 22:46 |
|
You do in fact register copyright: https://www.copyright.gov/registration/ You don't have to, but doing so provides extra protections and makes it easier to pursue claims and prove yourself as the original creator.

e: from much later: Essentially, if you do not register a copyright and it gets infringed, you must prove in court that you are the original creator and own the copyright, which you may or may not be able to do depending on the circumstances. If you do register and then it gets infringed, you are already on record as the owner of the copyright and you do not need to fight (as much of) a legal battle in court to prove yourself. You own the copyright either way, but registering it is like doing all of the court legwork ahead of time - it's insurance that if you take someone to court over copyright infringement, you will win unless they can conclusively prove that their work is not derivative. And if you and another artist create an identical work at the same time, only one of you can be the officially-registered copyright holder, so if you can afford the registration fee it is best to register any copyright that you feel may be infringed.

I don't know if this is "the rule" or whatever, but it's the precedent I've seen in reading about lots of copyright cases: Unregistered copyright = onus is on the copyright holder to prove ownership. Registered copyright = onus is on the infringing party to prove ownership. (Again, I am not a lawyer, but I've been reading about this stuff since I was 13 because I got into editing Megaman sprites back then.)

deep dish peat moss fucked around with this message at 00:09 on Jan 19, 2023 |
# ? Jan 18, 2023 23:10 |
|
Lol Google tried building a big database of human-curated sentiment analysis of English text to train their robot on. They outsourced the tagging to lowest-bidder non-native speakers, with the result that 70% of their human-tagged data was wrong.
|
# ? Jan 18, 2023 23:10 |
|
Fitzy Fitz posted:Not saying that this stuff isn't worth investigating and reporting, but this is the same as in literally every industry. The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again!
|
# ? Jan 18, 2023 23:13 |
|
Sourdough Sam posted:The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again! Yeah capitalism is a real bitch, what exactly are you trying to say that is new?
|
# ? Jan 18, 2023 23:48 |
|
KakerMix posted:Yeah capitalism is a real bitch, what exactly are you trying to say that is new? Art was supposed to be safe from the evil AI so they could keep mocking people who did jobs as they lost theirs to the machines
|
# ? Jan 19, 2023 00:03 |
|
Posting will be the last true art, and it's not looking good for us either.
|
# ? Jan 19, 2023 00:07 |
|
Sourdough Sam posted:Posting will be the last true art, and it's not looking good for us either. lol where did you go to posting school and who was your professor. You should ask them for your tuition back, clearly
|
# ? Jan 19, 2023 00:13 |
|
Sourdough Sam posted:The difference is once the necessary people are exploited, made obsolete, and tossed aside you'll have your free culture and entertainment machine to make your favorite thing on demand in perpetuity and you won't have to think about these people who plied a trade ever again!

Anyone with a true creative vision will still have a true creative vision that persists in the age of AI. There is always going to be a person driving the AI machine, and how that person steers is always going to affect the output. There is always going to be a clear, discernible difference in the creative output of an auteur and your "average artist", whether it's an AI or a human pushing the pencil. If you think that some kind of "art AI" that creates a full-fledged, distinct, original, minimum viable consistent product on its own without trained human guidance is realistic within our lifetimes, you don't have much experience with AI image tools.

I understand that it sucks for the working-class artists who are not widely recognized as the authors of their own works, and that it is going to affect their livelihoods, but pretending it's about artistic integrity is absurd. What you're frustrated with is the fact that the worker is a replaceable cog in a capitalist system, not some kind of siren song of artistic integrity. Artistic integrity isn't going anywhere - corporate technical artist jobs are what AI is taking away from us. And it's going to happen to everyone, not just artists.

I listened to the Greg Rutkowski podcast, and literally the only reason he has a problem with any of this is because "by greg rutkowski" is such a common prompt. He's concerned that e.g. images show up with his signature on them, which only happens because people put "by Greg Rutkowski" in the prompt - which is exactly what a signature on an image is, a way of saying "By Greg Rutkowski".

If he just talked to MJ and had them ban his name from prompts he would have no tangible complaint (ok ok, he would still have a tangible complaint against SD). But instead of taking the time to understand the technology and realize that the images have his signature because people are literally asking for his signature in the prompt (which tbf is not his fault, but it's because most people using these tools are still like 6 months behind on prompt-writing technique), he champions the disney IP lawyer anti-AI gofundme.

e: Then again, I have never had any interest in creating derivative works with AI. Literally from the day I first started poking at it in 2019, I've been focused on creating distinct, unique styles that can feel like they are "mine", and I get that the vast majority of its current audience are there specifically because they want to create derivative works. So I just, as an artist, fundamentally cannot see the problem with tools that allow people to create derivative works, because just off the top of my head, here are some other tools that allow people to create derivative works: pencils, pens, MSPaint, Photoshop, mosaic tiles, crayons, markers, Google Image Search, laserjet printers, drawing with their fingers in the sand, collecting and repurposing old soda cans, spraypaint, etc.

Anyway, as an aside, here's some things I made today: (made with text prompt alone, no artist names or existing media mentioned, etc) (same, but image prompted it with my own art)

deep dish peat moss fucked around with this message at 00:47 on Jan 19, 2023 |
# ? Jan 19, 2023 00:18 |
The AIs could have just been trained on public domain, CC0, or licensed/commissioned images. That would be ethical, and there'd be no legal issues either. Nobody's work being used without consent, compensation, or credit. That's what artists are asking for. Those systems would still be hugely powerful and interesting. I'm not sure what to make of people against the "ethical art AI" idea. It seems to boil down to "but those ethical AIs would be worse, I already have the version with millions (billions?) of pieces of work scraped against their creators' will, I want to keep benefiting from their work/time/labor/craft. I am entitled to it."
|
|
# ? Jan 19, 2023 00:55 |
|
deep dish peat moss posted:If he just talked to MJ and had them ban his name from prompts he would have no tangible complaint, but instead of taking the time to understand the technology and realize that the images have his signature because people are literally asking for his signature on the images in the prompt (which tbf is not his fault but it's because most people using these tools are still like 6 months behind on prompt-writing technique) he champions the disney IP lawyer anti-AI gofundme deep dish peat moss posted:because just off the top of my head here are some other tools that allow people to create derivative works: pencils, pens, MSpaint, photoshop, mosaic tiles, crayons, markers, Google Image Search, laserjet printers, drawing with their fingers in the sand, collecting and repurposing old soda cans, spraypaint, etc.
|
# ? Jan 19, 2023 01:04 |
|
deep dish peat moss posted:Then again I have never had any interest in creating derivative works

Yeah, my whole deal with these tools is creating faux photos; I like seeing 'real' things: cars, machines, buildings. Lots of others want to make anime real, which is cool, but me specifically, I'm invoking 'feelings'. I don't invoke artists or specific works, just objects, angles, film types, color gamuts.

Prolonged Panorama posted:The AIs could have just been trained on public domain, CC0, or licensed/commissioned images. That would be ethical, and there'd be no legal issues either. Nobody's work being used without consent, compensation, or credit. That's what artists are asking for. Those systems would still be hugely powerful and interesting.

If I look at, and take the time to draw, something based on the style of someone else, while never trying to recreate specific works or claiming my works are theirs, that's OK, right? Shameless, tasteless, sure, but legal and clearly A-OK with how the status quo currently is. If the robot does it, why is that different? This is where your argument breaks down for me, because you are assuming these AI models are somehow unethical, where I don't think they are because of how they functionally work. You will never get a copy of someone's work out of these machines without overtraining a model on a specific work. Remember, "style" is not protected, just specific works.

I also know that lots of people complaining about these AI systems fundamentally have zero idea how they work, and most of the time go in the opposite direction and think they work in ways they don't actually. You go ahead and make an 'ethical' dataset and people will still be screaming about it, because they do not understand how the systems work. It's a big whole angry twitter shitshow, that's all. Finally, the text ones are far, far more terrifying; how come nobody is screaming about those?
|
# ? Jan 19, 2023 01:19 |
|
All news worldwide has been screaming about ChatGPT for over a month now.

Tunicate posted:Lol google tried building a big database of human-curated sentiment analysis of English text to train their robot on. They outsourced the tagging to lowest-bidder nonnative speakers, with the result that 70% of their human tagged data was wrong.
|
# ? Jan 19, 2023 01:28 |
|
Prolonged Panorama posted:The AIs could have just been trained on public domain, CC0, or licensed/commissioned images.

It was. People need to read the ToS of the places that hosted their images for free for decades.
|
# ? Jan 19, 2023 01:40 |
|
Ruffian Price posted:All news worldwide has been screaming about ChatGPT for over a month now

Yeah, but where is the PASSION. Here is a thing that I guess is unethical because it's obviously ____________'s work.

I've actually been drawing (lol) more than I ever have been thanks to this drawing tablet. Being able to zoom in and regenerate sections is really good for stuff like the controller, which was generated on top of this image using the Krita plugin and inpainting, and it allows you to render 'full res' in smaller spaces. The whole image of the box was generated, then futzed with just a bit; then I masked off a section down there, MS-Painted a controller in real rough fashion, then generated on top of that little section. This works really well for faces as well.

EDIT: not happy with those vents on the side
|
# ? Jan 19, 2023 01:56 |
|
Tree Reformat posted:We're ultimately grappling with the societal and legal implications of the CSI "Enhance Image" button being made real.

Pixel phones and Google Photos already offer the option to upscale/edit/remove people from your photos. The next step is inserting advertising. Like, if you have a Coca-Cola can in the background of your photo, it'll be easy enough to ensure that the logo will always be visible. If you take a picture at a public event and share it online, all the billboards will show products that align with each of your relatives' AdSense interests; your niece will see ads for Roblox while your grandmother will see ads for Big Buck's Buttplug Bonanza Outlet.
|
# ? Jan 19, 2023 02:01 |
|
I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up. Like, there's a way to give/take permission from crawlers, robots.txt, it doesn't have to be obeyed and it's not perfect but in this case it literally was. But no one knows what robots.txt is (they should, if they're gonna post poo poo on the internet) and this gets a little too close to me going on a rant about how we should all start using gopher again and stuff.
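For anyone who hasn't seen one: robots.txt is just a plain-text file of per-crawler allow/deny rules, and Python's standard library can evaluate it. A sketch of how a site could have opted out of Common Crawl specifically - the robots.txt content and URLs here are made up for illustration, though CCBot is Common Crawl's real user-agent name:

```python
# Evaluate a robots.txt policy against Common Crawl's crawler (CCBot)
# using only the stdlib. The policy below bars CCBot from /gallery/
# while leaving every other crawler unrestricted.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: CCBot
Disallow: /gallery/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# CCBot is barred from the gallery but allowed elsewhere;
# other crawlers are allowed everywhere.
print(rp.can_fetch("CCBot", "https://example.com/gallery/art.png"))  # False
print(rp.can_fetch("CCBot", "https://example.com/about.html"))       # True
```

Whether a crawler honors the file is voluntary, which is the whole point of the post above: Common Crawl did honor it.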
|
# ? Jan 19, 2023 02:33 |
|
Capitalism is the problem. They're also all too scared to fight the closed AI companies, because those are filled with scary lawyers, and will instead go after Stability, because they gave it out for free to everyone instead of hoarding it like Real Capitalists.
|
# ? Jan 19, 2023 02:57 |
KakerMix posted:Yeah like my whole deal with these tools are creating faux photos, I like seeing 'real' things, cars, machines, buildings. Lots of others want to make anime real which is cool but with me specifically I'm invoking 'feelings', I don't invoke artists or specific works, just objects, angles, film types, color gamuts.

Sounds like you're at the lower end of this scale then: I found myself dropping lower and lower the more I used midjourney, the more I heard from artists, and the more I learned about the training datasets.

KakerMix posted:If I look at, and take the time to draw something, based on the style of someone else, while never trying to recreate specific works or claiming my works are theirs, that's ok, right? Shameless, tasteless, sure, but legal and clear A-OK with how the status quo currently is. If the robot does it though why is that different? This is where your argument breaks down with me because you are assuming these AI models are somehow unethical, where I don't think they are because of functionally how they work. You will never get a copy of someone's work out of these machines without overtraining a model on a specific work.

The statistical methods used to train the models aren't themselves unethical; I'm with you on that. (Though I'd argue a human doesn't learn the way these systems appear to, not even close.) The unethical part is the non-consensual training set. People's work has been incorporated into a commercial product without their knowledge, and (now that they know about it) against their wishes. With how these models work, you can't "opt out" once the training is complete. Once trained, the AIs don't learn, and they can't forget, either.

We can argue about how much any one training image impacts the future outputs of the model. But I don't see how you can make the argument that if the model were re-trained on an opt-in, public-domain-only data set, it would not behave differently. That difference in the function (and let's be honest, quality) of the tool is the impact copyrighted works have had on it. So even when you're not invoking specific artists, their work is shaping the output. They helped define the latent space you're exploring.

BARONS CYBER SKULL posted:people need to read the ToS of the places that hosted their images for free for decades.

So if it's on pinterest it's public domain? This is news to me.
|
|
# ? Jan 19, 2023 03:01 |
|
BrainDance posted:I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up. robots.txt doesn't have anything to do with how something is licensed
|
# ? Jan 19, 2023 03:04 |
|
It'll be interesting to see media companies go apeshit when this stuff evolves to the point where someone can say something like "Gossip Girl vs Xenomorphs directed by Alfred Hitchcock" and get a watchable feature-length movie.
|
# ? Jan 19, 2023 03:14 |
|
BrainDance posted:I want to scream "common crawl obeyed robots.txt, this has been the standard for giving or taking consent from crawlers for nearly 30 years. You agreed to a ToS that allowed this or literally didn't do the thing in place to say you didn't want your art to get scraped if you self hosted" every time the issue of artist permission comes up.
|
# ? Jan 19, 2023 03:15 |
|
taqueso posted:robots.txt doesn't have anything to do with how something is licensed

It doesn't, though there's a good chance there's no license in most western countries under which anything Common Crawl, LAION, or SD did would be a violation of it. Otherwise every license would include "one weird trick: you can't use this for anything ever" and no one could ever make anything Disneyesque. But it does give or take consent to crawlers, and it was obeyed in this situation.

steckles posted:if a particularly prescient creator included text saying "The contents of this site can not be used to train a neural network" but still let their site be scraped, then including their work in a training set would be a clear violation of the license granted to the visitors of that site.

Common Crawl identifies itself as ccbot, and I'm really skeptical over whether a specific "no neural networks" clause would be in any way enforceable - and not just because it isn't, but because it shouldn't be. In the same way "fair use does not apply to this content" wouldn't and shouldn't be enforceable.

BrainDance fucked around with this message at 03:20 on Jan 19, 2023 |
# ? Jan 19, 2023 03:16 |
|
Prolonged Panorama posted:Sounds like you're at the lower end of this scale then: I'm at a solid 2.5 on that one
|
# ? Jan 19, 2023 03:26 |
The model doesn't contain anyone's art. It doesn't have art in it at all. It outputs data that looks like art.
|
|
# ? Jan 19, 2023 03:36 |
|
Prolonged Panorama posted:Sounds like you're at the lower end of this scale then:

I'm not going to claim I am "an artist", even though I sure spend money on art-related things like one, and even went to an art college for art things and made lots of things that could be considered art, and, not that it matters, have lots of real professional artist friends. They can have that title. That chart, by existing at all, puts "artists" against "not artists", because it assumes there is a negative to using AI generation at all, where I flat out disagree with that assessment. I wasn't stating "I don't even use artists so I'm one of the good ones"; I was stating that how I use the tool goes against what the Gregs or Sarahs would claim. I do not think that an artist whose image was used in the training data has the right to ask my permission to use these AI tools, in the same way that I can look at their images and use my own hands to try to learn to draw in their style without asking their permission.

Specifically with Stable Diffusion: it's over, it isn't going back in, and people move so, so fast. There are models that people make for fun, for free, on literally anything you can dream of, and they do it as jokes - like the anti-AI thing that swept through twitter a couple weeks back; someone made a model that just generated versions of those. You can completely kill Stable Diffusion, dissolve the company, and nothing at all changes. It's out, it's open source, it's in people's hands, and they know how to make things like this. It is way, way too late.

I don't think people were claiming that their works are in the public domain, just that it really strengthens the tech companies' argument that they didn't break any rules or laws, since they were hosting, filing, holding the files these models were trained on. The ToSs agree that they can be used for whatever the company wants, including training. Go check out Adobe's license agreement people were freaking out about recently - turns out you gave them permission to use your anonymized data for training purposes, ~whoops~.

Essentially, the art isn't in the files, the product. I cannot extract anyone's art from anything at all in the Stable Diffusion files; no matter how many 2-8 GB models I download, no matter how many latent diffusion things I shove in, I have to generate images. They aren't there until the black box goes boop and spits something out. This is why I don't think these artists have any legs to stand on: their art was looked at by a machine's "eyes", but it wasn't ingested.

Sedgr posted:The model doesn't contain anyone's art. It doesn't have art in it at all. It outputs data that looks like art.

Ironic, since if what these AIs spit out isn't art, then how can they replace the human artists that make real art?
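The "the art isn't in the files" point can be put as rough arithmetic: divide a checkpoint's size by the training-set size and you get the average storage available per image. These are ballpark public figures, not exact counts:

```python
# Back-of-the-envelope: how much model weight exists per training
# image for a Stable Diffusion-class model? Figures are rough public
# ballpark numbers (a ~4 GB checkpoint, a LAION-2B-scale image set),
# chosen for illustration, not exact measurements.
model_bytes = 4 * 1024**3          # ~4 GB checkpoint on disk
training_images = 2_300_000_000    # ~2.3 billion image-text pairs

bytes_per_image = model_bytes / training_images
print(f"{bytes_per_image:.2f} bytes of model weight per training image")
# Under 2 bytes per image - far too little to store even a thumbnail,
# which is why verbatim reproduction needs an overtrained/duplicated image.
```

The caveat from earlier in the thread still applies: heavily duplicated images in the training set can be memorized, which is the overtraining case.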
|
# ? Jan 19, 2023 03:55 |
|
BrainDance posted:I'm really skeptical over whether a specific "no neural networks" clause would be in any way enforceable, and not just because it isn't, but because it shouldn't be.

One of the reasons these AIs have been released so widely is that there isn't a big, litigious industry group, like the RIAA or MPAA, claiming to represent millions of online artists, who are interested in exploring the limits of fair use through expensive legal action. I'm not saying the RIAA or MPAA are at all noble or good entities, or that we need something like that for artists, but the lack thereof has been an enabling factor in the forgiveness-not-permission attitude that the companies behind these AIs have taken.

steckles fucked around with this message at 04:05 on Jan 19, 2023 |
# ? Jan 19, 2023 03:58 |
Prolonged Panorama posted:Sounds like you're at the lower end of this scale then:

"Trains the model with living artists" is probably like a 12 on that scale, but at least I barely post my results. I just find it so interesting to download a couple of images, throw them into the mix and ta-dah, magic happens.
|
|
# ? Jan 19, 2023 04:20 |
|
Cabbages and Kings posted:Why is getty suing stability for internet scraping that LAION did, in some cases years ago?

LAION is a dataset of URLs and textual descriptions of the images found at those URLs. It does not contain the actual pictures. So if someone is training an AI with the LAION dataset, they have to scrape all the images themselves and feed them into the AI. This means the people training the AI are closer to potential copyright infringement than the people who made the LAION dataset originally.
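To make that concrete, here's a sketch of what consuming a LAION-style dataset looks like: the rows hold only URLs and captions, so the downloading happens on the trainer's side. Field names and sample rows are invented for illustration, not the actual LAION schema:

```python
# A LAION-style dataset is rows of (image URL, caption); whoever
# trains a model must fetch the pictures themselves. The rows below
# are made-up examples, not real dataset entries.
import urllib.request

laion_style_rows = [
    {"url": "https://example.com/img/001.jpg", "caption": "a red bicycle"},
    {"url": "https://example.com/img/002.jpg", "caption": "mountain lake at dawn"},
]

def fetch_training_pairs(rows):
    """Yield (image_bytes, caption) pairs; the scraping happens here,
    not in the dataset itself."""
    for row in rows:
        try:
            with urllib.request.urlopen(row["url"], timeout=10) as resp:
                yield resp.read(), row["caption"]
        except OSError:
            # Dead links and takedowns are common; real pipelines log and skip.
            continue
```

Dead links are also why two people "training on LAION" at different times can end up with different actual image sets.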
|
# ? Jan 19, 2023 04:23 |
KakerMix posted:Ironic, since if what these AIs spit out isn't art, then how can they replace the human artists that make real art?

I think it's art. The AI manipulates the data, and the person wielding the text prompt is the impetus. If the end result breaks copyright, it's the human that is to blame.
|
|
# ? Jan 19, 2023 04:29 |
|
steckles posted:That hinges entirely upon whether or not training a neural network for commercial purposes counts as fair use and I don't think that's clear cut at all. Training a totally open source network that was licensed in such a way to preclude using its output for commercial purposes could be fair use. When someone's work goes into training something that a for-profit company is hoping to monetize, the question of whether or not to respect the artist's desires should be answered.

The reality is that no matter what, the artists aren't winning this fight. Even if they by some miracle manage to completely purge Midjourney/DALL-E/Stable Diffusion from the internet, it's not going to stop Disney or some other corporation from training on all of the copyrighted images they have the rights to and creating their own AI. The only thing they can destroy is the general public's easy access to AI, and realistically they are just going to slow it down. Also, process is not copyrightable. The only thing that matters is the end result as far as fair use is concerned.
|
# ? Jan 19, 2023 04:29 |
|
steckles posted:One of the reasons these AIs have been released so widely is that there isn't a big, litigious industry group, like the RIAA or MPAA, claiming to represent millions of online artists, who are interested in exploring the limits of fair use through expensive legal action.

About 15 years ago, mashups went from being a niche genre to huge when Girl Talk blew up. Even with the RIAA, he avoided getting sued, even though his work was entirely built off sampling - every part of every track on Feed The Animals was built from other tracks, this many tracks. A physical CD got released, for profit. And he did shows. He didn't get sued, from what I remember; he said in interviews it was all because it was fair use. Part of the reason why was that sampling had been tested in court and was common, but there's definitely a "this is different" vibe to mashups, and I remember there being some legal issues with people not wanting to host his stuff, but that was more out of fear that it would turn into a legal thing, if I remember right.

The RIAA couldn't get Girl Talk, even being the RIAA. And mashups are far more directly the source material than anything an AI does. I guess AI will have to be tested in court too, but barring some big change in copyright law, I am optimistic it would turn out in AI's favor. And even elsewhere, I don't know of a single country where "this clause supersedes local law" is enforceable in a contract.

BrainDance fucked around with this message at 04:32 on Jan 19, 2023 |
# ? Jan 19, 2023 04:29 |
|
It will come down to tech conglomerates vs Disney and no matter the outcome will be worse than if everyone just looked away and let it quietly happen
|
# ? Jan 19, 2023 04:36 |
|
|
Thread needs more art. Cool art.
|
|
# ? Jan 19, 2023 04:44 |