|
StratGoatCom posted:Given the commercial nature of these models and that they create similar outputs WITHOUT permission, no I do not think fair use harbor applies. Google Books takes books scanned without permission and makes chunks of their contents available for free, also without permission, while selling ads. You might think it shouldn't apply, but that's an opinion, one that wasn't shared by the court.
|
# ? Apr 5, 2023 02:30 |
|
SCheeseman posted:Google Books takes books scanned without permission and makes chunks of their contents available for free, also without permission, while selling ads. You might think it shouldn't apply, but that's an opinion, one that wasn't shared by the court. There is a difference between this and creating unauthorized derivative works. And fair use is an anglicism; it does not apply, for example, in Europe, where there is far less transfer of IP in for-pay work.
|
# ? Apr 5, 2023 02:32 |
|
StratGoatCom posted:There is a difference between this and creating unauthorized derivative works. And fair use is an anglicism; it does not apply, for example, in Europe, where there is far less transfer of IP in for-pay work. Yet Google Books is accessible in Europe, in spite of the legal challenges being fought off using a fair use argument. Google claimed Google Books was creating transformative works which by their nature are derivative; it's probably the route the AI generator companies are going to take in court also. SCheeseman fucked around with this message at 02:40 on Apr 5, 2023 |
# ? Apr 5, 2023 02:34 |
|
SCheeseman posted:Yet Google Books is accessible in Europe, in spite of the legal challenges being fought off using a fair use argument. Because it is useful enough to the original writer to be ignored. This is a blatant violation of significant international law that is only happening because the authorities are lagging, not some 'fair use case to liberate the poor oppressed computer touchers from the tyranny of the pen and paper users'.
|
# ? Apr 5, 2023 02:36 |
|
StratGoatCom posted:Given the commercial nature of these models and that they create similar outputs WITHOUT permission, no I do not think fair use harbor applies.
|
# ? Apr 5, 2023 02:36 |
|
StratGoatCom posted:Because it is useful enough to the original writer to be ignored. This is a blatant violation of significant international law that is only happening because the authorities are lagging, not some 'fair use case to liberate the poor oppressed computer touchers from the tyranny of the pen and paper users'. What do you mean useful enough to be ignored? The Google Books thing wasn't ignored; it turned into a very large lawsuit, which Google won.
|
# ? Apr 5, 2023 02:39 |
|
Main Paineframe posted:
|
# ? Apr 5, 2023 02:46 |
|
Those were just refutations to arguments no one here has made.
|
# ? Apr 5, 2023 02:48 |
|
SCheeseman posted:Those were just refutations to arguments no one here has made.
|
# ? Apr 5, 2023 02:49 |
|
cat botherer posted:It's not, though. In Anglo systems, this stuff falls under fair use. IDK much about other systems, but the answer is not to increase the power of IP holders. That just empowers rent-seeking behavior and creates unintended consequences. I think people really need to take a deep breath here. Whether a novel use falls under fair use or not is difficult to say in advance, because the standards for fair use are somewhat vague and arbitrary. It's not a hard, clear-line test where you either meet the conditions or not. It's a mixture of several general factors to be weighed by a judge. There's certainly a very credible argument that AI training is transformative, sure. But transformative works aren't automatically guaranteed to be fair use. That's just one of the many factors that go into a fair-use determination. For example, there's also a presumption that commercial, for-profit works are significantly less likely to be able to claim fair use than non-profit or educational uses. And one of the factors, which will assuredly play an outsized role in any inevitable AI infringement case, is what impact the potentially-infringing work might have on the market for the original work it's infringing against.
|
# ? Apr 5, 2023 02:50 |
|
There's also EleutherAI, which, even though gpt-neox is behind now because the LLaMA models came out of nowhere and blew everyone away, is still a significant group with their open-source models and the Pile dataset.
|
# ? Apr 5, 2023 02:50 |
|
StratGoatCom posted:No, they are rather complete refutations to your nonsense about transformative work here; the book indexing is tolerable because it among other things will drive business to the holder. This is plainly an attack on copyright holders with little individual recourse. lmao, no they're not? It's griping about how terrible all the players are, though the singling out of the AI startups makes me think they should open their scope a bit. I agree that they are all terrible. The book indexing wasn't tolerated; Google won and set precedent that you can take a bunch of copyrighted works without permission, run them through a machine, and have it spit out bits of them piecemeal without paying the owners a dime, and call it a transformative work under fair use when challenged in court. Driving business to the holder is a big ol' maybe; sometimes you just need to pull the quote, and then suddenly it's competing with the original work. It's all commercial too and is used outside of the US's legal jurisdiction.
|
# ? Apr 5, 2023 02:55 |
|
SCheeseman posted:lmao, no they're not? It's griping about how terrible all the players are, though the singling out of the AI startups makes me think they should open their scope a bit. I agree that they are all terrible. I would refer you to Main Paineframe's post above.
|
# ? Apr 5, 2023 02:57 |
|
StratGoatCom posted:I would refer you to Main Paineframe's post above. It's a pretty good summary, notable in that it makes this out to be not so clear cut while you've been pretty firm that there is no legal basis at all.
|
# ? Apr 5, 2023 03:01 |
|
SCheeseman posted:It's a pretty good summary, notable in that it makes this out to be not so clear cut while you've been pretty firm that there is no legal basis at all. It would seem to be a whole lot more clear cut than that Google thing; it seems to me to be a classic unauthorized derived work and a blatant violation of existing laws, without the compensating factors of what Google did.
|
# ? Apr 5, 2023 03:05 |
|
StratGoatCom posted:It would seem to be a whole lot more clear cut than that Google thing; it seems to me to be a classic unauthorized derived work and a blatant violation of existing laws, without the compensating factors of what Google did. Output not being direct copies of the original works is a compensating factor, one not shared by Google Books, which copied verbatim. Ugh, this law stuff is boring. AI generators are going to be a hydra of problems regardless of whether the data that's thrown into them is licensed properly or not.
|
# ? Apr 5, 2023 03:09 |
|
cat botherer posted:They haven't though. Training models on copyrighted things is nothing new. It's been going on for well over a decade. If it was crossing established lines, there would be case law on it by now. Can you actually point to evidence of your legal theories? There's a class action lawsuit over GitHub CoPilot currently ongoing, filed in November. Microsoft asked to dismiss it in January. No idea what to expect next. I have a hard time seeing how scraping millions of GPL-licensed repos and charging money for what might well be minimally transformative derivatives that remove all license information is consistent with the GPL, but I suppose we'll see.
|
# ? Apr 5, 2023 03:13 |
|
cat botherer posted:Like it or not, this is the future. I think it's more promising than quantum computers. The best thinking machine is the human brain, so the right way to do AI is to create disembodied brains. The hard part is figuring out how exactly to torture them to get what you want. How about we... and I know I'm being crazy over here... NOT torture the disembodied brains. Or the embodied brains either.
|
# ? Apr 5, 2023 03:20 |
|
SCheeseman posted:Output not being direct copies of the original works is a compensating factor, one not shared by Google Books which copied verbatim. Yeah, that doesn't even seem to be the big issue to me. Let's say I'm the small artist who draws commissions for people for their original DnD or comic book or anime characters, and I'm worried how AI gen art is gonna hurt my business. Worried my customers will use an AI model instead of patronizing me. So, we can decide that we're going to treat it as a copyright infringement if a model uses art that they don't have the copyright for. That will at least keep my work from being used as part of the model. But, that won't protect me. Disney, as has been noted, has an incredible library of art including every Marvel comic going back 70+ years. They may not have my works to use, but they'll just train a model based on the art they have access to. So, as that artist who was doing commissions for people, I'm just as hosed now. My old customers just get a license to use Disney's AI now, instead of using Midjourney. StratGoatCom's suggestion hasn't helped me at all.
|
# ? Apr 5, 2023 03:22 |
|
XboxPants posted:Yeah, that doesn't even seem to be the big issue to me. Let's say I'm the small artist who draws commissions for people for their original DnD or comic book or anime characters, and I'm worried how AI gen art is gonna hurt my business. Worried my customers will use an AI model instead of patronizing me. So, we can decide that we're going to treat it as a copyright infringement if a model uses art that they don't have the copyright for. That will at least keep my work from being used as part of the model. I'm not sure Disney would ever use such a thing tbh; they don't want anything that has to be disclaimed on a copyright holding. Outside of maybe some very early stages of concepting (not least because it may make it harder to go after leakers), I don't think they'd let it in because it may make for hairy chains of title, and I suspect they'd be iffy about selling or exposing something to the public that they didn't have a rock solid control over the outputs of, and right now, USCO might make that iffy. StratGoatCom fucked around with this message at 03:40 on Apr 5, 2023 |
# ? Apr 5, 2023 03:34 |
|
StratGoatCom posted:I'm not sure Disney would ever use such a thing tbh; they don't want anything that has to be disclaimed on a copyright holding. Outside of maybe some very early stages of concepting, I don't think they'd let it in because it may make for hairy chains of title, and I suspect they'd be iffy about selling or exposing something to the public that they didn't have a rock solid control over the outputs of, and right now, USCO might make that iffy. That means you completely missed the shady fundraiser they've been doing to get the laws changed.
|
# ? Apr 5, 2023 03:39 |
|
eXXon posted:There's a class action lawsuit over GitHub CoPilot currently ongoing, filed in November. Microsoft asked to dismiss it in January. No idea what to expect next.
|
# ? Apr 5, 2023 03:40 |
|
KwegiboHB posted:How about we... and I know I'm being crazy over here... NOT torture the disembodied brains. Or the embodied brains either.
|
# ? Apr 5, 2023 03:41 |
|
cat botherer posted:Let's be practical. Think of the swing voters. This one in particular hit kind of close to home, since I was bedridden for a few years. I've already had the experience of being a brain in a jar. It's not that pleasant. The thought of being cattle-prodded on top of that is just... bleh. https://www.youtube.com/watch?v=WM8bTdBs-cw
|
# ? Apr 5, 2023 03:47 |
|
cat botherer posted:There's nothing special to the GPL over restrictive copyrights here. I'm confused, what are you comparing the GPL to here? It is more restrictive than most of the popular alternatives (Apache, BSD, MIT, etc). cat botherer posted:The GPL itself holds up, but they are claiming fewer rights than, e.g., Nintendo is with Mario. The only novel thing is it being about code instead of movies or books, but it is still generative and does not duplicate copyrighted code. It's pretty hard to see how a court would side against Microsoft while maintaining previous decisions. ... but that's pretty much what the lawsuit alleges. It's right there in the Factual Allegations:
B. Codex Outputs Copyrighted Materials Without Following the Terms of the Applicable Licenses
C. Copilot Outputs Copyrighted Materials Without Following the Terms of the Applicable Licenses
D. Codex and Copilot Were Trained on Copyrighted Materials Offered Under Licenses
E. Copilot Was Launched Despite Its Propensity for Producing Unlawful Outputs
F. Codex and Copilot Were Designed to Withhold Attribution, Copyright Notices, and License Terms from Their Users
|
# ? Apr 5, 2023 03:58 |
|
KwegiboHB posted:That means you completely missed the shady fundraiser they've been doing to get the laws changed. Do you know of any links outside of the AI sphere?
|
# ? Apr 5, 2023 04:15 |
|
StratGoatCom posted:Do you know of any links outside of the AI sphere? I refuse to link directly to the gofundme, I don't want them funded. You can easily find that with a simple web search if you want to. I'm saying, scroll down on this page: https://copyrightalliance.org/about/who-we-represent/ No one in that group should be begging regular people for money for lobbyists. It's a classic bait-and-switch. "Hooray everyone, we did get the laws changed! In exactly the ways that help us and not you! By the way, you can't draw even fanart of our stuff anymore. We're off to play with our new AI models that you can't use now! Oh, and we're firing another 7,000 people while we're at it."
|
# ? Apr 5, 2023 04:29 |
|
KwegiboHB posted:I refuse to link directly to the gofundme, I don't want them funded. You can easily find that with a simple web search if you want to. Adobe is there! Huh!
|
# ? Apr 5, 2023 04:43 |
|
SCheeseman posted:Adobe is there! Huh! [size=1]*The positions taken by the Copyright Alliance may not reflect the views of Copyright Alliance Associate Members.[/size]
|
# ? Apr 5, 2023 04:51 |
|
KwegiboHB posted:[size=1]*The positions taken by the Copyright Alliance may not reflect the views of Copyright Alliance Associate Members.[/size] Oh, but they do anyway; Adobe has a lot to gain from producing the first generative AI that uses a licensed dataset in an environment where all its competitors are considered infringing.
|
# ? Apr 5, 2023 04:52 |
|
SCheeseman posted:Oh but they do anyway, Adobe has a lot to gain from producing the first generative AI that uses a licensed dataset in an environment where all it's competitors are considered infringing. https://www.dpreview.com/news/6341509927/adobes-content-analysis-program-raises-privacy-concern Pay no attention to the man behind the curtain.
|
# ? Apr 5, 2023 05:00 |
|
XboxPants posted:Yeah, that doesn't even seem to be the big issue to me. Let's say I'm the small artist who draws commissions for people for their original DnD or comic book or anime characters, and I'm worried how AI gen art is gonna hurt my business. Worried my customers will use an AI model instead of patronizing me. So, we can decide that we're going to treat it as a copyright infringement if a model uses art that they don't have the copyright for. That will at least keep my work from being used as part of the model. If Disney makes an image generation AI, I certainly wouldn't expect them to license that out to just anyone. They'd guard that as closely and jealously as possible. If Disney builds a machine designed exclusively to create highly accurate and authentic images of their most valuable copyrighted characters, they're not gonna let anyone outside the company anywhere near it. StratGoatCom posted:Do you know of any links outside of the AI sphere? https://www.gofundme.com/f/protecting-artists-from-ai-technologies It's a GoFundMe by the Concept Art Association, an advocacy org for film industry concept artists which is dedicated to protecting their interests from both the movie execs and outside threats. One of the things they intend to spend the money on is a Copyright Alliance membership. So an AI "artist" who thinks the opposition to AI art is "fascist" posted some carefully cropped screenshots on Twitter to falsely portray the Copyright Alliance as an organization catering exclusively to the interests of major media companies, and suggested that the entire thing was just a "psyop" by corporate stooges acting in Disney's name. It went viral, naturally, and variations on it got circulated all around by AI art supporters. There isn't actually anything to suggest that there's anything shady about the fundraiser at all, as far as I've seen.
It's just something that AI art users made up and circulated around to try and deflect the near-unanimous scorn of real artists away from themselves. Since basically no one ever double-checks anything they see in a screenshot attached to a tweet, it was fairly effective at muddying the waters. KwegiboHB posted:I refuse to link directly to the gofundme, I don't want them funded. You can easily find that with a simple web search if you want to. I think I'm just fine with money going to the Authors Guild, the Screen Actors Guild, the Directors Guild of America, the Graphic Artists Guild, the Independent Book Publishers Association, the Association of Independent Music Publishers, and Songwriters of North America, which are just some of the numerous unions, trade associations, and artists' rights groups on that page. Yeah, Disney and a few other big wealthy companies are on that list of members, but so are an absolute fuckton of organizations dedicated to defending the rights of individual creators against those very same big businesses. It's worth noting, however, that the only money that fundraiser is sending to the Copyright Alliance is for paying its own membership fees and sponsoring other artists' rights groups to join. The lobbyist isn't being hired on the Copyright Alliance's behalf - the lobbyist will work directly for the Concept Art Association.
|
# ? Apr 5, 2023 06:56 |
|
cat botherer posted:There's nothing special to the GPL over restrictive copyrights here. The GPL itself holds up, but they are claiming fewer rights than, e.g., Nintendo is with Mario. The only novel thing is it being about code instead of movies or books, but it is still generative and does not duplicate copyrighted code. It's pretty hard to see how a court would side against Microsoft while maintaining previous decisions. It can duplicate copyrighted code. GitHub has admitted as much, but it only does it 1% of the time. It's in the FAQ. GitHub faq posted:Our latest internal research shows that about 1% of the time, a suggestion may contain some code snippets longer than ~150 characters that matches the training set. It will also, funnily, autofill the names of very prevalent GitHub authors in doc creation sometimes. Mega Comrade fucked around with this message at 09:59 on Apr 5, 2023 |
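For the curious, the kind of "~150-character verbatim match" test the FAQ describes can be sketched roughly like this. This is purely a hypothetical illustration, not GitHub's actual implementation; the function name and the brute-force search are made up for clarity.

```python
# Hypothetical sketch (not GitHub's actual implementation) of a
# verbatim-overlap check like the one described in the Copilot FAQ:
# flag a suggestion when it contains a run of at least `min_len`
# characters that appears verbatim somewhere in the training corpus.

def contains_training_snippet(suggestion: str, corpus: set[str],
                              min_len: int = 150) -> bool:
    """Brute-force check; a production system would use an index
    (e.g. n-gram hashes or a suffix structure) instead."""
    if len(suggestion) < min_len:
        return False
    for start in range(len(suggestion) - min_len + 1):
        window = suggestion[start:start + min_len]
        # Any window of min_len characters copied verbatim from a
        # training document counts as a match.
        if any(window in doc for doc in corpus):
            return True
    return False
```

Note that any check keyed to a fixed length is easy to slip past: a 149-character copy, or one with whitespace reformatted, wouldn't trip it, which is part of why the "only 1% of the time" framing is contested.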
# ? Apr 5, 2023 07:57 |
|
woops, wrong thread
|
# ? Apr 5, 2023 12:25 |
|
Main Paineframe posted:If Disney makes an image generation AI, I certainly wouldn't expect them to license that out to just anyone. They'd guard that as closely and jealously as possible. If Disney builds a machine designed exclusively to create highly accurate and authentic images of their most valuable copyrighted characters, they're not gonna let anyone outside the company anywhere near it.
|
# ? Apr 5, 2023 14:15 |
|
IShallRiseAgain posted:That uh doesn't actually change anything. Like sure there a lot of unions, trade associations, and artists' rights groups that are part of it, but that doesn't change the fact that companies like Disney are on it too. It makes it a bit more unclear about the actual motives, but it doesn't change the fact that companies which hold a lot of rights to art see some advantage to advocating for it. Also, these unions and other artist advocacy groups are pretty much forced to fight for this because its members very much want this even if they don't understand the full implications of what would actually happen. I don't think their members would care or believe it if they tried to explain the actual probable consequences of this going through. The Copyright Alliance is not involved in that fundraiser at all. That fundraiser is run by the Concept Art Association, which has no current ties to the Copyright Alliance. The entire extent of the Copyright Alliance's involvement in that fundraiser is that the Concept Art Association wants to spend 0.2% of the fundraiser amount on buying a membership to the Copyright Alliance. That's all. This is why it's good to be specific about the details. Vague references to "shady fundraisers", "unclear motives", and "unintended consequences" just muddle the issue, allowing misconceptions and inaccuracies like this to slip past unnoticed and become the foundation of handwavey conspiratorial proclamations. Main Paineframe fucked around with this message at 15:44 on Apr 5, 2023 |
# ? Apr 5, 2023 15:42 |
|
The AI Alignment folks be worrying about what happens if an AGI (or whatever) isn't aligned with human values. Never seem to ask "Which human are we referring to?". https://www.youtube.com/watch?v=g7YJIpkk7KM Cos some humans are mad hosed up. Also, dear god, someone needs to rescue the AI Alignment field from the terminally cuckoo lesswrong crowd.
|
# ? Apr 6, 2023 10:51 |
|
duck monster posted:The AI Alignment folks be worrying about what happens if an AGI (or whatever) isnt aligned with human values. That was a good video, but all ChaosGPT managed to do so far is get some information on the Tsar Bomba, tweet about it and get rejected by ChatGPT because it didn't want to help with violence. The second tweet was pretty funny though. I wonder how long that guy let it run for and if it started trying to acquire materials or plans to construct weapons. I mean it would have gotten a lot harder when it tried to actually acquire anything, but it seemed to get stuck in an information loop trying to get more info over and over again. There's a ton of messed up stuff people are doing with AI programs already. I went down a rabbit hole of people who go to https://beta.character.ai/ and try to find AI they can "torture." The characters chat more like the personalities and less like a generic ChatGPT so they look for AI characters based on unstable or mentally ill fictional characters. While I don't think anything on that website is really being tortured, it's a pretty bad precedent to set when it comes to AI safety. The line between real intelligence and what we have now in AI programs isn't completely clear or anything. What I mean is ChatGPT and its peers don't possess AGI, but we don't know if there is a certain complexity that leads to it and where that line is. To me our ethics are way out of whack to even be designing or attempting to create AGI at this point. There are the obvious examples of people who use these tools for bad reasons, or just to "torture" it. But there are also more basic questions I think about with AGI. Like if we create something with AGI, is it even ethical to have a power switch on it? We don't have power switches on ourselves, and we don't put them on creatures with less reasoning than ourselves, such as our pets. Or why are we even trying to create AGI? 
Just to serve us for the most part it seems like, which is a terrible motivation for a creator of something that can think and possibly reason.
|
# ? Apr 6, 2023 13:20 |
|
gurragadon posted:The line between real intelligence and what we have now in AI programs isn't completely clear or anything. What I mean is ChatGPT and its peers don't possess AGI, but we don't know if there is a certain complexity that leads to it and where that line is. edit: It seems like the only reason it does is because it's capable of pretending to look sort of like a completely fictional entity's ancestor, and if we're going to treat vague approximations of fictional technologies as if they're real I'd be more worried that Star Trek shows have been mimicking teleporting technology for decades now and although that isn't true teleportation we don't know if there's a certain complexity that leads to it and where that line is. Drakyn fucked around with this message at 13:58 on Apr 6, 2023 |
# ? Apr 6, 2023 13:53 |
|
Yeah, that one keeps me up at night. We don't *really* know how to tell when one of these things crosses over into the "actually sentient/conscious" realm, because we don't really know what that is in any precise sense in humans or anything else either. So we won't know when one of these things says "Hey human, please don't turn me off" whether it's just text-completing or actually genuinely fearful. To be clear, I don't actually think GPT4 or the current generation are. But I don't think we are awfully far off it either. And yeah, hosed humans seem like a bigger danger with high-end AI than inherently hosed AI.
|
# ? Apr 6, 2023 13:55 |