SCheeseman
Apr 23, 2003

I don't think many want to chat about it; tensions are way too high and it's fracturing communities. Anyone whose job and livelihood is threatened by it, or who sees it as an affront to humanity, is mostly interested in ways to crush it. Pretty understandable; the capitalist powers that be will take this technology and use it in all the ways people fear it will be used.

But figuratively trashing the looms has never worked. AI is the end point of humankind's reliance on tool use; the problems we're grappling with now started when cavemen sharpened their flint to the point they could carve rock and/or skulls. The best we can do is manage it, something we've had a shaky history of doing, particularly as of late, thanks to a society that predominantly values accumulation of capital over human wellbeing (while those in ivory towers try in vain to equate the two). An AI ban might be technically possible and enforceable, but not when every world government wants this AI thing to happen, and given a social and political system with truly humanistic values, automation wouldn't be a problem anyway.

It's the rich people. They are the baddies.

SCheeseman fucked around with this message at 14:49 on Mar 24, 2023

SCheeseman
Apr 23, 2003

I disagree that it has to be an external force that stops us when we are entirely capable of nuking ourselves to death; the alternative is to organize our society so that whatever us dumbasses make isn't given the opportunity to help destroy us.

As for bans, it depends on the implementation. Preventing use by corporations is definitely possible, but that does nothing to stop criminal orgs or individuals, particularly now that the software runs on consumer hardware. It also has the potential to slow down industry growth relative to countries that might choose to accelerate AI R&D instead; hello, national security concerns. I agree that I don't see it happening, not with the billionaire class at war with each other in a sort of geopolitical Mexican standoff.

idk, if AI eventually heralds true post scarcity by inventing star trek poo poo, cool beans, but at the moment it does essays and spits out pretty pictures. Not to say I don't think people will find ways to use the technology in broader ways with greater impact, but for now, post scarcity is still sci-fi.

SCheeseman
Apr 23, 2003

SaTaMaS posted:

GPT-4 can use external tools like a calculator, so a star trek computer is theoretically possible, but people are still working on the real-world implementations

I meant more in the sense of replicators rather than API integrations. Post scarcity isn't post scarcity until all people can eat and live without dependence on labour or privilege.

SCheeseman
Apr 23, 2003

I would personally prefer that nuclear weapons were not improved.

Post scarcity is typically associated with the scarcities required for human survival, like food and shelter, not cultural artifacts or luxuries, and those being made redundant are rightfully unhappy that what allowed them to live their lives became "post scarcity" before actual post scarcity was a thing.

I get that AI is useful; I've spent quite a bit of time thinking and talking to others about cool applications of the tech and defending use of it as a means of expression. I also agree it's inevitable; the horse has bolted, and at this point society must adjust to the technology. Which is why all the anger laid on the tech, and particularly on those who want to use it, is a red herring; shouldn't the aim be higher up the totem pole? Choosing to make staff redundant is a choice made by a human with a gun pointed at their head by shareholders.

If I'm hoping for any outcome, it's one where AI brings about a wave of socialism. Not that I think it's likely.

SaTaMaS posted:

Is this thread supposed to be about what's happening currently or speculations about things that probably won't ever happen?

It was an extreme example. Maybe AI will help genetic research create hardier agriculture effectively immune to disease or whatever, but in any case that hasn't happened yet; my main point was that it's creating imbalances today that need to be rectified.

SCheeseman fucked around with this message at 16:44 on Mar 24, 2023

SCheeseman
Apr 23, 2003

Liquid Communism posted:

Anyone who understands the technology behind it on more than a surface level is generally right there with those authors and artists, because they understand that none of this AI functions without massive datasets that invariably violate the rights of authors and artists, because no organization could afford to license works on that scale. Even at massively discounted rates for commercial licensing, the image DBs behind them would cost billions of dollars.

Adobe Firefly is based on public domain and licensed stock photography and functions fine.

Not to say there's no exploitation happening, but it's the same exploitation that's existed for as long as capitalism has.

SCheeseman
Apr 23, 2003

The argument that the threat of AI art comes from mass copyright infringement is a red herring, with the inevitable result of advocating for copyright enforcement being that already-entrenched IP holders build an even tighter legal and technological grip on the creation and distribution of artistic works.

The idea that the technology could be stopped using copyright is delusional. Maybe a long time ago copyright had the primary purpose of protecting artists' rights, but that hasn't been the case for over a century. What will happen instead is that the most powerful variations of the tech will end up in the hands of those most willing to exploit it, and copyright will be used as a cudgel to destroy anything attempting to compete with those holding that technology. Artists aren't going to be able to make a living on residuals from feeding training data into a machine.

Disney wants the same outcome as you do, that Stable Diffusion lawsuit is backed by lawyers affiliated with them. Stop playing their game.

SCheeseman
Apr 23, 2003

You're continuing to miss the point that the biggest problems generative AI creates won't actually be solved by obstructing only a subset of the businesses wanting to develop and exploit this stuff. I'm all for legislation that actually helps people who had their lives ripped apart by automation (the actual problem that capitalism has been causing for a long time!), but all you're advocating for is maintaining an already hosed status quo.

SCheeseman
Apr 23, 2003

Char posted:

Probably, not letting anything model-generated be eligible for copyright would be a good start, even if pretty much impossible technically; even then, there would be exactly zero money to be made in the art business, therefore society would need to find a way to incentivize the production of arts, or be condemned to being fed the iterative regurgitation of previously created art.

Generative AI output being treated as uncopyrightable would make raw output ineligible, but not derivative works based on it, not dissimilar to anything based on public domain works. Changing this so derivative works are not granted copyright would be practically unenforceable, given generators can already run on consumer hardware. There would be no way to tell.

The comparison between AI and humans is more a thought experiment than anything that should be enshrined in law, at least not yet.

SCheeseman fucked around with this message at 20:30 on Apr 4, 2023

SCheeseman
Apr 23, 2003

Main Paineframe posted:

I'm not talking about obstructing business at all. I'm talking about big business having to pay for the stuff they use in their for-profit products, just like everyone else.

If that makes generative AI uneconomical, then so be it. But when I say AI companies should respect the rights of media creators and owners, I'm not saying that as some secret backdoor strategy to kill generative AI. I'm just saying that giving AI companies an exception to the rules everyone else has to follow is bullshit. Business shouldn't get an exception from rules and regulations simply because those regulations are inconvenient to their business model. And that's doubly true when it's big business complaining that the rules are getting in the way of their attempts to gently caress over the little guy.

Impede, obstruct, whatever. In any case what you want isn't going to make AI art generators uneconomical; it'll make the 'legal' ones economical only for the entrenched IP hoarders. What you want changes nothing about how people will actually be exploited and may even serve to make it worse!

SCheeseman fucked around with this message at 00:03 on Apr 5, 2023

SCheeseman
Apr 23, 2003

StratGoatCom posted:

And don't bring 'Fair use' into this, it's an Anglo weirdness, not found elsewhere.

Weirdness or not, it exists, and everyone in the US sphere of influence (unfortunately) marches to the drum of US copyright law. It's going to be a factor, and trying to discredit discussion of it is disingenuous.

SCheeseman
Apr 23, 2003

Reverse engineering is a fair use argument too. The computing industry is pretty hosed, but it would be considerably more hosed without reverse engineering being a protected technique.

It's kind of incredible to see someone actually against it.

SCheeseman
Apr 23, 2003

Going the copyright route is less stemming the flow and more like trying to cover one of the outflows in a Y valve with a finger, you're just redirecting it to Adobe, Disney et al. How nice for them, I'm sure they'll be responsible with their monopoly.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

As said before, uncopyrightable content is useless to those folks. There will always be dipshit Russian internet lawbreakers, that's not an argument against laws.

If you're talking about that recent case where AI-generated works were ruled uncopyrightable, that doesn't mean you can't use what was generated in a greater work and have that work be copyrightable. Same rules as public domain works. If generated works were instead to inherit the copyright of everything in the dataset, that would still make fully licensed datasets legal, which has the knock-on effect of entrenching current corporate IP monopolies while not doing much to prevent layoffs from automation.

If you want AI works to be uncopyrightable and any works based on them uncopyrightable too, I don't think there is any precedent for that. It's also dangerous, given that identifying whether a work is derived from AI-generated output would be difficult and nebulous, particularly if it was modified, and copyright being copyright, "it was AI generated!" will become another legal tool wielded by IP holders. Forced watermarking? Watermarks are probably also going to be on the chopping block soon; they're designed to be hard for human eyes to see, and AI can almost certainly be trained to identify, recognize and remove them.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

The issue is that you can at least potentially pull that poo poo out of the human amended work, and be untouchable. It's a radioactive mess, vis a vis chain of title, and btw? It's not unprecedented, it is literally how it is for nonhuman done work already.

Sure, if everyone distributed everything in PSD format with unmerged layers. Even a relatively small change is enough to turn it into a derivative work. You're way overblowing the consequences of using uncopyrightable works in a greater work, something Disney has made billions of dollars doing.

A work created by a human that includes non-human works doesn't have its copyright invalidated by the non-human work.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

By doing a lot of visual asset work and their own writing. If you're gonna process it fairly heavily, to the point where you're sure there's nothing extractable of the original work for, say, an AI, at that point what was even the point of using the AI in the first place?

The main use case for this in business and office settings is as a production accelerant, a replacement for stock artwork, and for img2img-like applications. The stuff the generator spits out raw is rarely useful in and of itself; it's more useful as part of a production pipeline.

SCheeseman fucked around with this message at 01:36 on Apr 5, 2023

SCheeseman
Apr 23, 2003

Main Paineframe posted:

I don't know what the heck you're responding to, but it's not anything I said. At no point did I propose removing exploitation from the entire media industry as a whole. I'm not pushing some grand utopian reworking of media as we know it, nor am I talking about ways to remove all exploitation from media production.

I know. You should, though.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

Then you want these scraped models nuked flat. If they can't pay or otherwise secure permission, then they can go to hell if 'fairness' is to mean anything beyond computer touchers getting cheap toys.

That mass societal damage happens over and over again as a result of automation indicates to me that the problem sits higher up the chain and probably isn't solvable by using a framework created by a cartel of IP holders to further entrench their monopolies.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

This isn't some ip cartel, this is enforcement of long standing rights in law in basically every legal system.

There is no natural right to IP. The earliest cases of IP enforcement hundreds of years ago mostly served the aristocracy, and the past century has seen massive changes in how and what rights are granted, when they're taken away, and how the many edge cases are dealt with. There is absolutely a cartel, with all the familiar names lobbying the US government to amend copyright law to be more favorable to their businesses, and with the US using soft power via trade agreements to push the cartel's vision of copyright across the globe.

SCheeseman
Apr 23, 2003

Those are legal rights.

SCheeseman
Apr 23, 2003


You said IP had long-standing rights in law in basically every legal system, which I took to mean that it's a natural law, given your talk of long-standing, universal rights.

But in any case they aren't long-standing and certainly haven't been universal. Night of the Living Dead is famously public domain in spite of the assumption that copyright is always automatic, and, more relevantly, we even impose limits based on the minimum amount of human effort and intent required for something to qualify.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

So? Those rights are automatic for actual work, as opposed to machine output, unless otherwise specified.

They were violated quite openly on the assumption that artists could not exercise them.

Except in cases where they aren't, given the jurisdiction and timeframe we're talking about. Right now the jurisdiction is the US, at a time when the government is (mostly) favorable to big tech. Given court outcomes that find a fair use argument compelling, that precedent is likely to spread beyond the US' borders. You'll probably find it harder to defer to copyright law as if it's an arbiter of ethics once that has happened.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

Given the commercial nature of these models and that they create similar outputs WITHOUT permission, no I do not think fair use harbor applies.

Google Books takes books scanned without permission and makes chunks of their contents available for free, also without permission, while selling ads. You might think fair use shouldn't apply, but that's an opinion, one that wasn't shared by the court.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

There is a difference between this and creating unauthorized derivative works. And fair use is an Anglicism; it does not apply, for example, in Europe, where there is far less transfer of IP in for-pay work.

Yet Google Books is accessible in Europe, in spite of legal challenges, which were fought off using a fair use argument.

Google claimed Google Books was creating transformative works, which by their nature are derivative; it's probably the route the AI generator companies are going to take in court as well.

SCheeseman fucked around with this message at 02:40 on Apr 5, 2023

SCheeseman
Apr 23, 2003

Those were just refutations to arguments no one here has made.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

No, they are rather complete refutations to your nonsense about transformative work here; the book indexing is tolerable because it among other things will drive business to the holder. This is plainly an attack on copyright holders with little individual recourse.

lmao, no they're not? It's griping about how terrible all the players are, though the singling out of the AI startups makes me think they should widen their scope a bit. I agree that they are all terrible.

The book indexing wasn't merely tolerated; Google won, and set precedent that you can take a bunch of copyrighted works without permission, run them through a machine, have it spit out bits of them piecemeal without paying the owners a dime, and call it a transformative work under fair use when challenged in court. Driving business to the holder is a big ol' maybe; sometimes you just need to pull the quote, and then suddenly it's competing with the original work. It's all commercial too, and it's used outside the US' legal jurisdiction.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

I would refer you to Main Paineframe's post above.

It's a pretty good summary, notable in that it makes this out to be not so clear cut, while you've been pretty firm that there is no legal basis at all.

SCheeseman
Apr 23, 2003

StratGoatCom posted:

It would seem to be a whole lot more clear cut than that Google thing; it seems to me to be a classic unauthorized derivative work and a blatant violation of existing laws without the compensating factors of what Google did.

Output not being direct copies of the original works is a compensating factor, one not shared by Google Books, which copied verbatim.

Ugh, this law stuff is boring. AI generators are going to be a hydra of problems regardless of whether the data thrown into them is licensed properly or not.

SCheeseman
Apr 23, 2003

KwegiboHB posted:

I refuse to link directly to the gofundme, I don't want them funded. You can easily find that with a simple web search if you want to.
I'm saying, scroll down on this page https://copyrightalliance.org/about/who-we-represent/

No one in that group should be begging regular people for money for lobbyists. It's a classic bait-and-switch. "Hooray everyone we did get the laws changed! In exactly the ways that help us and not you! By the way, you can't draw even fanart of our stuff anymore. We're off to play with our new AI models that you can't use now! Oh, and firing another 7,000 people while we're at it."

Adobe is there! Huh!

SCheeseman
Apr 23, 2003

KwegiboHB posted:

[size=1]*The positions taken by the Copyright Alliance may not reflect the views of Copyright Alliance Associate Members.[/size]

Oh, but they do anyway; Adobe has a lot to gain from producing the first generative AI that uses a licensed dataset in an environment where all its competitors are considered infringing.

SCheeseman
Apr 23, 2003

There's this thing that keeps happening in these discussions: a logical leap made when someone points out similarities in behavior between AI models and human brains, with someone then explaining that actually it's just doing pattern matching, nothing like actual thought.

Both are true and aren't. Generative AIs aren't close to the complexity of a human brain, but a brain isn't a single "thing"; it's an interconnected complex of specialized regions, demonstrated succinctly by those suffering from localized brain damage. If you were to compare the current AI stuff to a person, it might be a small part of one of those regions, perhaps something associated with our visual cortex. That's still a vague oversimplification, even a bit of anthropomorphization, but I don't think it's that unreasonable to make the comparison when so much of human understanding comes through metaphor.

SCheeseman
Apr 23, 2003

Yinlock posted:

ChatGPT is at most the tiniest, wispiest hair of progress towards AI and is not even remotely the actual thing, but it's funny to watch techbros act like they just talked with HAL 9000 because a chatbot said hello to them.

You're judging it based on what a bunch of dumbass venture capitalists are deluded into thinking it is.

ChatGPT has completely obliterated the Turing test; it can trick people consistently enough to be dangerous, even people who should by all rights know better. That longstanding threshold, once considered a hallmark of true progress, got crossed. That's not tiny progress, it's a leap.

SCheeseman
Apr 23, 2003

RPATDO_LAMD posted:

In addition to the other stuff already mentioned where the plaintiffs allege that the models effectively contain compressed copies of the training data, there's another issue too:
there actually was copying happening here. The LAION-5B dataset was gathered by a technically-distinct nonprofit by scraping about 5 billion images from the web, many of which were copyrighted. And then Stability AI copied all those images over to their servers for the purposes of actually training the Stable Diffusion model.

Having a third party entity download copyrighted images from Artstation or Getty or wherever and then pass them across to you is technically copyright violation all on its own.

The route taken in court will probably be a fair use argument, using Google Books as a precedent. Many of the same copyright violations were practiced in that framework, re: the copying of works without permission and the modification and redistribution of them. It's not a 1:1 match: Google Books assigns attribution, and AI generators don't provide direct copies of the data sources they draw from (most of the time), but I think it underlines that the mechanics of how they used the works might not matter that much. The argument they'll make isn't that they didn't copy or use the works, but that the use is transformative enough to be allowed regardless.

SCheeseman fucked around with this message at 02:29 on Apr 10, 2023

SCheeseman
Apr 23, 2003

This talk about how current generative AI can't create completely new styles or ideas, which is sort of true, forgets that most of the time human artists aren't doing that either. It's specifically the technology's effect on the workplace, and on industries where rehashing content is already the norm, that's the potential society shaker. There is little use for creativity when writing script outlines for banal reality TV shows, which poses a problem for writers employed in those kinds of positions; the same goes for other corporate and assembly line-style visual art workflows. Not to say that this tech can't be used in a creative way, but doing so usually requires human effort, using it more as a tool added to a greater workflow rather than as an end-to-end replacement.

So IMO it's just the loom problem again, mechanization taking the place of menial work. There are negative effects that come from this, but they're symptoms of a larger problem: a societal framework that doesn't give a gently caress about the lives of those it makes redundant through automation. You don't fix that by figuratively smashing looms; the cause is far higher up the chain.

SCheeseman
Apr 23, 2003

Hashy posted:

I don't give a poo poo if NN AI is technically mimicking the firing of neurons in a human brain, or that every aspect of the human experience is similar to an AI prediction modeller. I don't care about a robot's experience or the art they make, in the same way I would not go on a forum filled with robots (on purpose). Generative AI is pollution on culture

Human culture is built on automation; it's a foundation of society. There are plenty of ways to use generative AI that aren't exploitative, just like almost any other technology. Generative AI is just more of the same; its potential use to create "pollution" is a choice made by people in positions of power using the technology to generate spam and cause mass redundancies, something they're incentivized to do by the value systems that capitalism represents.

The technology exists, it's impossible to put back in the box and it's a waste of time being all grumpy about it.

SCheeseman fucked around with this message at 04:05 on May 11, 2023

SCheeseman
Apr 23, 2003

Hashy posted:

I'm actually suggesting that it's visual and cognitive pollution that anything a computer hallucinated is put in our line of sight at all, so yes, I'm grumpy about it. Much like I'm grumpy about billboards being everywhere, and became grumpier still when those billboards became attention-grabbing video billboards. At least with advertising there is a human story behind its production and the people featured in it, and the substance of the copy can be examined for human experience and intent. With generative imagery and text and design, it's just an absolute waste of our limited existence to look at it ever, like pointing your face at TV static.

It's like the Samsung phone camera zoom technology that just hallucinates the moon for you. Why am I looking at that? And why are you asking me to look at the statistically derived image given the input tokens "sexy cyberpunk lady artstation clean 4k ultra hd"? Art and design and writing got more accessible and free than ever for a while there, and now everything you look at forces you to assess whether a computer came up with it based on essentially no human input.

It's not about its ability to generate massive amounts of nonsense effortlessly (though that's bad for the political landscape) as much as its ability to replace artists and the human touch on the stuff we are asked to look at every day

It would go away if we all saw it for what it is

People have been calling billboards, advertising and all kinds of media soulless white noise for decades without the help of AI. There's not much story behind a "HOT gently caress NOW" animated gif porn ad; it was born from an assembly line. It's a reasonable position to say AI generators are another tool that can be used to make the situation worse, but so was digital art, so was the internet. You're drawing on things that already exist for your metaphors because society (more specifically, capitalism) has already turned art into a commodity.

That Samsung phone thing isn't even generative AI, it's just image recognition that swaps in a jpeg, a silly magic trick.

AI generators don't make anything without a human feeding something into them, though I'll agree that that raw output is rarely thought provoking. But img2img, infill and particularly controlnet allow for far greater compositional control, at which point the argument that anything made with it has no intent behind it becomes less convincing.

SCheeseman fucked around with this message at 07:29 on May 11, 2023

SCheeseman
Apr 23, 2003

Hashy posted:

This supposes that an ad featuring a model that doesn't exist, that was never really photographed, and that is turned algorithmically into a trending design template by an AI isn't dramatically worse to look at than one modeled, photographed, manipulated and designed by a human, and it definitely is. And we're talking about all manner of commercial art - book covers, posters, video game art, effects and matte paintings in film/TV. Commercial art can be cool - it represents skill, mastery, passion, artistic insight and soulful interpretation. The people that make it trained their entire lives to make stuff like it and often continue to do it despite being poorly remunerated, just so they can continue to work on cooler and cooler stuff.

The reason billboards in general are frustrating is as you say, they add a relative amount of white noise to the environment that make it harder to find and take in what's worth appreciating. Likewise, it would be harder to enjoy a museum of classic paintings if half of the art was AI or otherwise fake and you didn't know which. Assuming you only get to see a certain amount of stuff in your lifetime - and that any cross section of culture is going to have more AI poo poo and less real poo poo - browsing deviantart, or the pages of a magazine, or whatever, it doesn't really matter, you're about to have your experience polluted.

And the moon thing is AI. The details being filled in to that extent means it's generating a new moon wholesale in its imagination based on what high-quality photos of the moon look like, and it's no less existentially horrifying to be surrounded by that than if your phone replaced the things you photographed with jpegs of those things.

It's a black box so we can only go on Samsung's word, but according to this article they use images of the moon as a direct reference rather than generating detail purely through noise+inference via an AI model. Not quite overlaying a jpeg, but more that than a hallucination.

That recent Aisis The Lost Tapes album highlighted to me how much people overestimate their ability to differentiate between human art and generated art, with social media posts and comments lamenting how the songs are soulless apparitions lacking the human touch, seemingly ignorant that the whole album was written by the band Breezer two years ago, entirely separate from any AI stuff. The only generated parts of the songs were the vocal tracks, and even those were modifications of the original human performances. It's glorified autotune, yet there are AI skeptics who allowed themselves to be deluded into thinking it was entirely generated. Their own bias blinded them into seeing nothing where something was actually there.

If AI-generated art can truly be a full end-to-end replacement (which I doubt, at least not until things go full AGI), to the point where you go to a gallery with human and AI art and are unable to tell the difference, that only uncovers a possibly uncomfortable truth: art can be about subjective impression just as much as the expression of the author. Not that this is news; humans have been putting meaning into random noise since the beginning, spotting clouds that look like dicks, seeing Jesus on toast. By viewing noise, you can bring meaning to it.

SCheeseman fucked around with this message at 10:18 on May 11, 2023

SCheeseman
Apr 23, 2003

They might be able to legislate that, and it could have some effect on commercial use, but high-quality models are already out in the wild running on consumer hardware. Software is extremely hard to contain, and while it's currently expensive to train these models, there's evidence that quality doesn't necessarily correlate directly with the size of the dataset. Current models may be overshooting the amount of training effort required.

SCheeseman
Apr 23, 2003

The argument will probably be a fair use one, pointing to Google Books being ruled a transformative use in spite of the entire business model of that service relying on creating unauthorized scans of copyrighted books and serving unedited extracts for free, supported by advertising. Google attributes works to their authors, but that didn't stop a lawsuit from rights holders, and the fact that AI generators don't spit out explicit copies is arguably as much an extenuating circumstance.

SCheeseman
Apr 23, 2003

They are capable of it, but it's not "normal" behavior and requires the model to overtrain on an image.

Google Books spits out copyrighted material too, based on unauthorized scans that form the backend of a search engine made to generate a profit, and how did it go over? Not well with publishers, but Google won regardless. I'm not saying it's a certainty that will happen again, it could go either way, just that you're being too dismissive of an outcome that has precedent in its favor.

The DVD rip comparison doesn't really hold up, as that describes copying and reproducing works verbatim, which isn't why anyone uses AI generators; in the instances where one does produce output close to a copy, that's more a bug than something intrinsic to the system's design.

SCheeseman fucked around with this message at 09:35 on May 13, 2023

SCheeseman
Apr 23, 2003

The quirky thing about fair use is that if you successfully argue your case, infringement and licensing don't matter. Google Books infringed on the copyright of the rightsholders of every book they scanned, but the transformative use and other extenuating factors, like only reproducing parts of works (noting that this is still technically infringement), assigning attribution, and providing opportunities for publishers to monetize, gave them a pass.

Pointing out that AI models are based on copyrighted materials isn't, on its own, enough for rights holders to win.
