SubG
Aug 19, 2004

It's a hard world for little things.

StratGoatCom posted:

That they're mixed in doing free labor and sanitization of a not-kosher use of IP, much like LAION.
Explain what you think that means. "Doing free labour"? Like, as opposed to the purpose of other open source projects, which is...??? "Sanitization"? How? What does that mean? Why should anyone care about what's being "sanitized"? How is EleutherAI doing this "sanitization"? Like, specifically. Like point to something they have done.

And, again, what's your evidence?


StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SubG posted:

Explain what you think that means. "Doing free labour"? Like, as opposed to the purpose of other open source projects, which is...??? "Sanitization"? How? What does that mean? Why should anyone care about what's being "sanitized"? How is EleutherAI doing this "sanitization"? Like, specifically. Like point to something they have done.

And, again, what's your evidence?

LAION is being used as a dodge for the research exemptions in Berne and the European research code; a rather lazy one. Sanitization is both for legal and reputational affairs.

SubG
Aug 19, 2004

It's a hard world for little things.

StratGoatCom posted:

LAION is being used as a dodge for the research exemptions in Berne and the European research code; a rather lazy one. Sanitization is both for legal and reputational affairs.
Can you phrase this as a complete argument? Are you saying that LAION and EleutherAI are secretly the same organisation?

Again: what, exactly, are you alleging about EleutherAI? Point to the thing that they did and what you think it means.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SubG posted:

Can you phrase this as a complete argument? Are you saying that LAION and EleutherAI are secretly the same organisation?

Again: what, exactly, are you alleging about EleutherAI? Point to the thing that they did and what you think it means.

No, but they're being used in similar fashion by their funders. EleutherAI is mixed in with the various for-profit outfits by being funded to do their work for them, in the classic obscurantist way that characterizes this chicanery.

BrainDance
May 8, 2007

Disco all night long!

You've got something about their research institute, which was founded this year and hasn't really done much of anything yet, and you're using it to make claims about an organization that's been around for 3 years making models on their own.

They're not "being used" to do anything yet.

I'm sure OpenAI does get something out of them though, because that's how open source software kinda works. Microsoft has their own Linux distro.

BrainDance fucked around with this message at 04:46 on Jun 2, 2023

Bar Ran Dun
Jan 22, 2006




Looks like the EU is gonna regulate with the AI Act sooner rather than later.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.
OpenAI literally started as a non-profit for ethical AI research and that was always bullshit. I can't imagine they're the only one.

SubG
Aug 19, 2004

It's a hard world for little things.

StratGoatCom posted:

No, but they're being used in similar fashion by their funders. EleutherAI is mixed in with the various for-profit outfits by being funded to do their work for them, in the classic obscurantist way that characterizes this chicanery.
How are they being "used" that way? "Similar" how? What "chicanery"? Seriously: what, specifically, are you alleging here?

Like...say some hardware manufacturer releases open source drivers for their stuff. Okay, clearly they're not doing this out of pure philanthropy or whatever; they're hoping that by releasing open source drivers they'll sell more of their stuff. Is that the sort of thing you're alleging?

Are you alleging that EleutherAI is secretly creating proprietary stuff that's then being covertly given back to their donors instead of being released publicly? So being open source is just some sort of elaborate cover story?

Something else? What?

I get that you think there's something shady going on, but I don't understand why you're being so coy about actually saying what it is.

SCheeseman
Apr 23, 2003

It's "seize the means of production" because capitalists are the ones who've been making the instruments of labor. This is no different.

That OpenAI have ulterior motives doesn't really change that; those motives (exploit people for profit) are a core tenet of capitalist society, and every modern automated tool was created in that framework. A smart commie wouldn't seize tractors created by capitalists to destroy them; they'd use them for themselves.

SCheeseman fucked around with this message at 09:39 on Jun 2, 2023

Tei
Feb 19, 2011

SubG posted:

How are they being "used" that way? "Similar" how? What "chicanery"? Seriously: what, specifically, are you alleging here?

I think it's clear what he is saying.

https://www.youtube.com/watch?v=JyxSm91eun4

There are laws like copyright. These laws bind us, but there are exceptions. Having these exceptions makes sense, and we all like these exceptions existing.

Then we have this group of people, with no morals. They are using and abusing the loophole for profit, either creating, financing or just using non-profits to dig a hole in copyright laws and ignore those laws.

Where they are using an existing non-profit, that is somewhat okay. Still a burden we may have to think about.
Where they are creating these non-profits with the purpose of skipping copyright laws, that is a fraud on society and a massive copyright violation in the spirit of the law, if not in the text itself. That's how legal loopholes work.

Reverend Harry Powell is the AI companies, singing sweet tones to lull the neoliberals to sleep with get-rich-quick dreams so they can rip and tear with pleasure.

Charlz Guybon
Nov 16, 2010
I post about literal military AI rebellion and the thread just keeps on nattering on about ChatGPT.

Disgusting

Charlz Guybon fucked around with this message at 10:02 on Jun 2, 2023

SCheeseman
Apr 23, 2003

The military-industrial complex is bad; everything they make is a machine of death, and no laws will prevent them from developing that stuff. There isn't much else to say.

Well, I suppose it reminds me of early experiments with AI for playing video games, where poorly chosen reinforcement goals caused strange behaviors, like finding ways to indefinitely postpone playing the game in order to prevent the high score counter from being reset to zero.
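
To make that failure mode concrete, here's a minimal, hypothetical sketch (toy two-action environment, made-up probabilities and rewards, tabular Q-learning); it isn't a reconstruction of any real experiment, just an illustration of why "stall forever" can beat "actually play" under a naive score-based reward:

code:

# Toy illustration: a reward of +1 per step while the score counter is
# "alive" lets an agent discover that pausing forever beats playing,
# because playing risks a game over that zeroes everything out.
# Environment, probabilities and rewards are all invented for the sketch.
import random

ACTIONS = ["play", "pause"]

def step(state, action):
    """State is 'running' or 'game_over'. Returns (next_state, reward)."""
    if action == "pause":
        return "running", 1.0       # score counter stays alive, no risk
    if random.random() < 0.2:       # playing carries a chance of game over
        return "game_over", 0.0
    return "running", 1.0

def train(episodes=2000, horizon=50, alpha=0.1, gamma=0.95, eps=0.1):
    q = {("running", a): 0.0 for a in ACTIONS}
    for _ in range(episodes):
        state = "running"
        for _ in range(horizon):
            if state == "game_over":
                break
            if random.random() < eps:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[(state, x)])
            nxt, r = step(state, a)
            future = 0.0 if nxt == "game_over" else max(q[(nxt, x)] for x in ACTIONS)
            q[(state, a)] += alpha * (r + gamma * future - q[(state, a)])
            state = nxt
    return q

if __name__ == "__main__":
    print(train())  # Q("running", "pause") ends up highest: the agent learns to stall

The agent never "decides" anything sinister; the mis-specified reward simply makes stalling the optimal policy.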

Gentleman Baller
Oct 13, 2013

Charlz Guybon posted:

I post about literal military AI rebellion and the thread just keeps on nattering on about ChatGPT.

Disgusting

It's funny, but I'm sure everyone doing AI in the military knows about incentive alignment. If it's not a fake story, it was probably some really basic early test to see if the AI could identify targets properly, or if it could ask for confirmation. Something like that.

BrainDance
May 8, 2007

Disco all night long!

Tei posted:


Where they are using a existing non profit is somewhat okay. Still a burden we may have to think about.
Where they are creating these non profit with the purpose to skip copyright laws, is a fraud on society and a massive copyright violation in the spirit of the law, if not in the text itself. Has legal loopholes work.

Yeah, but he's just making this up. EleutherAI isn't doing any of that poo poo, except in the way that open source software contributes back to everyone.

OpenAI isn't out there going "thank God we secretly had EleutherAI make The Pile, a dataset that does us no good! And all it cost us was serious development in the open source AI world that we're currently trying to restrict."

SCheeseman posted:

There isn't much else to say.

Made me think of something, I hope we get a sequel to Wargames out of this except in real life. As long as it ends like Wargames too, I guess. And the AI teaches us all a lesson about the futility of war.

BrainDance fucked around with this message at 09:40 on Jun 2, 2023

KillHour
Oct 28, 2007



The surprising thing isn't that this happened. The surprising thing is that it's implied that it was somehow unexpected. We know this kind of thing is going to happen because we already have tons of examples of it. There is a fairly famous paper called Concrete Problems in AI Safety that goes over this in detail, but here are just a few:


Here's a bunch more:
https://gwern.net/tank#alternative-examples
https://www.alexirpan.com/2018/02/14/rl-hard.html
https://arxiv.org/abs/1803.03453
https://docs.google.com/spreadsheets/d/e/2PACX-1vRPiprOaC3HsCf5Tuum8bRfzYUiKLRqJmbOoC-32JorNdfyTiRRsR7Ea5eWtvsWzuxo8bjOxCG84dAg/pubhtml

Anyways, this is such a common thing that $274k worth of prizes were awarded by a single group last year just for ideas that might be useful in solving a narrow subset of the issue (how do you know if an AI is telling you everything it knows, or if it is lying to you; basically, the thing we now call hallucinations).

The idea of giving AIs any kind of physical autonomy without provably solving alignment is completely loving insane.

Edit: That said, this smells like clickbait to me. For one thing, in a simulation, why would you model the operator? Why would you model the infrastructure necessary to communicate to the system? All of that takes work and the only reason you would do it is if you were expecting something like this to happen.

Double Edit: All of the people in the replies to that Tweet saying "Well now they can take that into account when they build the real thing" are loving delusional. The simulation may allow for the possibility of things that humans can anticipate being a factor in the AI's decision making, but can't possibly include things nobody thought of. The best case scenario of putting this thing in the real world is that the AI overfits to the training environment in a way that renders it inoperable. The much more likely case is that it overfits to the training environment in a way that causes it to do poo poo we Do Not Want(TM). Those people are the real tech bro hucksters you need to watch out for.
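
As a loosely analogous sketch of that overfitting point (a made-up gridworld, not anything from the article or the Air Force scenario): an agent that only ever sees one fixed simulator layout can do perfectly well by memorizing a single action sequence, and the memorized "policy" becomes inoperable the moment the real world differs from the sim in a way nobody modelled.

code:

# Hypothetical illustration of overfitting to the training environment.
# Layout, start positions and the "policy" are all invented for the sketch.
GOAL = (3, 3)

def run(actions, start, walls):
    """Tiny gridworld: follow the action list; fail on hitting a wall."""
    moves = {"U": (0, 1), "D": (0, -1), "L": (-1, 0), "R": (1, 0)}
    x, y = start
    for a in actions:
        dx, dy = moves[a]
        x, y = x + dx, y + dy
        if (x, y) in walls:
            return "crashed"
    return "reached goal" if (x, y) == GOAL else "lost"

# "Training": the sim always starts at (0, 0) with no obstacles, so a
# memorized six-move sequence scores perfectly every time.
memorized_policy = ["R", "R", "R", "U", "U", "U"]
print(run(memorized_policy, start=(0, 0), walls=set()))      # reached goal

# "Deployment": the real world starts one cell over and has an obstacle
# the sim never modelled. The memorized policy walks straight into it.
print(run(memorized_policy, start=(1, 0), walls={(3, 0)}))   # crashed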

Triple Edit: To head off the argument that the solution to overfitting is to extend the training to outside of the simulation, you very, very, very much do not want the flying murderbot AI to have the opportunity to learn arbitrary things about the real world. Remember the Microsoft chat bot that became horrifically toxic because they thought it was a good idea to keep training after deployment?

KillHour fucked around with this message at 11:04 on Jun 2, 2023

Tei
Feb 19, 2011

Humans do it all the time. Politicians have to carefully configure the reward, because the moment they offer monetary rewards for stuff, people abuse them for gains not intended by the politicians.

It's a problem with the idea of enforcing a behavior with rewards.

Tei fucked around with this message at 13:12 on Jun 2, 2023

Gentleman Baller
Oct 13, 2013

It's not too important to the discussion, but the tweet about the army AI drone targeting its own operator was deleted because the Col. quoted has clarified that it was "just a thought experiment" and not an actual simulation.

https://twitter.com/ArmandDoma/status/1664600937564893185

KillHour
Oct 28, 2007


You mean the military wrote a report on theoretical risks given a scenario - something they do constantly for everything you could imagine - and it was reported as something that actually happened? I'm shocked. This is my shocked face.

Edit: I'm glad my bullshit detector at least worked well enough to see the obvious holes in the story.

Tei posted:

Humans do it all the time. Politicians have to carefully configure the reward, because the moment they offer monetary rewards for stuff, people abuse them for gains not intended by the politicians.

It's a problem with the idea of enforcing a behavior with rewards.

Yeah, it's a fundamental thing any time the measure you use to gauge success isn't the actual thing you care about. See also: capitalism and using the capacity to acquire wealth as a proxy for total economic contribution.

It turns out that algorithms for finding local minima are really good at abusing those situations.
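
A hypothetical sketch of that proxy-metric failure (functions and numbers invented for illustration, nothing from any real system): a greedy local optimizer that only ever sees the proxy score happily walks into the region where the proxy and the thing you actually care about come apart.

code:

import math

def true_objective(x):
    # what we actually care about: peaks near x = 1 and falls off after
    return math.exp(-(x - 1.0) ** 2)

def proxy_metric(x):
    # a measurable stand-in: tracks the true objective for small x,
    # but keeps rewarding "more" without bound
    return 0.5 * x

def hill_climb(score_fn, x=0.0, step=0.1, iters=200):
    # greedy local search: the kind of optimizer that is very good at
    # abusing a mis-specified score
    for _ in range(iters):
        x = max((x - step, x, x + step), key=score_fn)
    return x

x_opt = hill_climb(proxy_metric)
print(f"proxy-optimal x: {x_opt:.1f}")                  # keeps climbing
print(f"proxy score:     {proxy_metric(x_opt):.2f}")
print(f"true objective:  {true_objective(x_opt):.6f}")  # ~0 once they diverge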

KillHour fucked around with this message at 13:46 on Jun 2, 2023

GlyphGryph
Jun 23, 2013

Down came the glitches and burned us in ditches and we slept after eating our dead.

Charlz Guybon posted:

I post about literal military AI rebellion and the thread just keeps on nattering on about ChatGPT.

Disgusting

Because it's not surprising that it would have happened in broad strokes; it's normal AI behaviour
Because the bit about how and why it happened is obviously bullshit by someone who doesn't understand how this stuff works
It's very much not a "literal military AI rebellion" even if it happened as described
And, of course, it's transparently just completely made-up bullshit

People wanting to talk about actual stuff that actually matters is completely understandable.

Liquid Communism
Mar 9, 2004

коммунизм хранится в яичках

KillHour posted:

You mean the military wrote a report on theoretical risks given a scenario - something they do constantly for everything you could imagine - and it was reported as something that actually happened? I'm shocked. This is my shocked face.

Edit: I'm glad my bullshit detector at least worked well enough to see the obvious holes in the story.

Yeah, it's a fundamental thing any time the measure you use to gauge success isn't the actual thing you care about. See also: capitalism and using the capacity to acquire wealth as a proxy for total economic contribution.

It turns out that algorithms for finding local minima are really good at abusing those situations.

It's also the plot of Terminator, where Skynet starts a global thermonuclear war because the easiest way to stop the humans from turning it off was to provoke a mutually assured destruction nuclear counterstrike.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


Tei posted:

I think it's clear what he is saying.

https://www.youtube.com/watch?v=JyxSm91eun4

There are laws like copyright. These laws bind us, but there are exceptions. Having these exceptions makes sense, and we all like these exceptions existing.

Then we have this group of people, with no morals. They are using and abusing the loophole for profit, either creating, financing or just using non-profits to dig a hole in copyright laws and ignore those laws.

Where they are using an existing non-profit, that is somewhat okay. Still a burden we may have to think about.
Where they are creating these non-profits with the purpose of skipping copyright laws, that is a fraud on society and a massive copyright violation in the spirit of the law, if not in the text itself. That's how legal loopholes work.

Reverend Harry Powell is the AI companies, singing sweet tones to lull the neoliberals to sleep with get-rich-quick dreams so they can rip and tear with pleasure.

Also the more nuts-and-bolts issue that they're abusing carve-outs in laws intended for not-for-profit research and using these orgs to obfuscate that fact.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.

Charlz Guybon posted:

I post about literal military AI rebellion and the thread just keeps on nattering on about ChatGPT.

Disgusting

That's because a lot of people thought it was a BS story and were right to not give it any time.

Though also maybe because the outcome of "MIC turns on its masters" isn't concerning.

SubG
Aug 19, 2004

It's a hard world for little things.

Tei posted:

Where they are creating these non-profits with the purpose of skipping copyright laws, that is a fraud on society and a massive copyright violation in the spirit of the law, if not in the text itself. That's how legal loopholes work.
Are you saying this describes EleutherAI? If so, then please point to what you're actually talking about. Like you're saying there's some "massive" violation going on...what specifically has EleutherAI done that you're describing this way? When you say "they" created EleutherAI for these purposes...are you saying that the accepted story about how EleutherAI came about (guys talking about how to create an open source equivalent to GPT-3 on Discord) is...a cover story? That actually some "they" (OpenAI?)...fabricated this story to cover the fact that "they" were actually creating EleutherAI to exploit these "loopholes"?

Because I thought e.g. OpenAI was supposed to already be doing those things themselves. In the open. Like I thought that was the main thrust of the argument against them: that they're just ignoring copyright to produce their models and then use them for commercial gain. I think that understanding of the situation is predicated on a misunderstanding of both the technical and legal details of the situation (as discussed at length in this thread and others)...but I understand that argument. But if it's true...then what exactly do you think their angle is in creating EleutherAI? If they're, out in the open, doing all these bad things...why would they create a non-profit to produce "clean" open source equivalents? Like are they secretly planning to sabotage the open source efforts to make their commercial products look better? Is the whole "open source" thing a fake and you think they're secretly doing proprietary work and they're engaging in this elaborate charade for some reason I can't even begin to speculate on? Like seriously, what's the hidden agenda supposed to be?

Like I understand disliking and distrusting large corporations full of techbros. But if your argument is that they're running roughshod over copyright in the pursuit of their own profits...open source projects like EleutherAI seem like exactly what you'd want. Unless you're actually just opposed to AI in and of itself. Which, you know, if that's true, fine. But then all the arguments about copyright and competition are non sequiturs. Or rather, they're a completely separate argument.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SubG posted:

Are you saying this describes EleutherAI? If so, then please point to what you're actually talking about. Like you're saying there's some "massive" violation going on...what specifically has EleutherAI done that you're describing this way? When you say "they" created EleutherAI for these purposes...are you saying that the accepted story about how EleutherAI came about (guys talking about how to create an open source equivalent to GPT-3 on Discord) is...a cover story? That actually some "they" (OpenAI?)...fabricated this story to cover the fact that "they" were actually creating EleutherAI to exploit these "loopholes"?

Because I thought e.g. OpenAI was supposed to already be doing those things themselves. In the open. Like I thought that was the main thrust of the argument against them: that they're just ignoring copyright to produce their models and then use them for commercial gain. I think that understanding of the situation is predicated on a misunderstanding of both the technical and legal details of the situation (as discussed at length in this thread and others)...but I understand that argument. But if it's true...then what exactly do you think their angle is in creating EleutherAI? If they're, out in the open, doing all these bad things...why would they create a non-profit to produce "clean" open source equivalents? Like are they secretly planning to sabotage the open source efforts to make their commercial products look better? Is the whole "open source" thing a fake and you think they're secretly doing proprietary work and they're engaging in this elaborate charade for some reason I can't even begin to speculate on? Like seriously, what's the hidden agenda supposed to be?

Like I understand disliking and distrusting large corporations full of techbros. But if your argument is that they're running roughshod over copyright in the pursuit of their own profits...open source projects like EleutherAI seem like exactly what you'd want. Unless you're actually just opposed to AI in and of itself. Which, you know, if that's true, fine. But then all the arguments about copyright and competition are non sequiturs. Or rather, they're a completely separate argument.

They can do both, and indeed it serves their purposes to do both. And stop falling for the open source dodge; much of that scene was essentially subverted by the companies a long time ago as a way to get free coding.

Gentleman Baller
Oct 13, 2013

StratGoatCom posted:

And stop falling for the open source dodge; much of that scene was essentially subverted by the companies a long time ago as a way to get free coding.

What do you even mean by this? Open source projects are open source. You can download and play with EleutherAI's projects right now if you want. Explain the subversion for free coding.

Gentleman Baller fucked around with this message at 03:50 on Jun 3, 2023

BrainDance
May 8, 2007

Disco all night long!

Over the last 3 years, EleutherAI has done stuff that's useful to, like, people who want to use AI themselves, but they've done absolutely nothing yet that would actually benefit OpenAI: a bunch of models OpenAI has their own better versions of, and a dataset that OpenAI doesn't need and that is likely inferior to their own.

EleutherAI's thing has basically been trying to accomplish what OpenAI already accomplished, but, open source.

But this is one of those situations where the conclusion's already decided, where no matter what evidence there is, there will be an excuse pulled out of the air, which, at the point we're at now, is literally just completely invented nonsense about what EleutherAI could hypothetically be in the future. Something that there's no way to prove wrong and no way to prove right. Except it just really doesn't make sense when it comes to the kind of things they've been doing. Why does OpenAI need to secretly funnel development through an open source shell to accomplish things they've already done?

So in this thread we've had a guy who just shouts out things he believes but is like "this is just a fact because I say it is," and we've got "conspiracy theory with no evidence besides 'yeah, I just bet'", which, wtf.

We've now basically argued against open source software. Just, as a whole, that's the only conclusion that can come from this. If this argument works against EleutherAI before they've even done anything like what the argument says then it applies to every other open source project ever. Yeah I'm sure that makes sense.

"Free coding", what are they coding? Lol. That tells me you're completely unfamiliar with what they've actually done. Seriously, they haven't done anything OpenAI or any other proprietary AI organization actually needed, that's not the point.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


https://notesfrombelow.org/article/open-source-is-not-enough
https://techhq.com/2022/08/open-sou...ves%20%E2%80%93

It's an ideological fetish of spec labour, of outsourcing to hobbyists these days; ironically, another generative product, Copilot, was a rugpull that made it especially obvious.

StratGoatCom fucked around with this message at 02:28 on Jun 3, 2023

SCheeseman
Apr 23, 2003

Those articles don't support what you said in your posts at all. They're still strongly in support of the concept of FOSS, but critical of how it's been hijacked and exploited by commercial interests. They're right, but they aren't proposing to throw it all in the bin. Closed source development models are still worse.

SubG
Aug 19, 2004

It's a hard world for little things.

StratGoatCom posted:

https://notesfrombelow.org/article/open-source-is-not-enough
https://techhq.com/2022/08/open-sou...ves%20%E2%80%93

It's an ideological fetish of spec labour, of outsourcing to hobbyists these days; ironically, another generative product, Copilot, was a rugpull that made it especially obvious.
Can you re-phrase this as an argument about EleutherAI?

That first article you link: among other things, it makes a distinction between being "merely" open source vs the GNU/FSF notion of the "four freedoms". This does not appear to apply here; although we've been calling EleutherAI's stuff "open source", it all embodies the four freedoms. I'd further point out that insofar as anyone in this conversation has been hostile to the traditional four freedoms, it's you: you're the one advocating limitations on how software is run, a violation of freedom 0.

The article also objects that corporate interest in open source projects often occurs late in development, basically reinforcing a project that's already succeeding instead of nurturing new ideas before they're fully developed. Again, that doesn't appear to apply here.

So...were you intending to offer that as a counterexample? Or did you just google for arguments against open source and like the title?

SCheeseman
Apr 23, 2003

FOSS cuts both ways. Enthusiasts and hobbyists privileged enough to work for free have their work exploited by commercial entities, something hard to avoid since giving away something for free with the intent to see wide distribution makes it difficult to gate off commercial entities in a social and legal environment where the line between personhood and company is blurred. But there's also a hell of a lot of people being paid by a whole host of companies to contribute code to the kernel and other projects, which I have total access to and the legal right to take and modify and use for myself without paying a dime.

There's a bunch of problems around the edges, but they're caused by capitalism! As I've been saying over and over in the thread, that's the root cause of the problem. There's no legal or organizational framework you can create in a capitalist society that won't get poisoned by it.

SCheeseman fucked around with this message at 03:36 on Jun 3, 2023

SubG
Aug 19, 2004

It's a hard world for little things.

SCheeseman posted:

FOSS cuts both ways. Enthusiasts and hobbyists privileged enough to work for free have their work exploited by commercial entities, something hard to avoid since giving away something for free with the intent to see wide distribution makes it difficult to gate off commercial entities in a social and legal environment where the line between personhood and company is blurred. But there's also a hell of a lot of people being paid by a whole host of companies to contribute code to the kernel and other projects, which I have total access to and the legal right to take and modify and use for myself without paying a dime.
Open source development also frequently has a prophylactic effect against e.g. later copyright and patent claims on the technology by corporate players. Like, if you're really a mustache-twirling capitalist tech company and want to own all the tech, the last thing you want is an open source project putting stuff into a public repo that could be used as a prior art claim against anything you're developing that you want to protect as proprietary.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


SubG posted:

Open source development also frequently has a prophylactic effect against e.g. later copyright and patent claims on the technology by corporate players. Like, if you're really a mustache-twirling capitalist tech company and want to own all the tech, the last thing you want is an open source project putting stuff into a public repo that could be used as a prior art claim against anything you're developing that you want to protect as proprietary.

The problem with that is that open source turns out to be really well suited to profitably training the milkmaids and getting the milk without paying for the cow; the difference here is that the capitalists are more sophisticated than the older closed source lot.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

https://notesfrombelow.org/article/open-source-is-not-enough
https://techhq.com/2022/08/open-sou...ves%20%E2%80%93

It's an ideological fetish of spec labour, of outsourcing to hobbyists these days; ironically, another generative product, Copilot, was a rugpull that made it especially obvious.

Did you realize your first link: https://notesfrombelow.org/article/open-source-is-not-enough is openly calling for the abolition of copyright?

quote:

There are many tactics at our disposal, and we should be clear about the changes we want to see. In the long term, this means the abolition of copyright, building technology for people over profit, and providing for everyone’s needs.

Because it seemed like from your past posts you think this would be the worst case scenario.

StratGoatCom posted:

And again, if this kind of copyright circumvention is legal by machine laundering, the IP system ceases to function entirely. It's not like distribution at all.

StratGoatCom posted:

No, we've just realized that the copyleft and other anti-copyright idiots were peddling bullshit pseudolaw that isn't worth the electrons to render;

StratGoatCom posted:

I am saying that if you let someone shovel anything and everything into a machine commercially as fair use, you may as well not have copyright, because if you can shove it into a machine like that and get away with it, there's no protection against infringement.

StratGoatCom posted:

It's fair competition for IP, not mechanization. If you allow that excuse, you don't have an IP system.

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


KwegiboHB posted:

Did you realize your first link: https://notesfrombelow.org/article/open-source-is-not-enough is openly calling for the abolition of copyright?

Because it seemed like from your past posts you think this would be the worst case scenario.

I know there's a lot of dumb folks in that scene, but that doesn't diminish the argument made, that OS is basically captured.

KwegiboHB
Feb 2, 2004

nonconformist art brut
Negative prompt: amenable, compliant, docile, law-abiding, lawful, legal, legitimate, obedient, orderly, submissive, tractable
Steps: 32, Sampler: DPM++ 2M Karras, CFG scale: 11, Seed: 520244594, Size: 512x512, Model hash: 99fd5c4b6f, Model: seekArtMEGA_mega20

StratGoatCom posted:

I know there's a lot of dumb folks in that scene, but that doesn't diminish the argument made, that OS is basically captured.

That is especially funny because I'm planning on switching to FreeBSD, soon as I get around to finishing these tutorial videos.

SCheeseman
Apr 23, 2003

We're all captured; everything is by default. Copyleft licenses work from that assumption and try to thread a needle to create some kind of balance, but as they're a reaction to capitalism in a capitalist society they can't help but be compromised by it, just like everything else is. GPL's commercial allowances are a part of this, a deliberate attempt to court capitalist organizations into contributing. Which they have: the Linux kernel is overwhelmingly maintained by people who are paid. This also means that volunteers have their work used by those entities, though since contributions make their way back it's a stretch to say that it's been taken.

I'd love for everyone who contributed to get paid. They should be. If only there were some system of governance that would allow for this.

SubG
Aug 19, 2004

It's a hard world for little things.

StratGoatCom posted:

The problem with that is that open source turns out to be really well suited to profitably training the milkmaids and getting the milk without paying for the cow; the difference here is that the capitalists are more sophisticated than the older closed source lot.
This is another case where I think your argument is undermined by your unwillingness to talk in specifics. In this case the "milk" is...what? An open source equivalent to GPT-3? OpenAI already has GPT-3 itself. Already paid for it. So they're not getting the "milk" for free. They already had the "milk". They're paying extra for some slightly different, open source "milk".

And even if you want to frame this as inherently shady...what's the apparent harm? There was already proprietary, closed source "milk" that was entirely controlled by OpenAI. Now there's open source "milk" whose origin and composition is now documented and available to everyone to use freely for whatever purpose they want. If you're worried about evil corporations controlling the entire supply of "milk", then that's precisely what you want. And if you're trying to argue that "milk" itself is an empiric ill or something like that, then all the vague doomsaying about open source stuff is just a non sequitur.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.

StratGoatCom posted:

https://notesfrombelow.org/article/open-source-is-not-enough
https://techhq.com/2022/08/open-sou...ves%20%E2%80%93

It's an ideological fetish of spec labour, of outsourcing to hobbyists these days; ironically, another generative product, Copilot, was a rugpull that made it especially obvious.

You want to summarize or give your positions on those?

Owling Howl
Jul 17, 2019
I understand that it's unpalatable that corporations profit from open source projects but it's neither here nor there. The tools are available to people who would otherwise have to pay for them which is what matters.


Gentleman Baller
Oct 13, 2013
Just checking: if open source developers genuinely don't want people to profit from their work without paying coders, they could just publish their work under a licence that restricts commercial use, right?

Gentleman Baller fucked around with this message at 11:20 on Jun 3, 2023
