PhazonLink
Jul 17, 2010
being careful/smart and engaged with cloud backup settings is also only one layer.

you should probably try using a cam app that strips out metadata, oh and then another metadata stripper.

oh and another metadata stripper on a real computer.
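
if you actually want the "real computer" stripper step, here's a minimal sketch with Python and Pillow (the library choice and filenames are just placeholders, and it re-encodes the image instead of copying it losslessly):

code:

# minimal desktop-side metadata stripper sketch; assumes Pillow is installed (pip install Pillow)
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with only its pixel data, dropping EXIF/GPS tags."""
    with Image.open(src_path) as img:
        pixels = list(img.getdata())            # copy raw pixel values only
        clean = Image.new(img.mode, img.size)   # a fresh image carries no metadata
        clean.putdata(pixels)
        clean.save(dst_path)

if __name__ == "__main__":
    strip_metadata("photo.jpg", "photo_clean.jpg")  # placeholder filenames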


Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

PhazonLink posted:

being careful/smart and engaged with cloud backup settings is also only one layer.

you should probably try using a cam app that strips out metadata, oh and then another metadata stripper.

oh and another metadata stripper on a real computer.

Instructions unclear, I am now dating a stripper named Meta, is this intended?

withoutclass
Nov 6, 2007

Resist the siren call of rhinocerosness

College Slice

Volmarias posted:

Instructions unclear, I am now dating a stripper named Meta, is this intended?

Only if you're really into Zuckerberg.

Nenonen
Oct 22, 2009

I've always got thirty bucks in my pocket
Meta is just short for "meth addict".

Neito
Feb 18, 2009

😌Finally, an avatar that describes my love of tech❤️‍💻, my love of anime💖🎎, and why I'll never see a real girl 🙆‍♀️naked😭.

withoutclass posted:

Only if you're really into Zuckerberg.

He never metadata he didn't like to use to blackmail his opponents.

The Lone Badger
Sep 24, 2007

Heck Yes! Loam! posted:

on android there is an option to save the photo to a secure local folder that is not automatically uploaded.

Until they change default settings in an update and upload them all anyway.

Nothingtoseehere
Nov 11, 2010


... on what android phone is there cloud sync by default? I'm on a Samsung, never logged into a Samsung account, and my photos are all local.

Motronic
Nov 6, 2009

Nothingtoseehere posted:

... on what android phone is there cloud sync by default? I'm on a Samsung, never logged into a Samsung account, and my photos are all local.

You've never logged into a google account on your phone? How are you using the app store?

It absolutely defaults to "sync everything!" when you add the account.

Heck Yes! Loam!
Nov 15, 2004

a rich, friable soil containing a relatively equal mixture of sand and silt and a somewhat smaller proportion of clay.
You have to install Google photos and enable cloud backup for it to happen automatically. The secure folder has never been an issue for me, even swapping phones.

PT6A
Jan 5, 2006

Public school teachers are callous dictators who won't lift a finger to stop children from peeing in my plane
The mechanisms by which one can avoid this nonsense are largely irrelevant, because we should be discussing why this is a problem in the first place.

The issue is that the algorithm is making a decision about what constitutes child sexual abuse images, and as far as we know, the criterion is: does the photo/video contain the image of something that looks like or is a naked child? But, of course, this whole situation occurred because that's an overly simplistic definition. As in this case, there are contexts where an image of a naked child (or any naked person) is not at all pornographic (consider the Pulitzer-winning photo of Phan Thi Kim Phuc, the "napalm girl"), and there are cases where children are being exploited that don't necessarily involve them being naked (there was that whole scandal a while back where pedos were trading timestamps of youtube videos of "gymnastics challenges" and poo poo; the subreddit that was collecting pictures of children that were not technically illegal, but obviously being used for sexual purposes; and that time the weirdo from Nickelodeon asked children to post feet pics on Twitter).

I bet Google and the other big tech companies don't want to talk about all that poo poo, because actually recognizing and solving that problem is a lot more work than slapping together a lovely algorithm that can't tell the difference between child sexual abuse and a picture of a rash and calling it all good. Dealing with child exploitation and child sexual abuse in a substantive way would, after all, cut into profits.

Blut
Sep 11, 2009

if someone is in the bottom 10%~ of a guillotine
^ In addition, the issue is that Google's algorithm has been shown to be completely wrong in this case - but Google are still refusing to admit fault and restore the accounts in question.

Which is almost worse in some ways; it's just malicious. At least the initial mistake could be blamed on the AI.

Vegetable
Oct 22, 2010

The case in the article is a more clear cut one. But in general I imagine it’s a nightmare for Google to try to handle appeals. Who’s to say a doctor writing letters of explanation isn’t himself part of a pedophile ring? What does a foolproof appeal process even look like? It’s a domain where truth is extremely murky and Google have clearly decided they’d rather have false positives than false negatives.

PT6A
Jan 5, 2006

Public school teachers are callous dictators who won't lift a finger to stop children from peeing in my plane
If the police and doctors and the court system are all in on the conspiracy to produce child sexual abuse images by cleverly disguising them as photos of a child with a rash, I'd say there are problems there large enough that Google cannot possibly solve them.

Now, what is a larger issue, and I'm guessing this might be why Google gives a gently caress in the first place, is: could this image be illegal if stripped of its context? If, for example, there were some kind of data breach at Google, and this picture were leaked to people who have no legitimate medical interest in the child's rash, could Google be liable for contributing to child sexual exploitation (and arguably, removing the medical context of the image, intentionally or otherwise, would be an exploitation of that child for sexual purposes, even if the parent and the doctor acted legally and in good faith at all times)?

Harold Fjord
Jan 3, 2004
I understand why we would err in that direction, but if that is Google's concern it seems like some of these laws are way over-broad.

Nenonen
Oct 22, 2009

I've always got thirty bucks in my pocket
Imagine working for Google with the specific job of inspecting photos that AI believes are cp but customers say are just innocent photos of their child's genital rash.

"JUST LOOK AT THIS IMAGE OF MY CHILD'S DISEASED GENITAL AREA AND TELL ME IS THIS PORN?!"

:shepicide:

PT6A
Jan 5, 2006

Public school teachers are callous dictators who won't lift a finger to stop children from peeing in my plane

Harold Fjord posted:

I understand why we would err in that direction, but if that is Google's concern it seems like some of these laws are way over-broad.

I mean, you're not wrong, but it's politically and socially unpalatable to be the one who says "we need to re-examine the laws and policies surrounding child sexual abuse images, they're maybe a little too broad."

I think our natural revulsion towards child sexual abuse, probably even more than other incredibly heinous crimes like... murder, or the rape/sexual abuse of an adult, makes it very hard to have a reasonable discussion about things like this in general.

Foxfire_
Nov 8, 2010

I am pretty okay with a world where uploading a picture of a naked toddler's crotch to cloud storage sends a ping to law enforcement, then law enforcement decides what to do on a case-by-case basis. I'd prefer that to either doing nothing, or to someone at google (either human or automated) peeping through other pictures/emails to try to decide if it's fine in context or abusive. I am also okay with Google saying categorically "Do not upload pictures of naked toddlers, including legal ones". The only thing that seems wrong with that guy's case is that Google probably ought to be more forgiving of accidental violations of that policy.

PT6A
Jan 5, 2006

Public school teachers are callous dictators who won't lift a finger to stop children from peeing in my plane
Based on my personal feelings, I agree, but I think we'd be rightfully outraged if potential evidence of crimes other than this one very specific crime were routinely determined by algorithm and sent to law enforcement.

This is my point: we allow our (completely justifiable) revulsion to pedophiles and child exploitation to override what we rationally understand about crimes and law enforcement. I'll give you an example: at various points in my life, I had two co-workers who were serious criminals, who were caught, tried, convicted and served their time before ending up working where I did. One guy had committed manslaughter. The other sent a picture of his dick to a 13-year-old child and tried to lure her. I believe in rehabilitation, I believe convicts deserve a second chance; I had no problem with the guy who killed someone, but the guy who sent a picture of his pickle to a child, I couldn't stand being around him. Neither could anyone else once it was found out, and he got canned.

Ultimately, is what he did worse than killing someone? Rationally, it's not. But, emotionally: gently caress that guy, I hate him and I don't give a poo poo if he's paid his debt to society, I don't want to be anywhere near him.

So it is with this style of enforcement being done by Google. If it were anything else there's no way in hell people would accept this; not here in this thread, not anywhere. I have to consider, then, that this is a bad policy which should not exist, that nonetheless does have the happy consequence of making life hell for pedophiles who frankly deserve it.

Cool Dad
Jun 15, 2007

It is always Friday night, motherfuckers

The thing about CP is that the crime is just having the picture. Having pictures of a murdered guy isn't a crime, but murdering a guy is. Having inappropriate pictures of kids is a crime in and of itself, whether you took them or not. I can see why Google would want to look for that and notify law enforcement, because just having those pictures on their server could be a problem otherwise.

And the difference between manslaughter guy and dickpic guy is that manslaughter is usually unintentional, or at least situational, and most people who are convicted of manslaughter aren't likely to re-offend. Sending a dick pic to a child is fully intentional, and the only reason someone who does it once might not do it again is fear of consequences.

I actually think we need full institutionalization for pedophiles because I don't even know what else you do with them. It's really pretty tragic that people can be broken in that way and there's seemingly no way to fix them.

PT6A
Jan 5, 2006

Public school teachers are callous dictators who won't lift a finger to stop children from peeing in my plane

Cool Dad posted:

The thing about CP is that the crime is just having the picture. Having pictures of a murdered guy isn't a crime, but murdering a guy is. Having inappropriate pictures of kids is a crime in and of itself, whether you took them or not. I can see why Google would want to look for that and notify law enforcement, because just having those pictures on their server could be a problem otherwise.

And the difference between manslaughter guy and dickpic guy is that manslaughter is usually unintentional, or at least situational, and most people who are convicted of manslaughter aren't likely to re-offend. Sending a dick pic to a child is fully intentional, and the only reason someone who does it once might not do it again is fear of consequences.

I actually think we need full institutionalization for pedophiles because I don't even know what else you do with them. It's really pretty tragic that people can be broken in that way and there's seemingly no way to fix them.

Right, but having a picture of a child who happens to be naked is, as we've established, not inherently a crime. That's why this is an issue. It is a crime if it's an image of child sexual abuse, but an algorithm can't necessarily determine that.

I agree with your assessment of the dickpic guy versus the drunk driver who killed someone.

On the last point: there is one crime here, and it's called rape. It is abhorrent in all its forms. Minors are a special case, because they definitionally cannot consent to sex, but honestly: rape is rape. I don't see a huge distinction between child molesters and any other rapist. If you share pictures of that crime, I don't think the age of the victim matters. Does the solution involve institutionalization? Maybe, but if it does, then that should be extended to every other rapist as well.

As a note: I've specifically avoided using the phrase "child pornography" or its abbreviation when discussing this, because it understates the seriousness of the offense. This is not pornography, these are images of child sexual abuse; the very creation of the images is, by definition, criminal, whereas the word "pornography" has less harmful connotations, which are certainly not justified given the subject matter.

BiggerBoat
Sep 26, 2007

Don't you tell me my business again.
Where does the Google photo filter come down on the cover of Nirvana's Nevermind album I wonder.

PJOmega
May 5, 2009

BiggerBoat posted:

Where does the Google photo filter come down on the cover of Nirvana's Nevermind album I wonder.

Straight to jail.

VikingofRock
Aug 24, 2008




Cool Dad posted:

I actually think we need full institutionalization for pedophiles because I don't even know what else you do with them. It's really pretty tragic that people can be broken in that way and there's seemingly no way to fix them.

My understanding is that current forms of therapy can't change the attraction (the pedophilia), but they can help prevent that from turning into child sexual abuse. So, probably in a just world, people who sexually abuse children (or anyone) would be treated like anyone else who committed a crime: they would have to perform some form of restitution for the victim (admittedly I don't have a great idea what that looks like in this case), they would have some restrictions placed on them to lower their chance of re-offending (e.g., they can't be teachers, priests, scout troop leaders, etc.), and they would get a lot of societal support to help them with their issues. My understanding is that the perpetrators usually personally know and are trusted by the victims, and given the intensely personal nature of the crime, restorative justice (especially getting the perpetrator to recognize the harm done) would help the victim heal.

That all said, as a society the US really struggles to recognize the humanity in people who commit much less disturbing and serious crimes, so treating people who sexually abuse children as people is probably a long, long way off. Maybe they could do it in Scandinavia, but I'm not holding my breath for anything close to the above in the US.

Tuxedo Gin
May 21, 2003

Classy.

We don't really understand it, so it is impossible to say whether or not it is treatable. The only studies done on it are on the people who end up in prison for it. Non-offenders can't get treatment or be studied properly due to stigma and near-automatic reporting to authorities if they vocalize their urges to a therapist. Like most mental illnesses it can likely be addressed, but people would rather maintain a "castrate them all and execute them all" stance. Even extremely bipolar and psychotic people can lead relatively normal and productive lives; I don't see why pedophilia should be "unfixable".

Elias_Maluco
Aug 23, 2007
I need to sleep
https://twitter.com/SaycheeseDGTL/status/1562198726604513280

Main Paineframe
Oct 27, 2010

Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes!

On a more serious note, the CEO of the company behind FN Meka had an interview last year where he made his motives fairly clear:
https://www.musicbusinessworldwide....and-unreliable/

quote:

Q: HOW DID FN MEKA COME TO BE?
Factory New is a media company focused solely on virtual and digital talent, there will be no human artists on our roster. Our first character is FN Meka, an AI driven robot rapper. He was created using thousands of data points compiled from video games and social media.

The old model of finding talent is inefficient and unreliable. It requires spending time scouring the internet, traveling to shows, flying to meetings, expending resources all in search of the magic combination of qualities that just might translate into a superstar act.

Even with all the money labels devote to finding talent, the success rate is a pitiful 1%. Now we can literally custom-create artists using elements proven to work, greatly increasing the odds of success.

Even if we can get to 2% success rate then we’ve doubled the industry standard.

Q: CAN YOU EXPLAIN THE BASICS OF THE AI PROCESS THAT LEADS TO EACH RECORDING? HOW ARE THE LYRICS WRITTEN, HOW ARE THEY PERFORMED ETC.?
We’ve developed a proprietary AI technology that analyzes certain popular songs of a specified genre and generates recommendations for the various elements of song construction: lyrical content, chords, melody, tempo, sounds, etc. We then combine these elements to create the song.

As of now, a human voice performs the vocals, but we are working towards the ability to have a computer come up with and perform its own words – and even collaborate with other computers as “co-writers”.

Q: WHAT’S THE AMBITION HERE; WITH THE NUMBER OF TIKTOK VIEWS ALREADY ATTRACTED, IS THERE AN OPPORTUNITY FOR A CHART / AUDIO STREAMING CROSSOVER?
Anyone that has kids knows the future is virtual, and Factory New is creating celebrities for that world. TikTok has been an amazing platform for us. In less than a year, we’ve gained over 9 million followers which has led to a ton of opportunities from brand partnerships to artist collaborations.

We’ve recently been in talks with their execs about some interesting things you’ll hear about soon. As the lines between the physical and virtual world blur further, we don’t think there will be much of a distinction between the two.

Our artists will definitely have chart and streaming success, but we want to be more dynamic than that. Our artists aren’t limited by the human form, so as a company, we don’t want to be limited by traditional business models either.

Q: WE’VE OFTEN WRITTEN ABOUT THE CONCEPT OF ‘ROBOT STARS’ IN A WORLD WHERE HITS SEEM TO BE MORE IMPORTANT THAN ARTISTS. WHAT’S YOUR VIEW ON THE LIKELIHOOD OF THAT DEVELOPMENT IN THE FUTURE?
If a song is good, people will listen to it. Maybe the fact that it’s made by a robot makes it even more interesting. Not to get all philosophical but, what is an “artist” today? Think about the biggest stars in the world. How many of them are just vessels for commercial endeavors?

Most hits are written by teams of people who get paid to make music that will “sell”. We think machines can eventually run this process more efficiently than humans. How many fans ever actually meet the stars they idolize anyway?

People’s fandom develops from digital images on screens projecting expertly designed content – who actually knows with certainty what's real and what's not? If the content is good enough, do people even care how it's made?

In other words, it's an effort to remove the artist from the equation as much as possible. To mash every successful song into an algorithm so the execs can just push a button on a machine that spits out a hit song on-demand, without having to deal with performers or songwriters or anyone else, without having to put in the effort and resources to finding talented people and then treat them well so that they stay on. In the long run, the goal is to minimize the dependence on labor and artists as much as possible; the ideal is to have the CEO just sitting in front of a button that sends hit songs to content farms, with no one to share the profits with except a couple of engineers to make sure poo poo doesn't break.

And in this particular case, it led to a white CEO putting out lyrics that dropped the N-word all the drat time, to be voiced by an anonymous and uncredited human VA who the CEO swears is totally not white.

sebmojo
Oct 23, 2010


Legit Cyberpunk









This is interesting, feels like a gross yet inevitable consequence of handing over your music collection to a robot

Sundae
Dec 1, 2005

Main Paineframe posted:

Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes!

On a more serious note, the CEO of the company behind FN Meka had an interview last year where he made his motives fairly clear:
https://www.musicbusinessworldwide....and-unreliable/

In other words, it's an effort to remove the artist from the equation as much as possible. To mash every successful song into an algorithm so the execs can just push a button on a machine that spits out a hit song on-demand, without having to deal with performers or songwriters or anyone else, without having to put in the effort and resources to finding talented people and then treat them well so that they stay on. In the long run, the goal is to minimize the dependence on labor and artists as much as possible; the ideal is to have the CEO just sitting in front of a button that sends hit songs to content farms, with no one to share the profits with except a couple of engineers to make sure poo poo doesn't break.

And in this particular case, it led to a white CEO putting out lyrics that dropped the N-word all the drat time, to be voiced by an anonymous and uncredited human VA who the CEO swears is totally not white.


Very fitting that we re-invented Vocaloids, but even worse.

BiggerBoat
Sep 26, 2007

Don't you tell me my business again.

Main Paineframe posted:



In other words, it's an effort to remove the artist from the equation as much as possible. To mash every successful song into an algorithm so the execs can just push a button on a machine that spits out a hit song on-demand, without having to deal with performers or songwriters or anyone else, without having to put in the effort and resources to finding talented people and then treat them well so that they stay on. In the long run, the goal is to minimize the dependence on labor and artists as much as possible; the ideal is to have the CEO just sitting in front of a button that sends hit songs to content farms, with no one to share the profits with except a couple of engineers to make sure poo poo doesn't break.


It's worked fairly well in the graphic design industry

starkebn
May 18, 2004

"Oooh, got a little too serious. You okay there, little buddy?"
I personally don't really care where the media I consume comes from, although I'd prefer the labour that goes into it to be fairly compensated. So much creative output exists off the back of horrible exploitation already; I'm not sure why this is a sudden stumbling block.

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Main Paineframe posted:

Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes!

On a more serious note, the CEO of the company behind FN Meka had an interview last year where he made his motives fairly clear:
https://www.musicbusinessworldwide....and-unreliable/

In other words, it's an effort to remove the artist from the equation as much as possible. To mash every successful song into an algorithm so the execs can just push a button on a machine that spits out a hit song on-demand, without having to deal with performers or songwriters or anyone else, without having to put in the effort and resources to finding talented people and then treat them well so that they stay on. In the long run, the goal is to minimize the dependence on labor and artists as much as possible; the ideal is to have the CEO just sitting in front of a button that sends hit songs to content farms, with no one to share the profits with except a couple of engineers to make sure poo poo doesn't break.

And in this particular case, it led to a white CEO putting out lyrics that dropped the N-word all the drat time, to be voiced by an anonymous and uncredited human VA who the CEO swears is totally not white.

You sound shocked, yet you're already in this thread, curious. :benshapiro:

Honestly, the music industry has been a nightmare, basically forever, unless you happen to be one of the rare artists that becomes huge and can actually earn money. Boy bands with auto tuned music and premade lyrics are nothing new, they just need to find someone attractive with a passable singing voice and an ego small enough that they probably won't run off right away. With most of the artists' take being from touring and merch sales (streaming services pay fractions of pennies per play, and lol at getting on radio now that clear channel owns everything), the labels already consider artists to be disgusting creatures to be tolerated if they can make money. Rather than having the human in the loop, they'd rather just automate everything, and I'm not even a little bit surprised.

Main Paineframe
Oct 27, 2010

Volmarias posted:

You sound shocked, yet you're already in this thread, curious. :benshapiro:

Honestly, the music industry has been a nightmare, basically forever, unless you happen to be one of the rare artists that becomes huge and can actually earn money. Boy bands with auto tuned music and premade lyrics are nothing new, they just need to find someone attractive with a passable singing voice and an ego small enough that they probably won't run off right away. With most of the artists' take being from touring and merch sales (streaming services pay fractions of pennies per play, and lol at getting on radio now that clear channel owns everything), the labels already consider artists to be disgusting creatures to be tolerated if they can make money. Rather than having the human in the loop, they'd rather just automate everything, and I'm not even a little bit surprised.

I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate.

BiggerBoat posted:

It's worked fairly well in the graphic design industry

If you mean the AI drawing stuff, I think there's still a while to go for that stuff. It can make some pretty good stuff out of a fairly short prompt, but there's plenty of people who want to get into the details and find that the current stuff out there has its limits on how specific a request it can accommodate.

https://twitter.com/abcdentminded/status/1561211951161569280

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

Main Paineframe posted:

I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate.

If you mean the AI drawing stuff, I think there's still a while to go for that stuff. It can make some pretty good stuff out of a fairly short prompt, but there's plenty of people who want to get into the details and find that the current stuff out there has its limits on how specific a request it can accommodate.

https://twitter.com/abcdentminded/status/1561211951161569280

It's... Peggy Bundy?

Precambrian Video Games
Aug 19, 2002



Since people keep, uh, gushing over DALL-E 2 I've been wondering how many people are trying to use it to make porn and how it avoids generating depraved/unethical poo poo. Actually, they seem to have done a pretty good job on setting a content and moderation policy, documenting and addressing its limitations. I presume it's just a matter of months until someone releases Unscrupulous DALL-E, or we find out yet another case of Oops the Big Tech AI-Generated Milkshake Duck is Racist. I wonder if the bigger obstacles are writing/using the code (much of which is open source but not trivial to use) or the cost of GPU time to do the training.
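
For reference on the "open source but not trivial to use" part, here's roughly what the inference side looks like through the Hugging Face diffusers library; the checkpoint name and the assumption of a CUDA GPU with several GB of VRAM are mine, and training from scratch is a vastly bigger GPU bill than this:

code:

# rough sketch of open-source text-to-image inference; assumes the torch and
# diffusers packages plus a CUDA-capable GPU are available
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # one commonly published checkpoint
    torch_dtype=torch.float16,          # half precision to cut VRAM use
)
pipe = pipe.to("cuda")

# the pipeline ships with a safety checker that blanks out images it flags,
# which is the hosted services' moderation problem in miniature
image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")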

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Main Paineframe posted:

I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate.

Sure, but if you think this isn't the direction we're going to go for at least some stuff in the future you're kidding yourself.

E: I mean the future we're going to eventually arrive at

Ghost Leviathan
Mar 2, 2017

Exploration is ill-advised.
I mean it happens in every market where suits try to cut the creative process out of the equation entirely to flood it with low-effort dross and everyone gets bored of it and moves onto something that actually sparks their interest quickly. See reality TV booms and busts. That poo poo has no legacy and no tail. No one's going back to it.

Arivia
Mar 17, 2011

Ghost Leviathan posted:

I mean it happens in every market where suits try to cut the creative process out of the equation entirely to flood it with low-effort dross and everyone gets bored of it and moves onto something that actually sparks their interest quickly. See reality TV booms and busts. That poo poo has no legacy and no tail. No one's going back to it.

Have you seen the new series on Netflix these days? Reality TV has no longevity but it still has huge audiences and is remarkably cheap.

BiggerBoat
Sep 26, 2007

Don't you tell me my business again.

Main Paineframe posted:



If you mean the AI drawing stuff, I think there's still a while to go for that stuff. It can make some pretty good stuff out of a fairly short prompt, but there's plenty of people who want to get into the details and find that the current stuff out there has its limits on how specific a request it can accommodate.


I mean stuff like pre-made templates, cheap clip art and websites that generate low cost layouts based on basic parameters (text, colors, logo). This kind of thing also exists for logo design and like 80% of customers are fine with whatever these sites generate or maybe they just need a minor tweak like adding a FB logo or something.

SCheeseman
Apr 23, 2003

eXXon posted:

Since people keep, uh, gushing over DALL-E 2 I've been wondering how many people are trying to use it to make porn and how it avoids generating depraved/unethical poo poo. Actually, they seem to have done a pretty good job on setting a content and moderation policy, documenting and addressing its limitations. I presume it's just a matter of months until someone releases Unscrupulous DALL-E, or we find out yet another case of Oops the Big Tech AI-Generated Milkshake Duck is Racist. I wonder if the bigger obstacles are writing/using the code (much of which is open source but not trivial to use) or the cost of GPU time to do the training.

That horse is out of the barn.


Freakazoid_
Jul 5, 2013


Buglord

lol riker go to your quarters you're drunk
