|
Being careful/smart and engaged with cloud backup settings is also only one layer. You should probably try using a camera app that strips out metadata. Oh, and then another metadata stripper. Oh, and another metadata stripper on a real computer.
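For the curious, most of those stripper tools do some variant of the same thing: walk the JPEG's marker segments and drop the APPn/COM segments where EXIF, XMP, and comments live. A generic stdlib-only sketch (not any particular app's implementation):

```python
def strip_jpeg_metadata(data: bytes) -> bytes:
    """Drop APP1-APP15 and COM segments (where EXIF/XMP/comments live)
    from a JPEG byte stream, keeping APP0 (JFIF) and the image data."""
    if data[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")          # SOI marker
    i = 2
    while i + 2 <= len(data):
        if data[i] != 0xFF:               # malformed stream; copy rest as-is
            out += data[i:]
            break
        marker = data[i + 1]
        if marker == 0xD9:                # EOI: end of image
            out += b"\xff\xd9"
            break
        if marker == 0x01 or 0xD0 <= marker <= 0xD8:
            out += data[i:i + 2]          # stand-alone markers, no length field
            i += 2
            continue
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i:i + 2 + seg_len]
        if not (0xE1 <= marker <= 0xEF or marker == 0xFE):
            out += segment                # keep everything except APPn/COM
        i += 2 + seg_len
        if marker == 0xDA:                # SOS: entropy-coded data follows
            out += data[i:]
            break
    return bytes(out)
```

Real tools (exiftool, etc.) also handle embedded thumbnails and other formats, which is why the advice above is to run more than one.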
|
# ? Aug 22, 2022 18:00 |
|
|
|
PhazonLink posted:being careful/smart and engaged with cloud backup settings is also only one layer. Instructions unclear, I am now dating a stripper named Meta, is this intended?
|
# ? Aug 22, 2022 18:04 |
|
Volmarias posted:Instructions unclear, I am now dating a stripper named Meta, is this intended? Only if you're really into Zuckerberg.
|
# ? Aug 22, 2022 19:14 |
|
Meta is just short for "meth addict".
|
# ? Aug 22, 2022 19:59 |
|
withoutclass posted:Only if you're really into Zuckerberg. He never metadata he didn't like to use to blackmail his opponents.
|
# ? Aug 22, 2022 20:10 |
|
Heck Yes! Loam! posted:on android there is an option to save the photo to a secure local folder that is not automatically uploaded. Until they change default settings in an update and upload them all anyway.
|
# ? Aug 23, 2022 08:13 |
... on what Android phone is there cloud sync by default? I'm on a Samsung, never logged into a Samsung account, and my photos are all local.
|
|
# ? Aug 23, 2022 09:28 |
|
Nothingtoseehere posted:... on what android phone is their cloud sync by default? I'm on a Samsung, never logged into a samsung account, and my photos are all local. You've never logged into a google account on your phone? How are you using the app store? It absolutely defaults to "sync everything!" when you add the account.
|
# ? Aug 23, 2022 14:21 |
|
You have to install Google photos and enable cloud backup for it to happen automatically. The secure folder has never been an issue for me, even swapping phones.
|
# ? Aug 23, 2022 15:02 |
|
The mechanisms by which one can avoid this nonsense are largely irrelevant, because we should be discussing why this is a problem in the first place. The issue is that the algorithm is making a decision about what constitutes child sexual abuse images, and as far as we know, the criterion is: does the photo/video contain the image of something that looks like or is a naked child?

But, of course, this whole situation occurred because that's an overly simplistic definition. As in this case, there are contexts where an image of a naked child (or any naked person) is not at all pornographic (consider the Pulitzer-winning photo of Phan Thi Kim Phuc, the "napalm girl"), and there are cases where children are being exploited that don't necessarily involve them being naked: the scandal a while back where pedos were trading timestamps of YouTube videos of "gymnastics challenges" and poo poo, the subreddit that was collecting pictures of children that were not technically illegal but obviously being used for sexual purposes, and that time the weirdo from Nickelodeon asked children to post feet pics on Twitter.

I bet Google and the other big tech companies don't want to talk about all that poo poo, because actually recognizing and solving that problem is a lot more work than slapping together a lovely algorithm that can't tell the difference between child sexual abuse and a picture of a rash and calling it all good. Dealing with child exploitation and child sexual abuse in a substantive way would, after all, cut into profits.
|
# ? Aug 23, 2022 16:01 |
|
^ In addition, the issue is also that Google's algorithm has been shown to be completely wrong in this case, but Google is still refusing to admit fault and restore the accounts in question. Which is almost worse in some ways; it's just malicious. At least the initial mistake could be blamed on the AI.
|
# ? Aug 23, 2022 17:12 |
|
The case in the article is a more clear cut one. But in general I imagine it’s a nightmare for Google to try to handle appeals. Who’s to say a doctor writing letters of explanation isn’t himself part of a pedophile ring? What does a foolproof appeal process even look like? It’s a domain where truth is extremely murky and Google have clearly decided they’d rather have false positives than false negatives.
|
# ? Aug 23, 2022 17:28 |
|
If the police and doctors and the court system are all in on the conspiracy to produce child sexual abuse images by cleverly disguising them as photos of a child with a rash, I'd say there are problems there large enough that Google cannot possibly solve them. Now, what is a larger issue, and I'm guessing this might be why Google gives a gently caress in the first place, is: could this image be illegal if stripped of its context? If, for example, there were some kind of data breach at Google, and this picture were leaked to people who have no legitimate medical interest in the child's rash, could Google be liable for contributing to child sexual exploitation (and arguably, removing the medical context of the image, intentionally or otherwise, would be an exploitation of that child for sexual purposes, even if the parent and the doctor acted legally and in good faith at all times)?
|
# ? Aug 23, 2022 18:26 |
|
I understand why we would err in that direction, but if that is Google's concern, it seems like some of these laws are way over-broad.
|
# ? Aug 23, 2022 18:35 |
|
Imagine working for Google with the specific job of inspecting photos that AI believes are cp but customers say are just innocent photos of their child's genital rash. "JUST LOOK AT THIS IMAGE OF MY CHILD'S DISEASED GENITAL AREA AND TELL ME IS THIS PORN?!"
|
# ? Aug 23, 2022 18:47 |
|
Harold Fjord posted:I understand why we would err in that direction but if that if that is Google's concern it seems like some of these laws are way over-broad. I mean, you're not wrong, but it's politically and socially unpalatable to be the one who says "we need to re-examine the laws and policies surrounding child sexual abuse images, they're maybe a little too broad." I think our natural revulsion towards child sexual abuse, probably even more than other incredibly heinous crimes like... murder, or the rape/sexual abuse of an adult, makes it very hard to have a reasonable discussion about things like this in general.
|
# ? Aug 23, 2022 19:01 |
|
I am pretty okay with a world where uploading a picture of a naked toddler's crotch to cloud storage sends a ping to law enforcement, then law enforcement decides what to do on a case-by-case basis. I'd prefer that to either doing nothing, or to someone at google (either human or automated) peeping through other pictures/emails to try to decide if it's fine in context or abusive. I am also okay with Google saying categorically "Do not upload pictures of naked toddlers, including legal ones". The only thing that seems wrong with that guy's case is that Google probably ought to be more forgiving of accidental violations of that policy.
|
# ? Aug 23, 2022 21:09 |
|
Based on my personal feelings, I agree, but I think we'd be rightfully outraged if potential evidence of crimes other than this one very specific crime were routinely determined by algorithm and sent to law enforcement. This is my point: we allow our (completely justifiable) revulsion to pedophiles and child exploitation to override what we rationally understand about crimes and law enforcement.

I'll give you an example: at various points in my life, I had two co-workers who were serious criminals, who were caught, tried, convicted, and served their time before ending up working where I did. One guy had committed manslaughter. The other sent a picture of his dick to a 13-year-old child and tried to lure her. I believe in rehabilitation, and I believe convicts deserve a second chance; I had no problem with the guy who killed someone, but the guy who sent a picture of his pickle to a child, I couldn't stand being around him. Neither could anyone else once it was found out, and he got canned. Ultimately, is what he did worse than killing someone? Rationally, it's not. But, emotionally: gently caress that guy, I hate him, and I don't give a poo poo if he's paid his debt to society, I don't want to be anywhere near him.

So it is with this style of enforcement being done by Google. If it were anything else, there's no way in hell people would accept this; not here in this thread, not anywhere. I have to consider, then, that this is a bad policy which should not exist, that nonetheless does have the happy consequence of making life hell for pedophiles who frankly deserve it.
|
# ? Aug 23, 2022 21:28 |
|
The thing about CP is that the crime is just having the picture. Having pictures of a murdered guy isn't a crime, but murdering a guy is. Having inappropriate pictures of kids is a crime in and of itself, whether you took them or not. I can see why Google would want to look for that and notify law enforcement, because just having those pictures on their server could be a problem otherwise. And the difference between manslaughter guy and dickpic guy is that manslaughter is usually unintentional, or at least situational, and most people who are convicted of manslaughter aren't likely to re-offend. Sending a dick pic to a child is fully intentional and the only reason someone who does once might not do it again is fear of consequences. I actually think we need full institutionalization for pedophiles because I don't even know what else you do with them. It's really pretty tragic that people can be broken in that way and there's seemingly no way to fix them.
|
# ? Aug 23, 2022 23:18 |
|
Cool Dad posted:The thing about CP is that the crime is just having the picture. Having pictures of a murdered guy isn't a crime, but murdering a guy is. Having inappropriate pictures of kids is a crime in and of itself, whether you took them or not. I can see why Google would want to look for that and notify law enforcement, because just having those pictures on their server could be a problem otherwise. Right, but having a picture of a child who happens to be naked is, as we've established, not inherently a crime. That's why this is an issue. It is a crime if it's an image of child sexual abuse, but an algorithm can't necessarily determine that. I agree with your assessment of the dickpic guy versus the drunk driver who killed someone. On the last point: there is one crime here, and it's called rape. It is abhorrent in all its forms. Minors are a special case, because they definitionally cannot consent to sex, but honestly: rape is rape. I don't see a huge distinction between child molesters and any other rapist. If you share pictures of that crime, I don't think the age of the victim matters. Does the solution involve institutionalization? Maybe, but if it does, then that should be extended to every other rapist as well. As a note: I've specifically avoided using the phrase "child pornography" or its abbreviation when discussing this, because it understates the seriousness of the offense. This is not pornography, these are images of child sexual abuse; the very creation of the images is, by definition, criminal, whereas the word "pornography" has less harmful connotations, which are certainly not justified given the subject matter.
|
# ? Aug 24, 2022 00:15 |
|
Where does the Google photo filter come down on the cover of Nirvana's Nevermind album I wonder.
|
# ? Aug 24, 2022 00:54 |
|
BiggerBoat posted:Where does the Google photo filter come down on the cover of Nirvana's Nevermind album I wonder. Straight to jail.
|
# ? Aug 24, 2022 00:57 |
Cool Dad posted:I actually think we need full institutionalization for pedophiles because I don't even know what else you do with them. It's really pretty tragic that people can be broken in that way and there's seemingly no way to fix them. My understanding is that current forms of therapy can't change the attraction (the pedophilia), but it can help prevent that from turning into child sexual abuse. So, probably in a just world, people who sexually abuse children (or anyone) would be treated like anyone else who committed a crime: they would have to perform some form of restitution for the victim (admittedly I don't have a great idea what that looks like in this case), they would have some restrictions placed on them to lower their chance of re-offending (e.g., they can't be teachers, priests, scout troop leaders, etc), and they would get a lot of societal support to help them with their issues. My understanding is that the perpetrators usually personally know and are trusted by the victims, and given the intensely personal nature of the crime, restorative justice (especially getting the perpetrator to recognize the harm done) would help the victim heal. That all said, as a society the US really struggles to recognize the humanity in people who commit much less disturbing and serious crimes, so treating people who sexually abuse children as people is probably a long, long way off. Maybe they could do it in Scandinavia, but I'm not holding my breath for anything close to the above in the US.
|
|
# ? Aug 24, 2022 06:01 |
|
We don't really understand it, so it is impossible to say whether or not it is treatable. The only studies done on it are of the people who end up in prison for it. Non-offenders can't get treatment or be studied properly due to stigma and near-automatic reporting to authorities if they vocalize their urges to a therapist. Like most mental illnesses it can likely be addressed, but people would rather maintain a "castrate them all and execute them all" stance. Even people with severe bipolar disorder or psychosis can lead relatively normal and productive lives; I don't see why pedophilia should be "unfixable".
|
# ? Aug 24, 2022 17:04 |
|
https://twitter.com/SaycheeseDGTL/status/1562198726604513280
|
# ? Aug 24, 2022 17:31 |
|
Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes! On a more serious note, the CEO of the company behind FN Meka had an interview last year where he made his motives fairly clear: https://www.musicbusinessworldwide....and-unreliable/ quote:Q: HOW DID FN MEKA COME TO BE? In other words, it's an effort to remove the artist from the equation as much as possible. To mash every successful song into an algorithm so the execs can just push a button on a machine that spits out a hit song on demand, without having to deal with performers or songwriters or anyone else, and without having to put in the effort and resources of finding talented people and then treating them well so that they stay on. In the long run, the goal is to minimize the dependence on labor and artists as much as possible; the ideal is to have the CEO just sitting in front of a button that sends hit songs to content farms, with no one to share the profits with except a couple of engineers to make sure poo poo doesn't break. And in this particular case, it led to a white CEO putting out lyrics that dropped the N-word all the drat time, to be voiced by an anonymous and uncredited human VA who the CEO swears is totally not white.
|
# ? Aug 24, 2022 20:16 |
|
This is interesting, feels like a gross yet inevitable consequence of handing over your music collection to a robot
|
# ? Aug 24, 2022 21:23 |
|
Main Paineframe posted:Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes! Very fitting that we re-invented Vocaloids, but even worse.
|
# ? Aug 24, 2022 21:47 |
|
Main Paineframe posted:
It's worked fairly well in the graphic design industry
|
# ? Aug 25, 2022 00:04 |
|
I personally don't really care where the media I consume comes from, although I'd prefer the labour that goes into it to be fairly compensated. So much creative output already exists off the back of horrible exploitation; I'm not sure why this is a sudden stumbling block.
|
# ? Aug 25, 2022 01:36 |
|
Main Paineframe posted:Can't believe the company that invented virtual rapper Lil Bitcoin and wrote his debut single I Love Bitcoin could make such terrible mistakes! You sound shocked, yet you're already in this thread, curious. :benshapiro: Honestly, the music industry has been a nightmare basically forever, unless you happen to be one of the rare artists that becomes huge and can actually earn money. Boy bands with Auto-Tuned music and premade lyrics are nothing new; they just need to find someone attractive with a passable singing voice and an ego small enough that they probably won't run off right away. With most of the artists' take coming from touring and merch sales (streaming services pay fractions of pennies per play, and lol at getting on radio now that Clear Channel owns everything), the labels already consider artists to be disgusting creatures to be tolerated if they can make money. Rather than having the human in the loop, they'd rather just automate everything, and I'm not even a little bit surprised.
|
# ? Aug 25, 2022 02:15 |
|
Volmarias posted:You sound shocked, yet you're already in this thread, curious. :benshapiro: I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate. BiggerBoat posted:It's worked fairly well in the graphic design industry If you mean the AI drawing stuff, I think there's still a while to go for that stuff. It can make some pretty good stuff out of a fairly short prompt, but there's plenty of people who want to get into the details and find that the current stuff out there has its limits on how specific a request it can accommodate. https://twitter.com/abcdentminded/status/1561211951161569280
|
# ? Aug 25, 2022 03:38 |
|
Main Paineframe posted:I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate. It's... Peggy Bundy?
|
# ? Aug 25, 2022 03:42 |
|
Since people keep, uh, gushing over DALL-E 2 I've been wondering how many people are trying to use it to make porn and how it avoids generating depraved/unethical poo poo. Actually, they seem to have done a pretty good job on setting a content and moderation policy, documenting and addressing its limitations. I presume it's just a matter of months until someone releases Unscrupulous DALL-E, or we find out yet another case of Oops the Big Tech AI-Generated Milkshake Duck is Racist. I wonder if the bigger obstacles are writing/using the code (much of which is open source but not trivial to use) or the cost of GPU time to do the training.
|
# ? Aug 25, 2022 04:20 |
|
Main Paineframe posted:I'm not really shocked, but I'm amused that it keeps failing this hard. They get into this space and overreach way too hard every time, and it turns out there is a limit to how much low-effort grift people will tolerate. Sure, but if you think this isn't the direction we're going to go for at least some stuff in the future you're kidding yourself. E: I mean the future we're going to eventually arrive at
|
# ? Aug 25, 2022 07:52 |
|
I mean it happens in every market where suits try to cut the creative process out of the equation entirely to flood it with low-effort dross and everyone gets bored of it and moves onto something that actually sparks their interest quickly. See reality TV booms and busts. That poo poo has no legacy and no tail. No one's going back to it.
|
# ? Aug 25, 2022 08:15 |
|
Ghost Leviathan posted:I mean it happens in every market where suits try to cut the creative process out of the equation entirely to flood it with low-effort dross and everyone gets bored of it and moves onto something that actually sparks their interest quickly. See reality TV booms and busts. That poo poo has no legacy and no tail. No one's going back to it. Have you seen the new series on Netflix these days? Reality TV has no longevity but it still has huge audiences and is remarkably cheap.
|
# ? Aug 25, 2022 08:41 |
|
Main Paineframe posted:
I mean stuff like pre-made templates, cheap clip art and websites that generate low cost layouts based on basic parameters (text, colors, logo). This kind of thing also exists for logo design and like 80% of customers are fine with whatever these sites generate or maybe they just need a minor tweak like adding a FB logo or something.
|
# ? Aug 25, 2022 11:09 |
|
eXXon posted:Since people keep, uh, gushing over DALL-E 2 I've been wondering how many people are trying to use it to make porn and how it avoids generating depraved/unethical poo poo. Actually, they seem to have done a pretty good job on setting a content and moderation policy, documenting and addressing its limitations. I presume it's just a matter of months until someone releases Unscrupulous DALL-E, or we find out yet another case of Oops the Big Tech AI-Generated Milkshake Duck is Racist. I wonder if the bigger obstacles are writing/using the code (much of which is open source but not trivial to use) or the cost of GPU time to do the training. That horse is out of the barn.
|
# ? Aug 25, 2022 11:26 |
|
|
|
lol riker go to your quarters you're drunk
|
# ? Aug 25, 2022 11:44 |