Scanning files in a directory is not this. As was literally pointed out earlier in this thread, other companies have already been doing this for years!!!
Do you think Google Photos users are worried about people storming into the house and killing them over false positives? Of course not.
Completely ridiculous argument. Apple isn't losing billions over anything; they're not the first huge company to implement this.
Your problem here is that with how the 4A and subpoenas go, anywhere you go for cloud storage will have the same problem. You won't physically own your data on those servers; the people who own the servers will. This is the same problem everyone has with Facebook: when you upload any photos or make any posts to it, they own those posts and pictures, as they are now in possession of them, not you.
That concept carries over to any cloud service, so user beware.
BL.
I have 3 kids. They all use iMessage for communicating with my wife and me. They don't use iMessage to communicate with their friends. I don't know how much of an impact this will have if all other messaging apps don't get on board and do the same thing.

This is great. The iMessage stuff is a smart way to stop unwanted pics from circulating.
But the CSAM paper says that it's scanned on device for hashes. They're not scanned in the cloud, they're scanned on device. I honestly don't see the problem here?

Yes, but that means Apple will be scanning your photos. And trust me, soon there's going to be a completely different justification for the invasion. That's the reason I don't use any cloud service, Apple or any other.
99.9% of you seriously have a reading comprehension issue, are just plain stupid, or are seriously bored and want stuff to complain about.
The very first picture at the top of the article:
* ON-DEVICE SOFTWARE... not cloud based
* APPLE NEVER SEES YOUR PHOTOS
* ZERO REPORTING DONE BY APPLE!!!!!! Only the receiver of the pics is warned and then may report if necessary/appropriate.
This is an OPTIONAL feature that parents can use to protect kids. And if some woman (more so than men, typically) is getting sick of getting unwanted d**k pics, I guess they could turn it on as well.
Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.
But most importantly, stepping out of this whole situation for a moment - which I don't think some people can - is it tangibly going to affect my life or usage of my iPhone? Absolutely not.
To clear things up… uploading to the cloud is not handing over your ownership.

Importantly, this is not a legal requirement; the companies choose to structure their legal T&Cs like this. They are not required to own content you upload to their cloud.
Another way to get around this is to ACTUALLY end-to-end encrypt the data on cloud servers, such that Apple getting access is impossible (the Lavabit strategy). That way the legal conversation goes: "Sure, you can have access to our servers, but the data is completely useless, as it's literally impossible for anyone except the owner to read it."
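For what it's worth, here's a minimal sketch of what that client-side ("zero-knowledge") approach could look like, using the third-party Python cryptography package. The key handling and function names are purely illustrative, not anything Apple actually ships:

```python
# Minimal sketch of client-side encryption before upload. The key never leaves
# the user's device, so the cloud provider only ever stores ciphertext it
# cannot read. Uses the third-party `cryptography` package.
from cryptography.fernet import Fernet

# Generated once on the device and kept in the local keychain (illustrative).
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_for_upload(path: str) -> bytes:
    """Read a local file and return ciphertext safe to hand to any cloud host."""
    with open(path, "rb") as f:
        return cipher.encrypt(f.read())

def decrypt_after_download(blob: bytes) -> bytes:
    """Only the device holding `key` can turn the stored blob back into a photo."""
    return cipher.decrypt(blob)
```

The point of the Lavabit comparison is that the provider can then comply with any subpoena and still have nothing readable to hand over.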
"You all want to prevent exploited children or are you a monster?"
So a few takeaways here for me:
1. “Scanning hashes” here means an NN/ML model identifying images, and that will always carry some chance of false positives; there's also no telling whether those “human reviewers” will be in line most of the time. Apple can try to ease people's minds by quoting a ridiculously low false-positive rate, but I highly doubt that holds in practice (see the sketch after this list).
2. “Nothing to worry about because nothing to hide” is the catalyst for full-on 1984. Don't worry, we are on our way there, faster than ever.
3. You disagree with those EULAs? Then there is literally no device you can use. Live like a caveman? Not an option now, and it will NOT be an option in the near future either.
4. On-device scanning: this time it's for CSAM. What's next? Political content? Vaguely “offensive” material as deemed by governments and law enforcement? The possibilities are endless. But yeah, don't worry. There will definitely be a nicely disguised name for scanning all sorts of content, not just images.
5. Tom Scott has an amazing short video imagining a world where people of different classes literally see different versions of the same world. I have a feeling this will come true much faster than people might think.
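Point 1 is easier to see with a toy example. Below is a hedged sketch of threshold-based hash matching (the 64-bit values and threshold are invented, and this is not Apple's actual NeuralHash); it shows how the same tolerance that survives re-compression is what opens the door to false positives:

```python
# Toy illustration of threshold-based perceptual-hash matching.
# The hash values and threshold are invented for illustration; real systems
# derive the hash from image content (Apple's NeuralHash uses a neural network
# for that step).

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

DB_HASH    = 0xF0F0A5A50F0F5A5A  # hypothetical entry from the known-image database
RECOMPRESS = 0xF0F0A5A50F0F5A5B  # same picture after a re-save: differs by one bit
UNRELATED  = 0x123456789ABCDEF0  # an unrelated photo

THRESHOLD = 10  # looser thresholds catch more edited copies AND more innocent look-alikes

for name, h in [("re-compressed copy", RECOMPRESS), ("unrelated photo", UNRELATED)]:
    dist = hamming(DB_HASH, h)
    print(f"{name}: distance {dist} -> {'MATCH' if dist <= THRESHOLD else 'no match'}")
```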
Read again: not only is it optional for iMessage, it only applies to users with a Family Sharing account set up. Individual users most likely won't even be able to activate it. Parents protecting their kids... always a good option to have.

I guess you're in that 99.9%, because literally every bullet point you just wrote down does nothing to dissuade the concerns people have. It's not an "OPTIONAL" feature either; that's just the nude-scanning part of iMessage. I guess you would know that if you weren't in the 99.9% of people who have reading comprehension issues, according to you.
The main problem I see with this is that it demonstrates an access point, or backdoor, to people's phones that Apple supposedly didn't have before, and on a massive scale. As per the article, this doesn't scan iCloud; it scans the user's phone before anything goes to iCloud, then reports the phone to Apple if something is found.
Which means they can potentially scan users' phones against any database. Once you get past the apps, it's all a hierarchical filesystem underneath.
What's to stop a dictatorial country from demanding Apple scan against its own databases of known illegal images and then report its citizens? Perhaps databases of adult pornography or gay imagery, or even political or religious imagery.
It's also scanning text messages for types of images, not just those found in databases. Which potentially means any type of image could be flagged by this system, even if it doesn't exist in a database, since all they'd need to do is change the criteria.
If they have the ability to scan images on users' devices, text would be far simpler: flagging keywords in messages, or outlawed books. A sketch of how little that would take is below.
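To make the text point concrete, here's a minimal sketch of the same "scan locally, flag on match" pattern applied to messages; the watchlist is an invented placeholder, the point is how trivial the mechanism is:

```python
# Sketch of keyword flagging on outgoing text. The watchlist is invented for
# illustration; whoever supplies it decides what gets flagged.
WATCHLIST = {"example banned phrase", "another flagged term"}

def flag_message(text: str) -> bool:
    """Return True if an outgoing message contains any watchlisted term."""
    lowered = text.lower()
    return any(term in lowered for term in WATCHLIST)

print(flag_message("totally innocent dinner plans"))          # False
print(flag_message("contains an example banned phrase here"))  # True
```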
Apple knows full well how badly this could be twisted. That's why they introduced it with the tried-and-true method of "Won't somebody think of the children."
On-device software is the problem. The child porn is a red herring.
This is a local hash analysis engine.
It can be updated to be the "white supremacy" finder. So who cares, that's bad too!
How about the "racist word" hash finder?
How about the "I voted for Trump/Biden" hash finder?
Do you get it yet?
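That's the crux of it: a local matcher has no idea what its database means. A rough sketch below (it uses exact SHA-256 hashes for simplicity, where the real system uses perceptual hashes; the paths and database names are illustrative, not Apple's implementation):

```python
import hashlib
import os

def scan_directory(root: str, hash_db: set) -> list:
    """Walk a local directory tree and return paths whose file hashes appear in hash_db.

    Nothing in this loop knows what hash_db represents: CSAM, political imagery,
    "racist word" memes. The policy lives entirely in whoever supplies the database.
    """
    flagged = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in hash_db:
                flagged.append(path)
    return flagged

# Swapping one database for another changes the policy, not the code:
# scan_directory("/var/mobile/Media", csam_hashes)
# scan_directory("/var/mobile/Media", some_other_regimes_hashes)  # hypothetical
```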
I already answered this above, but why would anyone be worried about this unless they have known child porn images on their phone/in their iCloud account?

You may want to include yourself in that reading comprehension issue, because straight from the OP:
This is cloud-based too, as it also pertains to iCloud Photos. So it may help to practice what you preach, and instead of carrying on about the perceived stupidity of other users, RTFM yourself so you don't get caught up in your own rants.
BL.
HUGE PROBLEM WITH THIS. It doesn't matter to me that it will only flag "known bad images." In order to find the known bad images, ALL of my images will be processed through whose servers? Or will my own phone scan and report back that I'm "clean"? And how exactly are these scanned? By analyzing pixel data, comparing it to a database, and then keeping tabs on that photo to see if it's a problem? Or by literal matching against the "known bad images" database?
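On the "literal matching" question: it can't be plain byte-for-byte hashing, because an exact cryptographic hash changes completely on any re-save, which is exactly why perceptual hashing (with its false-positive risk) is used instead. A quick sketch, where the byte strings stand in for real image files:

```python
import hashlib

# Stand-ins for an original JPEG and the same picture saved at a different quality.
original     = b"...bytes of a known image..."
recompressed = b"...same picture, re-encoded..."

# Exact hashing only catches byte-identical files: one re-save and the digest
# changes entirely, so matching has to be done on perceptual similarity instead.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(recompressed).hexdigest())
```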
This literally nukes Apple's whole privacy marketing angle. If you are scanning photos, in any way, even at the metadata level, then privacy on the phone is 💀.