You're comparing manufacturing defects to a scanning technology? Ok...

But heck, even if the error rate were 1/10th of what they said (1 in 100 billion), it would still be excellent.
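For scale, here is a quick back-of-envelope check in Python. This is a sketch only: it assumes the figure is a per-account, per-year probability (which is how Apple phrased its own claim) and a hypothetical one billion iCloud accounts.

```python
# Back-of-envelope check of that error rate. Assumptions (mine, not
# from the post): the figure is a per-account, per-year probability,
# and there are roughly 1 billion iCloud accounts.
accounts = 1_000_000_000
false_flag_rate = 1 / 100_000_000_000  # 1 in 100 billion

expected_per_year = accounts * false_flag_rate
print(f"Expected wrongly flagged accounts per year: {expected_per_year}")
# 0.01, i.e. roughly one wrongly flagged account per century at this rate
```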
So Apple code is now perfect?

They aren't looking for perfect hash matches. Their actual error rate will be many times higher than claimed. Given that billions of pictures are taken, there will be a non-trivial number of people affected by this.

But what's a few innocent people thrown in jail if we catch just one bad guy? Right?
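A side note on "They aren't looking for perfect hash matches": NeuralHash is a perceptual hash, so two images count as a match when their hashes are merely close, not byte-identical. Apple has not published the algorithm in full, but a toy "average hash" (purely illustrative, not NeuralHash) shows how similarity matching works and why chance collisions are possible at all:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to size x size grayscale, then set
    one bit per pixel depending on whether it beats the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A "match" is a small Hamming distance, not equality; resized or
# re-encoded copies stay close, but so can the occasional unrelated
# image, which is where false positives come from.
```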
 
I got you the first time. So you're saying that Apple is going through all this trouble for nothing. Not just implementing the scanning technology, but all the legal costs of developing those terms of service (not just for this upcoming update, but for all time past, present and future) were all for nothing, because any case arising from something authorized in the terms of service will be thrown out of court unless Apple has an actual video showing the defendant tapping on "agree".

I don't think so.
You sound confident for someone who started by saying they are no legal expert.
 
So Apple code is now perfect?

They aren't looking for perfect hash matches. Their actual error rate will be many times higher than claimed. Given that billions of pictures are taken, there will be a non-trivial number of people affected by this.

But what's a few innocent people thrown in jail if we catch just one bad guy? Right?

What the actual **** are you on about? First of all, I never said Apple code or anything is perfect, so stop putting words in my mouth. Secondly, no one is going to be thrown in jail for a false positive, because it will be manually reviewed. You're trolling, right? Surely you can't be this confused.
 
You sound confident for someone who started by saying they are no legal expert.

So do you. And if you were a legal expert, you would have mentioned it by now. The reason I'm confident that you're wrong is that Apple isn't stupid, and they're not going to waste time and money implementing a technology in iOS that won't accomplish anything. You don't have to be a legal expert to make that simple deduction. I'm not saying I understand the ins and outs of the entire legal process, but I'm confident that Apple and their legal team know what they're doing, unlike you or me, and aren't on a fool's errand.
 
Your justification is that a company must be right because they wouldn't try to do something if they didn't have the right to do it? Wow.

Of course they are going to spend time and money making it look like they have the right to do something. Best case is people believe them. The worst case is they are told they can't say something the way they said it.

Actually, no. The worst case is someone assumes they know better because they are rich and powerful.
 

I'm not trying to "justify" anything.

So you're telling me with a straight face, that Apple's legal team knows that every single case of prosecution for illegal images that arises from this scanning technology will be thrown out of court because there's no way they can prove the defendant physically tapped on "Agree" when installing the iOS update that authorized the scanning (and therefore the evidence was obtained illegally)? And yet they're still wasting tons of time and money implementing it? Again, I don't think so. They could fool the general public about what they have a right to do, but a judge isn't going to be fooled, and that's all that matters. Apple's lawyers know this and they're not idiots. Therefore I submit your theory about them having to physically prove someone clicked on "Agree" is wrong, because it makes no logical sense.
 
First off, child sexual abuse and the child sex trade are HUGE problems. Period. Yes, the problem has increased exponentially with the advent and growth of the web and global communication. I operated a Central American children's home for abused and neglected minors for 17 years. The vast majority of our residents were brought to us due to child abuse and trafficking of one type or another. The problem is real!

HOWEVER, this scanning of images on iCloud and devices, as well as communications between individuals, is coming from the company that continually states, "privacy and security matter!" Well, obviously they really don't. It's no different from a company manufacturing dangerous products under the slogan "Safety first!" That is demonstrably not so, but to the gullible maybe it can be convincing. Not unexpected, I suppose, from probably the world's largest manufacturer of disposable electronics, which also claims to be focused on 'green.' The fact is, someday this will be hacked and harm will be done to people having nothing to do with sex crimes against children.

IMO, what we need are truly severe penalties, up to and including capital punishment, for those found to engage in the trafficking of minors for sex. We also need to stop allowing politicians, entertainment elites and powerful oligarchs to get away with these things while also pretending to be mortified that people like Epstein and Weinstein built empires while taking advantage of young people and trading in their innocence to the rich and powerful.
 
I'm not trying to "justify" anything.

So you're telling me with a straight face, that Apple's legal team knows that every single case of prosecution for illegal images that arises from this scanning technology will be thrown out of court because there's no way they can prove the defendant physically tapped on "Agree" when installing the iOS update that authorized the scanning (and therefore the evidence was obtained illegally)? And yet they're still wasting tons of time and money implementing it? Again, I don't think so. They could fool the general public about what they have a right to do, but a judge isn't going to be fooled, and that's all that matters. Apple's lawyers know this and they're not idiots. Therefore I submit your theory about them having to physically prove someone clicked on "Agree" is wrong, because it makes no logical sense.
Yes. With a straight face, I am saying a trillion-dollar company would do things to create the impression they have rights they don't actually have, in the hopes that people believe it and act as if they do. The amount of money they make as a result of this statement far outweighs the cost. And Apple knows they can keep a case going for so long most people would be bankrupt, so they never test most of the rights they claim they have in front of a judge.
 
And Apple knows they can keep a case going for so long most people would be bankrupt, so they never test most of the rights they claim they have in front of a judge.

Huh? It appears you don't even understand the issue at all. It's not APPLE who would be prosecuting these people; it would be a federal agency. All Apple is doing is reporting evidence of a crime.
 
We are talking about how Apple can prove the user of the device was the one who agreed to the device's T&C that authorized the scanning of images.
 

Sigh... please follow the bouncing ball (there's a rough code sketch of these steps below).

1. device user has, let's say, 100 illegal images on their iPhone
2. the scanning technology flags all 100 of these photos as matching the hashes of known child abuse images
3. device user uploads all 100 of these images to iCloud
4. Apple reviews the images and confirms all 100 are indeed true matches and illegal
5. Apple reports the illegal images to the appropriate agencies and federal prosecutors bring charges against the device user

Now, you're trying to tell me that the judge is going to require the prosecutors to prove the device user physically clicked on the Agree button of the TOS for iOS in order for the prosecutors to submit the illegal images as valid evidence against the device user, right? Well, if that's the case, then they'll have to let all these device users walk unless they have surveillance video (which must also be proven to have been obtained legally) that clearly shows the defendant clicking on the Agree button for the specific TOS. We all know THAT'S not going to happen. Therefore, Apple's efforts were all for naught if your theory of how this works is true. Does that make a lick of sense to you? It doesn't to me. Therefore I conclude that the laws must be written in such a way that they will NOT require such specific evidence of the user clicking Agree in order to submit the illegal images as evidence and successfully prosecute these cases.
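For what it's worth, the five steps can be sketched in a few lines of Python. Names, the threshold value, and the distance parameter are hypothetical, and the real design's cryptographic safety vouchers and threshold secret sharing are left out entirely:

```python
from typing import Optional

MATCH_THRESHOLD = 30   # assumed value; Apple only described "a threshold of matches"
MAX_HASH_DISTANCE = 0  # 0 = exact hash equality; > 0 models fuzzy matching

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def scan_and_maybe_report(photo_hashes: list[int],
                          known_csam_hashes: set[int]) -> Optional[list[int]]:
    """Steps 2, 4 and 5 above, minus the upload and the cryptography."""
    # Step 2: flag photos whose hash is within the distance threshold
    # of any hash in the known-image database.
    flagged = [h for h in photo_hashes
               if any(hamming(h, k) <= MAX_HASH_DISTANCE
                      for k in known_csam_hashes)]
    # Steps 4-5: only past the match threshold does a human review the
    # flagged images and, if confirmed, report them; below the
    # threshold, nothing is reported at all.
    if len(flagged) >= MATCH_THRESHOLD:
        return flagged  # would be handed to a human reviewer
    return None
```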
 
This is a new discussion that has nothing to do with what we were talking about.
 
That's so wrong! Hey Apple... focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.



Isn't this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them or what?

Well over 34,000 photos. You've got to be kidding me.

I’m scared 😱 Should I be worried?
It really feels like an invasion of privacy. I think we are being too kind to tech companies. First they do Siri queries and photo recognition, etc. in the cloud (like Google), then they move it "on-device." It's just becoming more and more difficult to say it's "your" phone or "your" laptop. You just have to trust these companies with whatever hashes they are computing on device. I just don't see people pushing back. This is just one example of many. Of course I'm for protecting children; I just feel we continue to place more and more power into the hands of big tech, and we shouldn't.
 
What the actual **** are you on about? First of all, I never said Apple code or anything is perfect, so stop putting words in my mouth. Secondly, no one is going to be thrown in jail for a false positive, because it will be manually reviewed. You're trolling, right? Surely you can't be this confused.
Manually reviewed? So you are okay with photos of your own child being pulled up as a false match, and with all the questions from God knows what kind of officer you'd be talking with? We will face the same discrimination issues (race/religion/gender, what not) with this kind of reviewing/policing.
 
I understand the core of your comment, but I don't think someone should save a picture when they find out that person is 15!
I agree. But it is possible that you will never find out that the girl is 15 if she looks older in the pictures.

People share stuff on the web and you easily lose track of where a particular image came from.

Somebody at the beginning of the chain might know that the nude is of an underage girl, but if you get such a picture from "Joe", that in turn got it from "Alan", that in turn got it from "James" that in turn found it on a website... the chances of you knowing are slim to none.

Basically, the only way to be actually safe is to save to your photo library only pictures of moms and grannies (the ones you can be 100% certain are nowhere close to looking 18).
 
I suppose it's theoretically possible to be tricked into saving an image of someone who is in the database they are scanning against. You are then flagged up by a matching on-device scan, and days/weeks/months later you hear law enforcement breaking down your door, before they seize all your computer equipment for forensic examination.

Do I think it's likely? No.

Apple has every motivation to build in all the safeguards necessary to ensure no horror stories such as the ones raised by you ever turn into reality. It will be an absolute disaster for their reputation if things did start going horribly wrong with this.
I think you are right saying that the chances are slim. But then you could also justify any other exploit that puts user privacy at risk in the name of doing something good.

Also, remember that when somebody is sharing content over the internet, you cannot tell where it actually comes from.

Let's say you get a nude of a girl that looks 18-21 years old from your friend "Joe" and you save it to your photo library.

Joe got it from Alan, that in turn got it from James. James found it on a website. The guy who uploaded the picture on the website is somebody who tricked this girl into posing nude.

After a while this girl goes to the police to report the abuse/revenge porn and the authorities flag her nudes because she is only 15 years old (even if in the picture she looked older or at least not underage).

Now you have a picture in your photo library that has been flagged, and your account is reported to the authorities.

Can you imagine what kind of consequences you would face as a private citizen or even more as a public figure if only people suspected that you owned child pornography because of an investigation triggered by an automated system like this?
 
This is an invasion of privacy. I can understand Apple et al. wanting to protect the integrity of their servers, etc. However, to be able to, to be allowed to, invade individuals' phones is the not-so-thin end of a very big wedge.

There will be mistaken allegations, people's reputations destroyed, etc. Think of baby baths, or different cultures such as Japan, where family baths, indeed communal public baths, etc. are common.....

And of course, every authoritarian and/or dictatorial government will want the technology for their own legal purposes. Think China, Burma, Russia, Saudi, etc. As has been well reported, tech companies are experts at using the defense of "we must follow the laws of the country".....Does anyone seriously think Apple will walk away from the China market when asked to use the technology to find demonstrators?

Finally, how long before this technology will be everywhere in the Apple universe; iPads, desktops, MacBook Air, MacBook Pro.....?
 
The justification for a “solution” to a “problem” starts with referencing something abhorrent that nobody in their right mind could possibly defend.

You can’t object to this technology because that would mean you’re supporting child abusers.

You can’t object to anti misinformation technology and laws because that would mean you’re supporting neo nazis and fascism.

Ultimately there’s a long list of things we can do to help combat both of those issues WITHOUT compromising on peoples privacy and freedom of speech in the process but none of those solutions would come with the desired side effect that these people really want: control over what information gets shared amongst the population. I put “solution” and “problem” in inverted commas for a reason, the people in power are the ones setting the narrative on how to define the problem and solution. God forbid they instead choose a solution that actually aims to fix these issues at their core.
I have a solution. Everyone stop using the cloud services that exist.
If everyone boycotted these services, these companies would take a massive economic hit, which is the only pressure they really listen to.
But in today's social media world, where data mining is the real goal of all these so-called services, privacy is just an illusion.
 
I don't even understand why MacRumors commenters are so worked up about it. I mean, MacRumors comments have been telling you for YEARS.

It is Apple's iPhone, Apple's iOS, Apple's App Store and Apple everything. They are not yours. If you don't like it, go somewhere else.

/s
 