Stop and think about that: “Hey, some complete stranger, unbeknownst to me, reviewed an image of mine that I did NOT put on a public server, and that’s OK because I didn’t know about it.”
That may not be what you intended, but that’s how it came across.
Huh? A false positive would immediately be recognized and dismissed once it is reviewed by an actual person. It would never even be sent to law enforcement. In fact, you'd probably never even know about it.
And no malicious app has ever gotten into the App Store, because Apple's review methods never fail?

False positives will be forwarded to law enforcement. Police will storm into someone's house with a no-knock warrant and people will be killed. Apple will lose billions in court for this. This is the apex, sell your shares.
 
Sooo many of you didn't even read the article.

We all read the article, but some of us fail to understand the technology beyond what the article tells us.

Continually claiming "it's just scanning hashes!" is an oversimplification of what Apple has described in the article. The system Apple themselves have described is not a plain one-to-one hash comparison against known content; it's capable of detecting altered versions of known content (crops? color changes? warping? who knows?), which means there's some element of filling in the gaps between the base comparison abuse image and the image being scanned. That's where the system is open to abuse or false positives (e.g., crafting images that look like white noise to the end user but look like bad content to the computer).
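To make that distinction concrete, here is a minimal perceptual-hash sketch in Python. This is a generic difference hash, not Apple's unpublished NeuralHash; the Pillow library and the file names are assumptions for illustration. Visually similar images land within a few bits of each other, so a "match" is close-enough rather than exact:

# Generic difference hash (dHash) -- illustrative only, NOT Apple's NeuralHash.
from PIL import Image  # pip install Pillow

def dhash(path, size=8):
    # Shrink to (size+1) x size grayscale; crops, recompression, or mild
    # color edits barely change the result, unlike a cryptographic hash.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a, b):
    # Count of differing bits between two hashes.
    return bin(a ^ b).count("1")

# A small Hamming distance means "probably the same picture" -- exactly the
# fuzziness the false-positive argument is about.
if hamming(dhash("original.jpg"), dhash("recompressed.jpg")) <= 5:
    print("near-duplicate detected")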

And that's just critiquing the technology itself, rather than the precedent of scanning all images on my phone, which is an even bigger deal in my opinion, regardless of intention or implementation. The "it's just scanning hashes" argument shouldn't even be a thing in the first place: my photos on my device should not be scanned by any system unless I give it explicit permission to do so!
 
Standards for morals and unacceptable behavior change over time. What might be perfectly alright today may not be okay some years into the future. Photos that are considered fine by today's standards might be offensive in the future.

Once they have the technology available, what stops the wrong person in power from scanning for, e.g., people wearing rainbow-colored watch bands? Combine this with face-identification routines and you have a very powerful tool to spy on whomever you deem dangerous to your power. 10/10 dictators like that.

And it's not unprecedented, unfortunately: they are already changing words in reprints of decades-old books because some wording is considered inappropriate by today's standards, even though it was considered perfectly fine at the time the book was released.

"Thought crime", anyone? Newspeak is already here; this time it disguises itself as pictures and, as is often the case, uses an honorable cause as its inroad …
 
100% chance that this is just step 1 toward scanning our content for other types of content at the behest of law enforcement/government. Once the camel has its nose under the tent…

Yes, I understand the privacy concern, but I'm responding to the people acting all alarmist about false positives (separate topic).
 
Hey people, even if there are false positives, even if it is not well implemented: we are talking about children. This could help discourage criminal networks and monsters; this is a step (small or large) toward protecting children. Don't you think tracking down abusers based on a known database of images is more important than you worrying about your nudes? Even if you're talking from ignorance (because you didn't read the article), some of the comments here show the decadence and selfishness of us humans.
It's a step in privacy erosion with children as a shield. Nothing more, nothing less.
 
Parents are in for a surprise when they find out all their high schoolers are sharing nudes.

My kids say most of the kids swap images, so by the time you graduate high school you'll have a large collection of what will suddenly become child porn.

What a world.

Any tools that allow parents to help their children navigate the current digital age according to the parents' preferences are welcome. Nothing's perfect, but these seem like reasonable tools.

The searching for child porn is trickier. Who maintains such a database? How up to date is it? Can I troll someone by getting a harmless picture of their child onto it?
 
All it takes is for someone to mention child safety, and suddenly, if you are opposed, you're not only selfish but "must be a criminal". It's infuriating and wrong, and it's the same stuff repeated by every oppressor in history.

This new feature is definitely a way for Apple to ease into pleasing CCP demands that they get to spy on Chinese consumers' iPhones for political content they do not like.

Come on, this is APPLE. Suddenly everyone thinks they're more concerned about safety than device privacy, and that makes sense to anyone? This is the company that refused to give the FBI access to iPhones to fight literal terrorism. Suddenly that argument is out the window all in one sweep. They know exactly what they're doing and where it leads. They did it on purpose, because they care more about being allowed to sell iPhones in China than about user privacy and standing up to government oppression such as NSA surveillance, which they've always been happy to do in the US. They get to do that in America because we are still free enough for a company to say no to the government. Well, not overseas. And when the others are doing it? The FBI is going to demand the same tools in no time.
 
I’m sorry to tell you this, but despite the advertisements, nothing digital is private.
Even though at the moment there's a low chance of it ever happening, it's still theoretically possible to break encryption.
If *this* is where you draw the line at what you’re willing to have on your phone, that says a lot more about you than it does Apple.
Not that I don't see the implications of this kind of technology being used for more worrying purposes in the future, but right now it's only being used against criminals. If you're not a criminal, then stop worrying.
And if you truly want everything of yours to be secure and private, well… too late, because if you're reading this, your data isn't *100%* secure.
 
Standards for morals and unacceptable behavior change over time. What might be perfectly alright today may not be okay some years into the future. Photos that are considered fine by today's standards might be offensive in the future.

Once they have the technology available, what stops the wrong person in power from scanning for, e.g., people wearing rainbow-colored watch bands? Combine this with face-identification routines and you have a very powerful tool to spy on whomever you deem dangerous to your power. 10/10 dictators like that.

And it's not unprecedented, unfortunately: they are already changing words in reprints of decades-old books because some wording is considered inappropriate by today's standards, even though it was considered perfectly fine at the time the book was released.

"Thought crime", anyone? Newspeak is already here; this time it disguises itself as pictures and, as is often the case, uses an honorable cause as its inroad …
Except they're not "scanning" photos; they're matching hashes against the NCMEC database.
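For contrast, plain cryptographic hash matching (what "just matching hashes" literally describes) only catches bit-identical files; re-saving the JPEG changes the hash completely. A minimal sketch, with a hypothetical placeholder standing in for the database:

# Exact cryptographic hashing -- any re-encode of the image breaks the match.
import hashlib

def sha256_of(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

known_hashes = {"0000placeholder0000"}  # hypothetical database entries
flagged = sha256_of("photo.jpg") in known_hashes

That fragility is precisely why the article describes a system that also catches altered versions of known images, which is more than one-to-one hash matching.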
 
Hey people, even if there are false positives, even if is not well implemented. We are talking about children, this could help discouraging criminal networks and monsters, this is a step (small or large) to protect children. Don't you think tracking down abusers based on a known database of images is more important than you worrying about your nudes?.. even you talk from ignorance (because you didn't read the article) some of the comments here show the decadence and selfishness of us, humans.
I don’t think being worried about false accusations of child pornography is selfish. YMMV.
 
And Apple's public claims of protecting user privacy could be seen as making the contract terms fraudulently misleading. Since the T&C don't require one to wait a reasonable amount of time to actually read them before agreeing, Apple could be accused of misleading customers with public statements and then hiding contrary language in a document designed to be ignored. And since Apple doesn't allow one to perform partial updates, an argument could be made that there is a violation of mutuality of obligation.

Also, there is the question of who agreed to the terms and who is using the device. Anyone who lets their grandkid set their phone up, or had their device configured by the VZW rep while they went to get something to eat at Red Robin or wherever, never had the chance to agree to the terms.

I'm no law expert, but none of that makes any sense. You are given a chance to read the terms - Apple doesn't need to lock out the "Agree" button for 10 minutes or whatever to make it legally binding. And if you let someone else set up an account in your name without supervision and review, that's on you. Maybe things like this will make people start being a bit more vigilant about things like that.
 
I sense the iPhone 13 family may be the first Apple products to be boycotted worldwide!
Boycotted in favor of… what, exactly?
The Google Pixel 6 series?
Lmao, get serious for a second: 99.99999999999999% of people will never even know this feature exists, unless they're doing something wrong.
 
I'm no law expert, but none of that makes any sense. You are given a chance to read the terms - Apple doesn't need to lock out the "Agree" button for 10 minutes or whatever to make it legally binding. And if you let someone else set up an account in your name without supervision and review, that's on you. Maybe things like this will make people start being a bit more vigilant about things like that.
Apple rolls "critical" security updates into their system updates all the time.

I'd guess this is the number 1 reason why people automatically update their systems.

To place such an invasive piece of software in an update, and then for people to claim that "well, the user accepted the terms!" is just. plain. nuts.
 
The main problem I see with this is that it demonstrates an access point, or backdoor, to people's phones that Apple supposedly didn't have before, and on a massive scale too. As per the article, this doesn't scan iCloud; it scans the user's phone before anything goes to iCloud, then reports the phone to Apple if something is found.

Which means they can potentially scan users' phones against any database. Once you get past the apps, it's all a hierarchical filesystem underneath.

What's to stop a dictatorial country from demanding Apple scan against its own databases of known illegal images, then report its citizens? Perhaps databases of adult pornography or gay imagery, even political or religious imagery.

It's also scanning text messages for types of images, not just those found in databases, which potentially means any type of image may be flagged by this system, even one that doesn't exist in a database; all they'd need to do is change the criteria.

And if they have the ability to scan images on users' devices, text would be far simpler: flagging keywords in messages or outlawed books.

Apple knows full well how badly this may be twisted. That's why they introduced it with the tried-and-true method of "Won't somebody think of the children."
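For what it's worth, the flow described above would look conceptually like this. This is a hypothetical sketch with invented names (it reuses dhash and hamming from the earlier sketch) and is in no way Apple's actual implementation:

# Hypothetical pre-upload flow as the post describes it -- NOT Apple's code.
def report_match(path):
    # Stand-in for "reports the phone to Apple if something is found".
    print("flagged for review:", path)

def scan_before_upload(path, hash_db, threshold=5):
    h = dhash(path)  # perceptual hash (see earlier sketch), not an exact hash
    if any(hamming(h, known) <= threshold for known in hash_db):
        report_match(path)
    # the photo then uploads to iCloud either way (upload code omitted)

Note that nothing in this flow is specific to one database: swap hash_db for a different set of hashes and the same machinery scans for anything.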
 
Apple rolls "critical" security updates into their system updates all the time.

But we're not talking about a critical security update, so that's irrelevant. This would be something you'd have to specifically agree to as part of the terms of installing the update.
 
Standards for morals and unacceptable behavior change over time. What might be perfectly alright today may not be okay some years into the future. Photos that are considered fine by today's standards might be offensive in the future.

Once they have the technology available, what stops the wrong person in power from scanning for, e.g., people wearing rainbow-colored watch bands? Combine this with face-identification routines and you have a very powerful tool to spy on whomever you deem dangerous to your power. 10/10 dictators like that.

And it's not unprecedented, unfortunately: they are already changing words in reprints of decades-old books because some wording is considered inappropriate by today's standards, even though it was considered perfectly fine at the time the book was released.

"Thought crime", anyone? Newspeak is already here; this time it disguises itself as pictures and, as is often the case, uses an honorable cause as its inroad …

"some years into the future" is sooner than we think considering the current Presidential administration declaring "climate extremists and activists" and "extremists that oppose the capitalist system of the US" as terror groups, lumped into the same category as fascists, white supremacists, etc.

Makes sense; there's practically no difference between someone who doesn't want the planet to implode and someone who wants to kill people for no reason other than hate. /s

The same people thinking this system will not be abused, or that we should only put up a fuss after the system is abused, are the same group of people who gasp and claim "impossible!" whenever a previous example of the USA doing the "impossible" by grossly violating people's human rights is brought up. Current systems that are abused TODAY (i.e., Patriot Act-fueled mass surveillance programs and illegal warrantless searches of computers at the US border/airports) are still ongoing because the "complain when the bad thing happens" crowd actually means "I'll complain when I'm personally affected by this thing." By that time, it's already too late.
 
So, it was all BS, all of it.

I don't even have 200 photos in iCloud, so I have less than nothing to worry about, but I'm out if they do this. I don't know where I'll go, because the major alternatives will probably start doing it too. Maybe it's time to go gangsta with a flip phone.

It won't stop here either: if they have your data and information, it will get used, always, which Apple just proved.

And FFS, if there was ever a company with the resources to do the right thing, it's Apple. But once again, Cook is selling out to save pennies of Apple's billions hiding in some offshore tax haven.

Your problem here is that, with how the 4A and subpoenas go, anywhere you go for cloud storage will have the same problem. You won't physically own your data on those servers; the people who own the servers will. This is the same problem everyone has with Facebook: when you upload any photos or make any posts to it, they own those posts and pictures, as they are now in possession of them, not you.

That concept carries over to any cloud service, so user beware.

BL.
 
And no malicious app has ever gotten into the App Store, because Apple's review methods never fail?

False positives will be forwarded to law enforcement. Police will storm into someone's house with a no-knock warrant and people will be killed. Apple will lose billions in court for this. This is the apex, sell your shares.
As literally pointed out earlier in this thread, other companies have already been doing this for decades!!!
Do you think Google Photos users are worried about people storming into their house and killing them over false positives? Of course not.
Completely ridiculous argument. Apple's not losing billions over anything; they're not the first huge company to implement this.
 