So…they will continue to provide any legally requested data from iCloud…like they always have since iCloud existed…nothing changes…got it!
Except that with CSAM, there is a back door on your device. Hopefully they bury it for good.
…oh…you mean the same “back door” that has been scanning your personal photos for POIs, breeds of dogs and cats, and types of flowers for years now?
Are those scans being shared with the government? Jesus, why can’t some people see past Apple’s world? This is bad.
People may have different views of this than you. Not everyone cares. For those who care enough to make a change, there are options.
The ones you are referring to aren't being shared with the government either.
We're complaining now, and the timing of the complaint has no bearing on the merits of our concerns. And I guess you missed the part whereby once a threshold number of 'hits' is exceeded a human being will review your private picture. That is new, and we have no idea what the false positives might look like - normal pictures of your kids? You partner wearing lingerie or a swim suit? Who knows? You won't, because Apple will not tell you which pictures were flagged and which ones were examined by a human being, and you won't know who that human being was, nor how well they were vetted not to be a perv. Using local AI on the iPhone to police users without any indication of criminal activity is new. Claiming that the new system detects illegal content but guarantees privacy (it doesn't) is new. You're focusing on the engineering, which is what Apple has done, without taking into account the larger social and legal context....
The post I was responding to was referring to a "back door" as if Apple was adding something that did not exist before. In this case, it HAS existed for years now, and I never heard anyone complain about Apple scanning images "on device" before.
...
This is the part that you (and many others) either do not understand or do not want to believe the evidence given to you. Or my favorite part: people not believing Apple's detailed explanation and instead linking to some online know-it-all showing how a perceptual hash of a cat image can be read as a dog. Not the same thing or even close to what Apple is doing, despite people trying to dumb it down or compare it to that.
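For context on "perceptual hash": it is a short fingerprint computed from image content, built so that visually similar images produce similar or identical fingerprints. Below is a minimal, generic dHash-style sketch in Python for illustration only; it is not Apple's NeuralHash (a learned, far more elaborate system whose details are not public), it just shows the basic mechanics being argued about.

# Minimal sketch of a generic perceptual hash (dHash-style), for illustration only.
# This is NOT Apple's NeuralHash; it only shows the idea of hashing image content
# and comparing two hashes by Hamming distance.

def dhash(pixels):
    """pixels: 8 rows x 9 columns of grayscale values (0-255).
    Each bit records whether a pixel is brighter than its right-hand neighbour."""
    bits = []
    for row in pixels:
        for x in range(8):
            bits.append(1 if row[x] > row[x + 1] else 0)
    return bits  # 64-bit perceptual hash

def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A toy 8x9 "image" and a slightly brightened copy of it.
original = [[(r * 20 + c * 7) % 256 for c in range(9)] for r in range(8)]
near_copy = [[min(255, v + 3) for v in row] for row in original]

print(hamming(dhash(original), dhash(near_copy)))  # 0: the brightened copy hashes identically

The real system layers cryptography on top of this basic idea (a blinded hash database, private set intersection, threshold secret sharing, per Apple's technical summary), which is part of why simple cat/dog collision demos on generic hashes are not a one-to-one comparison.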
If the ones we are talking about aren't being shared, why scan?
Please read the multiple white papers and the FAQ from Apple if you want to see what Apple is actually doing since your statement is 100% incorrect.
I am a neuroscientist who has studied biological information processing for decades. Consequently, I have been following the research literature in machine perception and machine learning since the early 1980s because it is relevant to my research. I have read the Apple literature. I know very well the outline of how the proposed system works, although Apple, of necessity, is keeping many details hidden. Frankly, I am getting really tired of those blithely accepting Apple's proposed system accusing me of not educating myself.
Trillion-to-one odds is enough for me as the chance that I will have multiple known CSAM images on my phone that are also uploaded to iCloud and then reviewed (as low-res, btw) by someone at Apple to confirm or deny that they are indeed matching images from that limited database.
...trillion...to...one....
Again, you don't have to believe it (or understand how it works), but that's on you, not me.
An article from Forbes that can be read in 30 seconds? C'mon...
Let me be clear: I believe Apple's description. However, their system can be copied or subverted, the latter prevented only by the promises of Apple's current management. The system is prone to an unknown rate of false positives of unknown characteristics. I don't trust Apple's estimate of the false positive rate because I suspect it doesn't take into account that people often take a series of pictures that are very similar and that the statistics of natural images are constrained - Apple might be overestimating the variety in pictures that people take. The calculations for this estimate have not been made public so far as I can tell, so I cannot say for certain. The one thing that is certain is that the only way to know for sure is to actually try the system on customers, in a massive uncontrolled experiment. Moreover, the proposed system entails the possibility of a human being looking at pictures without the user's knowledge or specific consent. And ultimately it will be ineffective, as pedophiles learn to evade the system by modifying images.
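To make the false-positive arithmetic concrete, here is a back-of-the-envelope Python sketch of how a per-photo false-match rate and a match threshold combine into the account-level odds being debated. Every number in it is a hypothetical placeholder, not Apple's actual parameter, and it assumes false matches on different photos are independent, which is exactly the assumption questioned above for bursts of near-identical pictures.

# Hypothetical account-level false-flag probability under an independence assumption.
# The inputs are illustrative placeholders, not Apple's published or internal numbers.
from math import lgamma, log, exp

def log_binom_pmf(n, k, p):
    """Natural log of the probability of exactly k false matches among n photos."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def p_account_flagged(n_photos, p_false_match, threshold, extra_terms=200):
    """P(at least `threshold` false matches), summing the dominant tail terms in log space."""
    top = min(n_photos, threshold + extra_terms)
    return sum(exp(log_binom_pmf(n_photos, k, p_false_match))
               for k in range(threshold, top + 1))

# Example: 10,000 photos, a 1-in-a-million per-photo false-match rate, threshold of 30 hits.
print(p_account_flagged(10_000, 1e-6, 30))  # astronomically small, *if* the assumptions hold

The sketch only shows why the headline odds are so sensitive to the per-photo rate and to the independence assumption; whether either holds in practice is what cannot be verified from the outside.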
Did you read the link I posted above about the successful attempts by computer science researchers to bypass detection in systems like the one Apple proposes? It is likely that pedophiles would use such tactics, strategically changing some aspects of a picture to escape detection. Did you read the conclusion of the researchers, which states that Apple would have to relax its criteria to detect selectively modified images to the point they would trigger >1 billion false positives a day? Here's the link again: https://forums.macrumors.com/thread...esearchers-in-new-study.2317024/post-30610726
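The other half of that trade-off can be shown with a small calculation: if the matcher is made more tolerant (a larger allowed Hamming distance) so that edited copies still match, the chance that two completely unrelated hashes land within the match radius grows very quickly. The sketch below uses idealized random 64-bit hashes and is purely illustrative; Apple's NeuralHash output and matching rule are different and not fully public.

# How loosening the match threshold inflates chance collisions between unrelated images,
# for idealized, uniformly random 64-bit perceptual hashes (illustrative only).
from math import comb

def p_random_match(bits, max_distance):
    """Probability that two independent random hashes differ in at most `max_distance` bits."""
    return sum(comb(bits, k) for k in range(max_distance + 1)) / 2 ** bits

for d in (0, 4, 8, 16, 24):
    print(f"allowed distance {d:2d} bits -> chance-collision probability {p_random_match(64, d):.3e}")

That is the tension the researchers describe: tighten the radius and modified images slip through; widen it and ordinary, unrelated photos start colliding.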
...
Your second paragraph is clear; you don't believe them. I said that's fine.
And in your first paragraph you say that "Apple, of necessity, is keeping many details hidden." No one really knows how their system differs beyond what they have described (which in itself has shown that it IS different from existing hash programs). So how can it be fairly evaluated or compared at this point?
And I think that one of the largest companies in the world, whose whole business is based on hardware and accompanying software, might take some of your concerns about false positives into consideration, no? Do you really think they will turn on a system that hasn't already been tested (not just in theory) and will be surprised by billions of false positives on the first day? Again...C'MON!!
I think we all know that Apple does not need our permission to turn this on, and I believe they have already done so, even if it is limited to their own employees or some other small user group at this point. Criminals already know how to not get caught by this system: don't use iCloud. But if there is one thing in life that I am certain of, and that has been proven over and over again, it's that 99.9% of criminals are stupid and always get caught.
Wow. So many answers to a question about a feature Google and Amazon have already had for over 10 years.
The short answer: No. You shouldn’t.
Can you explain your comment a bit? Neither Google nor Amazon has implemented client-side, on-device scanning, and their cloud scanning applies only to shared content.
Also, don't forget that nothing would be scanned in this proposed system if the user chooses not to upload their photos to iCloud, just like how Google or Amazon can't scan anything that isn't uploaded either.
Understood, but opting out reduces the functionality of iCloud, so it is a choice between two bad options. As a customer I am not happy with that.
So the other option would be to scan the cloud and then you’d be in the same boat.
Actually I find that more acceptable. My iPhone is my property. Apple's servers are theirs. I doubt Apple would accept me scanning their servers to look for illegal content, just in case I might find something.
More importantly, Apple's scheme leverages local AI processing (in the broadest sense) on mobile devices. The capability of local AI is only going to increase. It could reach a stage at which your phone refuses to record certain events or scenes, based on the whims of an authoritarian government. Alternatively, the camera and microphone might be continuously scanned by local AI to report you whenever you express thought crime, again at the whim of authoritarian regimes. Democracy has been under attack for some time now - do we really want to hand authoritarian governments more surveillance capabilities? I guarantee you that dictators around the world are having their government experts scan Apple's white papers about the CSAM system and asking whether the system can be replicated and modified to detect anti-government messages, flags, posters, memes, speech, etc. Apple appears to have looked only one move ahead. We need to be looking many moves ahead.
I’m sorry, but at what point in the current process does any government write the code or have access to the creation of iOS? And if they did, since this tech is already on your phone, again, where was your concern when Apple added it nearly 5 years ago?
The government has NOTHING to do with this. Apple is doing this on their own.
To be fair, your assertion that AI will report my “thought crimes” at the whim of authoritarian regimes because of this started my day off with a laugh.
Btw…Apple only put a camera and microphone on your phone so governments could watch and listen to you.
Whoops!! The chip planted in me when I got my COVID vaccine is telling me it’s time to get up and go to work.
Comparing actually concerning developments with wearing tin-foil hats is as inappropriate and wrong as it gets.
Nope..."thought crimes" and "tin foil hats" (which I never mentioned) actually go hand in hand in this case.