I think it’s funny people worry about this so much, yet Google and others are scanning your pictures too, without a doubt. It’s not a matter of if they might launch something like this, but when, for every service.
As I said, it’s a slippery slope. What’s illegal can change on a daily basis.
Either way, it’s a slippery slope. Especially for a company that advertises itself with all those privacy slogans. CSAM is just the first step, playing the safety-of-children card first. After that they’ll just start adding more “features”.
If you need to ask the opinion of others (people you don’t even know) concerning your buying decisions, and you flip-flop because of it, then you really shouldn’t be buying anything.
And slippery slope arguments are fallacious unless you can prove that this "first step" is logically related to these other steps that you think will lead us to "hell" (so to speak). The thing is, it's not.
That’s what you’re saying right now. Just take covid for example. All those vaccine passports and booster shots were conspiracies until they weren’t.
Prove that they are doing it on my device.
So the answer is yes.
Actually, you bring up an important point. Just because something you "predicted" would happen comes to pass doesn't mean your logic was valid. To take a silly example, I could argue: "Dogs have four legs. Humans have two legs. Therefore, I'm going to have a car accident today." Now, if I indeed got into a car accident, does that mean my logic was valid because my conclusion was true? Obviously not. Or to take a slippery slope example: "Don't let your kid get their driver's license. Before you know it, they'll be picking up hitchhikers and getting kidnapped." Now, that very well may happen, but that's not a direct logical result of someone getting their driver's license. It's a very poor argument.
There's basically no technology out there that can't be exploited and abused, especially by people in power. That's no reason to not continue to develop technology.
Of course tech development should continue, but what Apple wants to develop is not where tech is supposed to head. There are other, more important things to develop: better and longer-lasting batteries, for a start, and so on.
The sad reality is that the world is heading toward total surveillance and control over the individual. For every new advancement toward this, they say it’s for xxx safety. And I mean, yes, on paper I’m all hands on deck for the safety of children, but whose safety comes after that? It’s just a matter of time.
That's obviously an opinion and one that apparently Apple and many others DON'T share with you (and of course many DO share it). And it's a straw man to imply they are focusing just on this to the detriment of other technological developments. Obviously this is merely one of the many, many things they are working on.
Again, this is fallacious reasoning. And Apple is not interested in surveilling what you have on your local iPhone storage. They never have and I believe they never will. But when it comes to using their optional cloud service, you are now moving files from your local iPhone storage to someone else's computers (servers). They have every right to ensure you aren't uploading illegal material to their servers. If you really don't want Apple to know about ANYTHING on your iPhone, then I'd suggest not using iCloud, as it's not 100% private - Apple has always had the ability to decrypt and view your files/data stored there.
We might never know who is interested in surveilling our data. Probably (maybe) not Apple, but the government. Apple didn’t play along with the government regarding backdoor access to iDevices so this might be where they agreed to play along or face consequences.
Weird that they haven’t been interested in illegal MP3s and movies uploaded to iCloud all those years, but suddenly they’re so interested in photos.
Apple's way is completely blind until it gets to the server, so no, your phone has no idea whether the scans are a match or not until they're processed on Apple's servers.
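To illustrate the idea being argued here, below is a toy sketch of how a server can check uploads against a database without the device ever learning the result. This is NOT Apple's actual protocol (which uses NeuralHash, private set intersection, and threshold secret sharing); every name here is made up, and SHA-256 stands in for a perceptual hash. The point is only that if the database is blinded with a key the device never holds, the match decision can only happen server-side:

```python
import hashlib
import hmac

# Hypothetical server-only secret; it never leaves the server, so the
# device cannot reproduce the blinded database entries locally.
SERVER_KEY = b"server-only-secret"

def image_hash(image_bytes: bytes) -> bytes:
    # Stand-in for a perceptual hash (e.g. something like NeuralHash).
    return hashlib.sha256(image_bytes).digest()

def blind(h: bytes) -> bytes:
    # Server-side blinding step: keyed HMAC over the raw hash.
    # Without SERVER_KEY, the device can't compute or compare these values.
    return hmac.new(SERVER_KEY, h, hashlib.sha256).digest()

# The server precomputes a blinded database of known-bad image hashes.
known_bad_db = {blind(image_hash(b"known-bad-image"))}

def server_check(uploaded_hash: bytes) -> bool:
    # Only the server, holding SERVER_KEY, learns whether this matched.
    return blind(uploaded_hash) in known_bad_db

# Device side: it computes and uploads a hash, but learns nothing about
# whether that hash is in the database.
print(server_check(image_hash(b"vacation-photo")))   # no match
print(server_check(image_hash(b"known-bad-image")))  # match, server-side only
```

In the real system the matching math is far more involved (the device also can't see the database at all, and nothing is revealed until a threshold of matches is crossed), but this toy captures the "blind until it reaches the server" property the post describes.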
Would you prefer I make up a word? That's what it is, so that's what I call it. It's an informal logical fallacy called "appeal to emotion" (argumentum ad passiones) that is rampant in our society in many different arenas. People fall for it all the time, which is why you and others use it - far easier than actually using reason and evidence.
You can use any word you want. That is up to you.
Of course, but I choose to use the accurate word.
I don't have a percentage, but I'm guessing a WHOLE lot more than 0.1% of Apple consumers have CSAM collections. It is a HUGE problem world-wide, and there's nothing special about Apple consumers that would make them any different from the rest of the population. I can promise you Apple isn't going through all this blood, sweat, and tears for a 0.1% problem.

Sad but true when you think about it. I guess I'm being naive and like to believe this world is a lot better than it actually is.
Once again, the term used is your choice.
CSAM concerns to me seem overblown. As I understand it, Apple will be scanning for known images or faces of children on iCloud. If something is flagged, then a human will review the image and it escalates from there. For me, I would imagine 99.9% of Apple consumers do not have sexual images of children so things will not be flagged/escalated. If an accidental flag happens, a human will review and resolve. Can this technology be exploited? Maybe. Are we fools for expecting 100% privacy in the year 2021? Probably.
They're not merely blocking offending content from going onto iCloud like you say. That would be understandable. But they're also reporting you to the authorities if you attempt to upload it. That's the mass surveillance part, and there's no excuse for it.

That's simply untrue. This is exactly what I'm talking about - you've clearly bought into the conspiracy theory stuff surrounding this. ALL they're wanting to do is scan for illegal imagery (which is not protected by the Constitution as free speech) to prevent people from uploading it to iCloud, because that's a violation of the TOS. If you don't like that, then you can disable iCloud for photos. If you think Apple is going to "hand the keys over" to governments to abuse this system, then what's your evidence for this? Do you have a secret recording of Apple execs discussing this with top government officials or something? What motive would Apple have for doing so? Heck, they wouldn't even give in to the FBI and give them access to a terrorist's iPhone. Now you think they're suddenly going to lie down and be walked all over? I don't think so.
So I take it you never read product reviews? 🤔

I actually don't. I ask friends who know their stuff. According to product reviews, the best phone is some random piece of **** Android, cause nobody gets clicks hyping up something typical. The OnePlus Maxx Pro China+ 5G or whatever.