
Blue Quark

macrumors regular
Original poster
Oct 25, 2020
196
147
Probabilistic
So, here's a question I never thought I'd ask: I'm beyond the return period for my M1 MacBook Air. I'm seriously thinking about selling it in light of what Apple's doing with iOS and their iPhone devices.

Is anyone hearing anything about them going to or deliberately not going to do this in macOS? Because, at this point, I really don't feel like I can trust them.

And I really, really hate this because, hardware-wise, it's a great laptop.
 

Mikael H

macrumors 6502a
Sep 3, 2014
864
539
Blue Quark said:
So, here's a question I never thought I'd ask: I'm beyond the return period for my M1 MacBook Air. I'm seriously thinking about selling it in light of what Apple's doing with iOS and their iPhone devices.

Is anyone hearing anything about them going to or deliberately not going to do this in macOS? Because, at this point, I really don't feel like I can trust them.

And I really, really hate this because, hardware-wise, it's a great laptop.
If you use any cloud services, you're subject to this kind of control. The on-device scanning is a separate solution and - as yet - opt-in and for kids only.
You of course decide what software you're using on your Mac, so that part is less dependent on Apple's central choices. But sure: if you want privacy at the cost of convenience there's always OpenBSD on a Thinkpad.
 

sstreky

macrumors newbie
May 9, 2021
13
17
Blue Quark said:
So, here's a question I never thought I'd ask: I'm beyond the return period for my M1 MacBook Air. I'm seriously thinking about selling it in light of what Apple's doing with iOS and their iPhone devices.

Is anyone hearing anything about them going to or deliberately not going to do this in macOS? Because, at this point, I really don't feel like I can trust them.

And I really, really hate this because, hardware-wise, it's a great laptop.
It doesn’t seem so unreasonable. Anything the machine finds suspicious ultimately has to be reviewed by a real person.

I can tell you, working in a financial institution, that this happens every day with your finances: machines constantly scrutinize your every transaction and look for certain patterns.

Mainly to prevent money laundering, but if yours is flagged, it does get reviewed by a compliance officer who then decides whether or not to forward that information to our government.

And no, they never tell you.

So you’re getting your life analyzed in many ways already; you just don’t realize it.
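The transaction-monitoring analogy above can be sketched as a simple rule engine. This is an illustrative toy, not any real institution's system: the rules, thresholds, and country codes are all invented for the example, and real AML systems are far more elaborate.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str

def flag_for_review(history: list[Transaction]) -> bool:
    """Toy AML-style rules: flag for *human* review, never auto-report."""
    # Rule 1: "structuring" - several deposits just under a reporting threshold.
    near_threshold = [t for t in history if 9_000 <= t.amount < 10_000]
    if len(near_threshold) >= 3:
        return True
    # Rule 2: a single large transfer to a (made-up) high-risk jurisdiction.
    if any(t.country in {"XX", "YY"} and t.amount > 5_000 for t in history):
        return True
    return False

txns = [Transaction(9_500, "US"), Transaction(9_800, "US"), Transaction(9_900, "US")]
print(flag_for_review(txns))  # three near-threshold deposits -> flagged for review
```

As in the compliance-officer workflow described above, a flag here is only a trigger for human scrutiny, not a verdict.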
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Blue Quark said:
So, here's a question I never thought I'd ask: I'm beyond the return period for my M1 MacBook Air. I'm seriously thinking about selling it in light of what Apple's doing with iOS and their iPhone devices.

Is anyone hearing anything about them going to or deliberately not going to do this in macOS? Because, at this point, I really don't feel like I can trust them.

And I really, really hate this because, hardware-wise, it's a great laptop.
The only scandal is people mischaracterizing what this system does and how it works.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
My MacBook Air is 7 years old at this point, so it's way overdue for an upgrade, and I recently found TeamViewer to be very good. I am strongly against what Apple is doing to combat CSAM by scanning stuff locally, but I waived my right to migrate to another “ecosystem” years ago, so whatever.
 

honam1021

macrumors regular
Nov 4, 2012
240
105
Personally I'm not concerned about it on macOS. I don't use iCloud Photos, and it likely won't be long before someone comes up with patches to disable the scanner daemon entirely.

On iOS/iPadOS you're simply screwed.
 

Mikael H

macrumors 6502a
Sep 3, 2014
864
539
Quoting a correction from another member:
Not true. On-device scanning is for CSAM *and* nude photos/content for children under 13 (that is opt-in). The CSAM scanning is NOT opt-in, and there's no opt-out. The most you can do is disable iCloud photos.
Oh, yes, you're right.

My initial statement still stands, though: Don't use cloud services if you want privacy.
 

400

macrumors 6502a
Sep 12, 2015
760
319
Wales
Really not bothered with this after a read of the process. Didn't I read that the others (cloud content providers) are at it too? That comes with the existing privacy and content ownership issues you accept when you sign their T&Cs. Apple already has T&Cs to this effect; now it also has a hall monitor (a clever one).
 

Natrium

macrumors regular
Aug 7, 2021
125
246
400 said:
Really not bothered with this after a read of the process. Didn't I read that the others (cloud content providers) are at it too? That comes with the existing privacy and content ownership issues you accept when you sign their T&Cs. Apple already has T&Cs to this effect; now it also has a hall monitor (a clever one).
The fact that it’s in Apple’s TOS doesn’t mean it’s legal, and it certainly doesn’t answer the moral question of whether this on-device scanning system is a good thing or not.

The difference between Apple and cloud providers is that this system runs on your very own device and could potentially be expanded beyond imagination, whereas cloud providers’ abilities are technically limited to what has been uploaded to their service.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
I am frankly appalled that Apple would decide to undo all of their privacy efforts just like that. Right now, I feel extremely conflicted. I do love the convenience and functionality of the Apple ecosystem, but the on-device scanning constitutes, for me, a massive breach of trust. Luckily, I am not in the USA, so I am not directly affected, but if Apple ever decides to turn this functionality on in my country, I will be faced with a very uncomfortable decision. I can only hope that they will find their common sense again and drop this extremely dangerous precedent. Anyway, I wrote to Tim Cook the moment Apple announced these changes, and I urge anyone concerned to do the same.
 

Grey Area

macrumors 6502
Jan 14, 2008
433
1,030
400 said:
Really not bothered with this after a read of the process. Didn't I read that the others (cloud content providers) are at it too? That comes with the existing privacy and content ownership issues you accept when you sign their T&Cs. Apple already has T&Cs to this effect; now it also has a hall monitor (a clever one).
I would agree that there is (for now) hardly any difference in principle between a cloud provider checking the data coming from the user and the iPhone checking the data right before the upload - they are just placing the checkpoints at different ends of the tunnel.

However, I think there is something psychologically queasy about having the user device perform the check. People pay a lot of money for smartphones, they quickly become very personal devices, and people trust them with personal data (maybe more than they should, but that is how it is). And now this trusted personal smartphone starts using AI and various blackboxed methods beyond the user's oversight to check this personal data, with the sole purpose of finding an offense to report the user to the authorities. Basically, a trusted extension of your brain turns into an adversary out to get you.

Moreover, if the new on-device check is acceptable, then there is no reason why it should be limited to uploads. In fact, I would argue that it is morally inconsistent to do so: If child protection is a valid cause to check outgoing images, then it is also a valid cause to check any other data on the device. I do not even see that as a slippery slope. Organizations like NCMEC will want this, and Apple has no standing to refuse as they have already accepted the premises. Full scanning will come, I am certain. They are just cooking the frog slowly now.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Grey Area said:
However, I think there is something psychologically queasy about having the user device perform the check. People pay a lot of money for smartphones, they quickly become very personal devices, and people trust them with personal data (maybe more than they should, but that is how it is). And now this trusted personal smartphone starts using AI and various blackboxed methods beyond the user's oversight to check this personal data, with the sole purpose of finding an offense to report the user to the authorities. Basically, a trusted extension of your brain turns into an adversary out to get you.

Moreover, if the new on-device check is acceptable, then there is no reason why it should be limited to uploads. In fact, I would argue that it is morally inconsistent to do so: If child protection is a valid cause to check outgoing images, then it is also a valid cause to check any other data on the device. I do not even see that as a slippery slope. Organizations like NCMEC will want this, and Apple has no standing to refuse as they have already accepted the premises. Full scanning will come, I am certain. They are just cooking the frog slowly now.

Exactly, this is the unnerving part. I am perfectly fine with cloud scanning (even if I would prefer they didn't); after all, they are also under certain pressure from authorities etc. But putting this functionality on-device opens the floodgates to all kinds of very worrying developments. I mean, Apple's implementation is smart and it is indeed privacy-focused - they use multiple layers of encryption etc. - but the implications of this technology are far-reaching. They promise not to abuse it, but the very fact that abusing it would be trivial is already extremely worrisome. This is the biggest attack on privacy since the advent of social networks, and I am honestly in disbelief that the outcry is not louder than it is. How can people support this kind of thing? American society simply doesn’t make any sense to me. The same people who protest universal healthcare and hate speech regulations as “attacks on their freedom” are perfectly fine with someone scanning their private data because “think of the children”.
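The matching-with-a-threshold idea under discussion can be illustrated with a toy: compare image fingerprints against a known-bad hash list, and only surface anything for review once a match count is crossed. This is not Apple's actual NeuralHash pipeline (which also uses private set intersection and threshold secret sharing so the server learns nothing below the threshold); the hashes, cutoff, and threshold below are all invented, and plain Hamming distance stands in for a real perceptual-hash comparison.

```python
BLOCKLIST = {0xDEADBEEFCAFEBABE, 0x0123456789ABCDEF}  # invented example hashes
HAMMING_CUTOFF = 4      # how close two 64-bit hashes must be to "match" (assumption)
MATCH_THRESHOLD = 30    # matches required before anything is flagged (assumption)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")

def is_match(image_hash: int) -> bool:
    """A hash matches if it is within the cutoff of any blocklist entry."""
    return any(hamming(image_hash, bad) <= HAMMING_CUTOFF for bad in BLOCKLIST)

def should_flag(image_hashes: list[int]) -> bool:
    """Nothing is reported until the match count crosses the threshold."""
    matches = sum(1 for h in image_hashes if is_match(h))
    return matches >= MATCH_THRESHOLD
```

The threshold is the "multiple layers" part in miniature: a single match, or even 29 of them in this sketch, produces nothing visible, which is exactly why the debate centers on trust in the list and the threshold rather than on the matching math itself.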
 

400

macrumors 6502a
Sep 12, 2015
760
319
Wales
Grey Area said:
I would agree that there is (for now) hardly any difference in principle between a cloud provider checking the data coming from the user and the iPhone checking the data right before the upload - they are just placing the checkpoints at different ends of the tunnel.

However, I think there is something psychologically queasy about having the user device perform the check. People pay a lot of money for smartphones, they quickly become very personal devices, and people trust them with personal data (maybe more than they should, but that is how it is). And now this trusted personal smartphone starts using AI and various blackboxed methods beyond the user's oversight to check this personal data, with the sole purpose of finding an offense to report the user to the authorities. Basically, a trusted extension of your brain turns into an adversary out to get you.

Moreover, if the new on-device check is acceptable, then there is no reason why it should be limited to uploads. In fact, I would argue that it is morally inconsistent to do so: If child protection is a valid cause to check outgoing images, then it is also a valid cause to check any other data on the device. I do not even see that as a slippery slope. Organizations like NCMEC will want this, and Apple has no standing to refuse as they have already accepted the premises. Full scanning will come, I am certain. They are just cooking the frog slowly now.
I am wondering if the cloud option was pushed by higher authorities, or if Apple saw that the authorities would be heading that way and this is Apple's way of keeping it a better option.

However, given the way I read how it works (as I understand it at the moment), and that you sign up to the service, I still have no issues. Drop Apple and go with the others if that bothers people. Their (Google, Facebook, etc.) business models worry me more and look to do the same things.

Until it goes further - but then, this is tech; it will advance no matter how people feel about it.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
400 said:
However, given the way I read how it works (as I understand it at the moment), and that you sign up to the service, I still have no issues. Drop Apple and go with the others if that bothers people. Their (Google, Facebook, etc.) business models worry me more and look to do the same things.

The problem really is that Apple's scanning technology is on-device. With other cloud providers, you can always choose not to use their services, and they won’t have access to your data. With Apple‘s implementation, you only have their promise that they will not use it outside of iCloud. But the technology is there, on your phone, ready to go. Who says that iOS 16 won’t extend it to your locally stored photos as well?
 

400

macrumors 6502a
Sep 12, 2015
760
319
Wales
leman said:
The problem really is that Apple's scanning technology is on-device. With other cloud providers, you can always choose not to use their services, and they won’t have access to your data. With Apple‘s implementation, you only have their promise that they will not use it outside of iCloud. But the technology is there, on your phone, ready to go. Who says that iOS 16 won’t extend it to your locally stored photos as well?
Tricky balance. If you don't like the options then you can move. I have looked at the alternatives and they are worse for my money.

I use one cloud backup that has zero knowledge, and you are on scout's honour not to upload anything shonky. It is purely a backup, with full versioning etc., and does not provide the functionality Photos and iCloud do.

But I still wonder if Apple is seeing into the future with legislation and expectations (and some certainty from legislators) and this is the best option going forward.

Time will tell.

Sales will also tell.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
400 said:
Tricky balance. If you don't like the options then you can move. I have looked at the alternatives and they are worse for my money.

At this point, nobody else does on-device scanning.


400 said:
But I still wonder if Apple is seeing into the future with legislation and expectations (and some certainty from legislators) and this is the best option going forward.

If on-device policing is the best option going forward, then we are already living in a dystopian society. In a world where the majority is OK with this kind of functionality, discussions about privacy and personal freedom become pointless.

400 said:
Sales will also tell.

That’s a dangerous way of looking at it. Most folks are careless; they will buy any kind of crap if the wrapper is shiny enough. Legislation and consumer protection groups are supposed to prevent such situations, but the system is simply not working.

I fear that after some initial outrage, people will calm down and the protests will die out. Google and Microsoft will implement similar systems next, and users will follow out of convenience. The Linux community will resist, but Linux is practically unusable as a desktop system, so nobody will really care. And within a couple of years, we will live in a total surveillance society, where our own phones are spying on us.

The real danger is the lack of functional legislation and proper law enforcement mechanisms. Apple's technological platform is sound and well implemented. It’s much more sophisticated than other systems, and it does put user privacy in the foreground. Unfortunately, Apple seems to be completely oblivious to the social aspect of the problem: imperfect data sources, a flawed legal system, dehumanized law enforcement and, of course, nefarious actors who will find a way to use these systems to their advantage. Forget ransomware; now you have to fear receiving manipulated images of cats that will get you flagged as a sex offender by a system that does not give a damn about you as an individual. Read about folks who had their bank accounts locked because they share the same name as some criminal. That’s how well the system works.
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
I would not be surprised if Microsoft has been doing this as well.

A lot of people have already pointed out why this isn't a good idea, given the slippery slope.

If you want privacy, you probably need to go with Open Source.
 

cyanite

macrumors 6502
Sep 28, 2015
358
472
leman said:
you only have their promise that they will not use it outside of iCloud.
If you don’t trust Apple, why would you use their devices and services? Via iOS, they have full control of the device if they want it, so if you think they might abuse that, just don’t use it. Of course, this applies to the competition as well.

leman said:
In a world where the majority is OK with this kind of functionality, discussions about privacy and personal freedom become pointless.
That’s a ridiculous exaggeration. I actually think doing it on-device is better than in the cloud, since it means less data is shared with Apple. If you believe Apple is lying about this, don’t use their products. But any company could lie about anything at any time, so I think it’s pointless to spend all your time worrying about that.
 