Kung gu

Suspended
Original poster
Oct 20, 2018
1,379
2,434
I like macOS and I cannot wait for the new Macs, but privacy is important to me.

So with the recent news of Apple scanning photos on-device starting with macOS Monterey, I wanted to keep my Mac private.

So I will use this app (LuLu) to protect my privacy.


I will block all Apple services such as iCloud, Music, TV, Podcasts and News.

I will also block all connections from the Photos app.
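For anyone doing the same, a quick way to sanity-check whether the blocks actually take effect is to see which processes still hold network connections. The sketch below just wraps lsof (which ships with macOS); the script and the grouping-by-process idea are mine, not anything LuLu provides.

```python
#!/usr/bin/env python3
"""List processes with open network connections (macOS).

A rough way to check whether firewall rules are working: run it with
Photos / iCloud open and see whether they still show up. Assumes
`lsof` is available (it ships with macOS).
"""
import subprocess
from collections import Counter

def connections_by_process():
    # -i: internet sockets only, -n/-P: skip DNS and port-name lookups
    out = subprocess.run(
        ["lsof", "-i", "-n", "-P"],
        capture_output=True, text=True, check=False,
    ).stdout.splitlines()
    counts = Counter()
    for line in out[1:]:            # skip the header row
        fields = line.split()
        if fields:
            counts[fields[0]] += 1  # first column is the (possibly truncated) process name
    return counts

if __name__ == "__main__":
    for name, n in connections_by_process().most_common():
        print(f"{name:20s} {n} connection(s)")
```

If Photos or the iCloud daemons still show up with your rules enabled, that traffic is bypassing the filter (see the LuLu caveat quoted in the next post).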
 

Kung gu

Suspended
Original poster
Oct 20, 2018
1,379
2,434
By LuLu's own terms:
“The OS does not route all traffic through Network Extensions (such as LuLu). As such, such traffic is never seen by LuLu, and cannot be blocked.”
Sorry for the bad news.
Yeah, I know, but for specific apps like the ones above it does block them: Apple services, the Photos app, and iCloud.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
You need to do more than that. Apple doesn't need to route its traffic through the network stacks, ports, and drivers that all third-party apps do. With the macOS system volume read-only by default, it is even trickier to block system applications' network access. The best bet is buying a USB computer stick (yes, you heard that right), configuring it to run Linux, installing a firewall on it, and letting your Mac connect to that stick's Wi-Fi signal. Apple can't alter third-party firewall behaviour, so your privacy should be better protected.
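For what it's worth, the blocking on such a Linux box is just ordinary firewall rules; the stick itself is the interesting part. A rough sketch, assuming iptables is installed, the Mac's traffic is routed/NATed through the stick, and that blocking 17.0.0.0/8 (the address block registered to Apple) wholesale is acceptable -- it will also break updates, iMessage, the App Store, and so on:

```python
#!/usr/bin/env python3
"""Sketch: drop forwarded traffic to Apple's IP range on a Linux firewall box.

Assumes the Mac's traffic is being routed/NATed through this machine and
that iptables is installed. 17.0.0.0/8 is the address block registered to
Apple; blocking it this broadly breaks far more than photo uploads.
Run as root. Illustration only, not a hardened ruleset.
"""
import subprocess

APPLE_NET = "17.0.0.0/8"

def block_apple_forwarding():
    # Drop anything the firewall would forward toward Apple's range.
    subprocess.run(
        ["iptables", "-A", "FORWARD", "-d", APPLE_NET, "-j", "DROP"],
        check=True,
    )
    # List the FORWARD chain with packet counters so you can confirm
    # the rule is actually matching traffic.
    subprocess.run(["iptables", "-L", "FORWARD", "-n", "-v"], check=True)

if __name__ == "__main__":
    block_apple_forwarding()
```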
 

Thomas Davie

macrumors 6502a
Jan 20, 2004
746
528
Can you use steganography to hide images inside of photos to evade a scan? Or would Apple frown on that for some reason?

thx

Tom
 

Mikael H

macrumors 6502a
Sep 3, 2014
864
539
Can you use steganography to hide images inside of photos to evade a scan? Or would Apple frown on that for some reason?

thx

Tom
What's the point of that unless you're actually trying to hide CSAM? The thing with Apple's solution here is that they only check image hashes unless you hit their threshold of CSAM hash matches. This means that they explicitly try to avoid looking at your actual pictures, but they will hash-check everything as part of the iCloud upload procedure.

You don't avoid scanning by not having CSAM images. You avoid it by not using cloud services for storing your pictures.
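For the curious, the "hash check plus threshold" flow is easy to sketch. The toy below is emphatically not Apple's NeuralHash or their Private Set Intersection protocol -- it's just a simple difference hash over tiny grayscale images, with made-up constants (HASH_MATCH_DISTANCE, REPORT_THRESHOLD), to show the shape of it: hash every image, compare against a known-bad set, and do nothing until the match count crosses a threshold.

```python
"""Toy illustration of hash-matching with a reporting threshold.

NOT Apple's NeuralHash or PSI protocol -- just a difference hash over
small grayscale images (nested lists of 0-255 values) showing the idea:
every image gets hashed, hashes are compared against a known-bad set,
and nothing is flagged until the match count passes a threshold.
"""

HASH_MATCH_DISTANCE = 1   # how close two hashes must be to count as a match
REPORT_THRESHOLD = 30     # hypothetical number of matches before any action

def dhash(pixels):
    """Difference hash: one bit per horizontally adjacent pixel pair."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

def count_matches(library, known_bad_hashes):
    matches = 0
    for image in library:
        h = dhash(image)
        if any(hamming(h, bad) <= HASH_MATCH_DISTANCE for bad in known_bad_hashes):
            matches += 1
    return matches

if __name__ == "__main__":
    # An ordinary library of holiday pictures never reaches the threshold.
    holiday_pic = [[10, 20, 30, 40], [40, 30, 20, 10]]
    library = [holiday_pic] * 1000
    known_bad = {0b111000}                      # made-up "known" hash
    n = count_matches(library, known_bad)
    print(f"{n} matches -> report: {n >= REPORT_THRESHOLD}")
```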
 

xraydoc

Contributor
Oct 9, 2005
11,030
5,490
192.168.1.1
Can you use steganography to hide images inside of photos to evade a scan? Or would Apple frown on that for some reason?

thx

Tom
Why? You can store all the nude/pornographic images you want on your Apple devices -- both self-made and shared on the internet. Apple couldn't care less.

The thing that is getting "scanned" is images that match hashes of KNOWN, PRE-EXISTING CSAM material.

If you're suggesting someone store CSAM inside other material via steganographic encryption, then you've got bigger problems.
 

Hessel89

macrumors 6502a
Sep 27, 2017
594
328
Netherlands
KNOWN, PRE-EXISTING CSAM material.

Nope. In this interview, Craig Federighi stated that in order to prevent child grooming, this neural system can also spot potentially explicit photos made and sent through iMessage, so this also includes new photos.
Essentially he's admitting Apple scans ALL your photos going through iMessage or iPhoto.

Of course it's a bit more nuanced than that, but Apple have really chosen their words carefully on this one.
They are using the word "match", since this sounds like they only mean "match" as in "match with known pre-existing CSAM material."

In reality the neural system can also spot potential content, which it will then send to an actual human for verification. When this human has confirmed the photo does in fact contain CSAM material, there is also a "match".
 

tlab

macrumors regular
Dec 12, 2017
111
170
Nope. In this interview, Craig Federighi stated that in order to prevent child grooming, this neural system can also spot potentially explicit photos made and sent through iMessage, so this also includes new photos.
Essentially he's admitting Apple scans ALL your photos going through iMessage or iPhoto.

Of course it's a bit more nuanced than that, but Apple have really chosen their words carefully on this one.
They are using the word "match", since this sounds like they only mean "match" as in "match with known pre-existing CSAM material."

In reality the neural system can also spot potential content, which it will then send to an actual human for verification. When this human has confirmed the photo does in fact contain CSAM material, there is also a "match".
You’re conflating two separate things. The neural scanning of images only happens if you’re a child in a family sharing account, and your parents have turned on the feature (ie it’s opt-in). If that’s enabled, it scans images when they’re sent or received by iMessage. It doesn’t touch your photo library.

The CSAM detection system creates a hash of every photo you upload to iCloud, then compares the hash to a pre-defined set of hashes of existing known CSAM images. If you don’t attempt to upload multiple of those particular CSAM images to iCloud, nothing gets detected and nothing happens to your photos. The neural system that’s used to analyse children’s iMessage traffic has no part to play in this system: it simply looks for matching hashes.

In any event, Apple is already able to decrypt your iCloud Photo Library, so if you’re that paranoid I’m not sure why you’d be using iCloud in the first place.
 

Puonti

macrumors 68000
Mar 14, 2011
1,567
1,187
You’re conflating two separate things. The neural scanning of images only happens if you’re a child in a family sharing account, and your parents have turned on the feature (ie it’s opt-in). If that’s enabled, it scans images when they’re sent or received by iMessage. It doesn’t touch your photo library.

The CSAM detection system creates a hash of every photo you upload to iCloud, then compares the hash to a pre-defined set of hashes of existing known CSAM images.
You're essentially correct, just a minor clarification because it is a complex topic:

Known CSAM detection when uploading photos to iCloud Photos (if the user opts in by having iCloud Photos enabled) uses NeuralHash, Private Set Intersection, and Threshold Secret Sharing (a generic sketch of the threshold-sharing idea follows below).


Sexually explicit image detection when a 0-17 year-old child or teen sends or receives iMessages (if their parents, as configured in Family Sharing, opt in to the Communication Safety in Messages feature) uses on-device machine learning similar to what Photos has previously used for categorizing the contents of images (dog, car, sky, building, eggplant, etc.).
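To give a flavour of what "Threshold Secret Sharing" means on its own, here's a generic Shamir-style t-of-n sketch in Python. This is emphatically not Apple's construction (theirs wraps the idea inside the Private Set Intersection protocol so that safety vouchers stay undecryptable below the threshold); the prime, the parameters, and the function names here are just illustrative.

```python
"""Generic Shamir-style threshold secret sharing (t-of-n) over a prime field.

An illustration of the *concept* behind "Threshold Secret Sharing" only --
not Apple's actual construction.
"""
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

def make_shares(secret, threshold, n_shares):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    secret = 123456789
    shares = make_shares(secret, threshold=3, n_shares=5)
    print(recover(shares[:3]) == secret)   # True: 3 shares are enough
    print(recover(shares[:2]) == secret)   # False: 2 shares reveal nothing useful
```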

 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
Nope. In this interview, Craig Federighi stated that in order to prevent child grooming, this neural system can also spot potentially explicit photos made and sent through iMessage, so this also includes new photos.
Essentially he's admitting Apple scans ALL your photos going through iMessage or iPhoto.

Of course it's a bit more nuanced than that, but Apple have really chosen their words carefully on this one.
They are using the word "match", since this sounds like they only mean "match" as in "match with known pre-existing CSAM material."

In reality the neural system can also spot potential content, which it will then send to an actual human for verification. When this human has confirmed the photo does in fact contain CSAM material, there is also a "match".
Those are two separate things.

The iMessage censorship using ML is local only, only for child accounts 13yo or younger, and it's OPT-IN. Meaning you can ignore it altogether if you want to. It's optional.

The CSAM part is separate. With this one, the actual database of hashes is already coded into iOS 15. The iPhone will scan all your photos to create hashes, and then see if any of those hashes match the CSAM hashes in the database.
 
  • Like
Reactions: tlab

dugbug

macrumors 68000
Aug 23, 2008
1,929
2,147
Somewhere in Florida
Those are two separate things.

The iMessage censorship using ML is local only, only for child accounts 13yo or younger, and it's OPT-IN. Meaning you can ignore it altogether if you want to. It's optional.

The CSAM part is separate. With this one, the actual database of hashes is already coded into iOS 15. The iPhone will scan all your photos to create hashes, and then see if any of those hashes match the CSAM hashes in the database.

This. The onboard censorship scanning is a local parental feature; it goes NOWHERE. It does not notify ANYBODY but the parent, and only if the feature is enabled.

All you have to do to subvert the hashing of your photos is to disable iCloud Photos. That's it. That's all. You don't need to watch a YouTube video or follow some nutjob conspiracy list of settings.
 
  • Like
Reactions: Tagbert and tlab

xraydoc

Contributor
Oct 9, 2005
11,030
5,490
192.168.1.1
Nope. In this interview, Craig Federighi stated that in order to prevent child grooming, this neural system can also spot potentially explicit photos made and sent through iMessage, so this also includes new photos.
Essentially he's admitting Apple scans ALL your photos going through iMessage or iPhoto.

Of course it's a bit more nuanced than that, but Apple have really chosen their words carefully on this one.
They are using the word "match", since this sounds like they only mean "match" as in "match with known pre-existing CSAM material."

In reality the neural system can also spot potential content, which it will then send to an actual human for verification. When this human has confirmed the photo does in fact contain CSAM material, there is also a "match".
As others have said, you're mistaking & combining the optional child iMessage monitoring (reports to the parent) with the CSAM hash scanning. Two completely separate things.

1a) If you're an adult, Apple doesn't care about nudie pics sent over iMessage; no scan whatsoever is performed, either on-device or in the cloud.
1b) If you're a kid (and if your parents properly set up your account), your parents, and only your parents, will get notified if you send/receive nude images that the on-device AI picks up. These do NOT get sent to Apple or anyone else for verification.

2) Store KNOWN kiddieporn (i.e., images that the FBI have previously captured and hashed) on iCloud, and the FBI will (eventually) get notified.
 
  • Like
Reactions: tlab

Tech198

Cancelled
Mar 21, 2011
15,915
2,151
Yeah, I know, but for specific apps like the ones above it does block them: Apple services, the Photos app, and iCloud.
We've been using Adobe stuff for years with LuLu blocking its services, but that is on Catalina (Intel Mac). Things must have changed.
 

08380728

Cancelled
Aug 20, 2007
422
165
In 27 or so years running macOS on the internetz, never have I had a single bit of malware or a virus or any other related issue; never had a firewall on, never had any antivirus or anti-malware app.

Wish Windows losers would stop manufacturing imaginary threats.

Anyway, what's any of this got to do with Apple Silicon?
 