
crawfish963 (original poster):
As found here.

Also, there have already been collisions, as found here.
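For anyone curious how a claimed collision pair like the ones linked above gets checked: NeuralHash isn't a public API, but the idea can be sketched with an off-the-shelf perceptual hash. Below is a minimal sketch, assuming two local image files (names hypothetical) and using the `imagehash` library's pHash purely as a stand-in for NeuralHash; the linked write-ups presumably do the equivalent comparison against the actual extracted model.

```python
# Minimal sketch of verifying a claimed perceptual-hash collision pair.
# NeuralHash is not public, so imagehash's pHash (pip install ImageHash)
# stands in for it here; the file names are hypothetical.
from PIL import Image
import imagehash

benign = imagehash.phash(Image.open("dog.png"))            # ordinary photo
crafted = imagehash.phash(Image.open("adversarial.png"))   # visually different image

# imagehash overloads subtraction to return the Hamming distance in bits.
print(benign, crafted, benign - crafted)
# A distance of 0 between two visually unrelated images is a collision.
```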

[screenshot attached, 2021-08-18]
“muh 1 in 1 trillion”

it took amateur devs less than a day to find ways to potentially exploit the system LOL

Apple has to reverse course on this spyware and remove it entirely. It's a shame that so many of its customers have been gaslit into thinking this is okay.

This also supports a theory someone had in another thread after noticing that the press images Apple used in the CSAM announcement were from iOS 14, despite this being an iOS 15 feature: Apple intended to release this sometime during iOS 14 but delayed it for some reason.
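Worth spelling out what the "1 in 1 trillion" figure actually claims: it is a per-account number, not a per-image collision rate, and it rests on requiring a threshold of independent matches (the threat-model PDF linked later in this thread describes a threshold on the order of 30) before anything is escalated. A rough sketch of that arithmetic with made-up inputs, assuming matches are independent, which is exactly the assumption that deliberately crafted collisions undermine:

```python
# Binomial tail: probability an account is falsely flagged, computed in log
# space to avoid under/overflow. All numbers below are illustrative only.
from math import exp, lgamma, log, log1p

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # log of C(n, k) * p**k * (1 - p)**(n - k)
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))

def p_account_flagged(n_photos: int, p_false_match: float, threshold: int) -> float:
    # P(at least `threshold` of n independent photos falsely match).
    return sum(exp(log_binom_pmf(n_photos, k, p_false_match))
               for k in range(threshold, n_photos + 1))

# Hypothetical per-image rate and threshold; Apple's real values are its own.
print(p_account_flagged(n_photos=10_000, p_false_match=1e-6, threshold=30))
# ~1e-92 with these toy inputs: per-account flagging is astronomically rarer
# than any single-image false match, *if* the matches are independent.
```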
 
Reactions: crawfish963 (Like)
Purely a guess, but I would be willing to bet that COVID was partially responsible for the delay, along with the antitrust issues Apple has been dealing with.
 
Reactions: macintoshmac (Like)
Yeah, you can spoof it if you have the original hash of a CSAM file and generate a photo that matches that hash, but even then, the photo is checked again with a different perceptual hash on Apple's own servers to rule out a false positive. If the image doesn't have the same visual look to it, it's not going to be sent to human review. I think it's good that this safeguard is in place.

Someone somewhere is going to find some really bad CSAM images and try to create normal-looking "fakes" and sprinkle them around the internet, I'm sure, but the images would have to "look" nearly identical to a CSAM image in order to get flagged for human review.

Outlined here: https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

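A minimal sketch of the two-stage flow described above. Neither NeuralHash nor Apple's server-side perceptual hash is public, so both hash functions (and the databases) below are stand-ins used only to show the structure of the check:

```python
# Sketch of the two-stage match described above. Ordinary digests stand in
# for the real perceptual hashes purely to illustrate the control flow.
import hashlib

def on_device_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash, the hash adversarial images are crafted against.
    return hashlib.sha256(image_bytes).hexdigest()

def server_side_hash(image_bytes: bytes) -> str:
    # Stand-in for Apple's second, independent, private perceptual hash.
    return hashlib.blake2b(image_bytes).hexdigest()

def reaches_human_review(image_bytes: bytes,
                         device_db: set[str],
                         server_db: set[str]) -> bool:
    # Stage 1 (on device): match against the hash database shipped to the phone.
    if on_device_hash(image_bytes) not in device_db:
        return False
    # Stage 2 (server side): an independent perceptual hash must *also* match.
    # An image crafted to collide only with the on-device hash fails here and
    # never reaches a reviewer.
    return server_side_hash(image_bytes) in server_db

# Hypothetical usage: a crafted image that fools stage 1 but not stage 2.
crafted = b"adversarial image bytes"
device_db = {on_device_hash(crafted)}   # attacker forced a stage-1 match
server_db = set()                       # but the second hash doesn't match
print(reaches_human_review(crafted, device_db, server_db))  # False
```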
 

Yeah, well, it wouldn't be hard for a bad actor to find CSAM on the internet that has most likely already been indexed in the hash DB, then create a bunch of attack images from it. All they'd have to do then is spam people through services like WhatsApp, which automatically adds photos you receive to your iCloud library.

This would effectively flood the "Apple human reviewers" with a ton of false positives, making their job next to impossible.
 
This kind of attack could already be carried out against anyone backing up to iCloud Photos (or another cloud photo provider like Google Photos).

Also, "next to impossible"? Do you think the CSAM reports come in on an "I Love Lucy"-style conveyor belt or something?
 