
xxray
macrumors 68040, Original poster
On today’s episode of the NYT’s podcast “The Daily,” they say that at a hearing about encryption in late 2019, some members of Congress told Apple “if you’re not going to solve this problem [about CSAM], we’re going to force you to do it.” Then, the podcast says that’s when Apple set out to solve the problem.

Time stamp: 4:00

https://podcasts.apple.com/us/podcast/the-daily/id1200361736

https://open.spotify.com/episode/4oGuvoCiUg4XXOHMaSi8lb
This probably means they'll have the same software running on their server to remove all existing CSAM as well.
 
xxray said:
“…at a hearing about encryption in late 2019, some members of Congress told Apple ‘if you’re not going to solve this problem [about CSAM], we’re going to force you to do it.’”
It was fully expected that this is politically driven. So it's a given that this technology will not be used only for protecting children.
 
How can anyone be forced to show the pictures they own? It's a human right to say "No, you cannot look into my phone/pictures."

The sad part is that anyone who is opposed to having their devices searched for CSAM will get looked at in a negative manner... The same as the many people who will say "yeah, go ahead and search my car, I have nothing to hide," while for others it's "No warrant, no search," regardless of whether there is anything to hide or not.

If "They" are trying to use this to open it door, it is a good way to open that door...There are not, at least I hope, a lot of people that support CSAM. The ones who don't support it, also don't want to appear to support it either.
 
Certain members of Congress (Sen. Dianne Feinstein, D-CA, comes immediately to mind) have been trying for years to force back doors into all crypto in the U.S. If this story is true, this may be the camel getting its nose into the tent.

If this story is true, it also rather affirms one of the suspicions those of us opposed to the on-device CSAM scanner have had.
 
Sorry, but this must be fake news. Several MacRumors forum members assured me and others that Apple pushes back against government threats to invade the privacy of their users -- just like they pushed back against the FBI's request to scrap plans to encrypt iCloud backups, or when the Chinese government asked for Chinese iCloud keys, or when the NSA asked them to help integrate with the PRISM program.

/s
 
xxray said:
“…at a hearing about encryption in late 2019, some members of Congress told Apple ‘if you’re not going to solve this problem [about CSAM], we’re going to force you to do it.’”

If Congress and the US Government can twist Apple's arm to force them to do this... the same Congress can also coerce every single American corporation to comply: Facebook, Amazon, Google, Twitter, Microsoft/Hotmail, Akamai, and every single Internet data server or content provider.
 
Quote:
If Congress and the US Government can twist Apple's arm to force them to do this... the same Congress can also coerce every single American corporation to comply…
I’m 100 percent certain all of the big names you just mentioned already ARE cooperating closely with the government.
 
Quote:
If Congress and the US Government can twist Apple's arm to force them to do this... the same Congress can also coerce every single American corporation to comply…

Congress does it through laws, which everyone has to follow.

But the companies you mentioned probably already scan actively for CP.
 
Quote:
Sorry, but this must be fake news. Several MacRumors forum members assured me and others that Apple pushes back against government threats to invade the privacy of their users…

You have to remember that obeying the law is even more important to Apple than customer privacy.

The CSAM Detection system is Apple's pushback against potential future laws that would require scans on servers. Scanning on their servers would require breaking encryption, which is something they don't want to do.
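For what it's worth, here's a minimal sketch of that trade-off, assuming a simple hash-set match. Everything in it is hypothetical and nothing like Apple's actual NeuralHash/safety-voucher protocol; the point is just that a server holding only ciphertext can't hash-match anything, so the check has to run on the device before encryption:

Code:
# Toy model only: a plain SHA-256 set lookup stands in for the real matching
# system, and XOR with a random keystream stands in for real encryption.
import hashlib
import secrets

KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in for real encryption: XOR with a one-time random keystream.
    return bytes(b ^ k for b, k in zip(data, key))

def client_upload(photo: bytes):
    # The on-device check sees the plaintext *before* encryption...
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    key = secrets.token_bytes(len(photo))
    # ...so the server only ever receives ciphertext plus a match result.
    return encrypt(photo, key), matched

def server_side_scan(ciphertext: bytes) -> bool:
    # Hashing ciphertext is useless: the digest no longer corresponds to the
    # photo, so a server-side scan would need access to the keys.
    return hashlib.sha256(ciphertext).hexdigest() in KNOWN_HASHES

ciphertext, matched = client_upload(b"known-bad-example")
print(matched)                       # True: the device saw the plaintext
print(server_side_scan(ciphertext))  # False: the server sees only ciphertext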
 
Quote:
You have to remember that obeying the law is even more important to Apple than customer privacy. The CSAM Detection system is Apple's pushback against potential future laws that would require scans on servers…
I’d rather say it’s the government hating encryption for the general public with a passion. They want everything transmitted in plain text. But then there are complications: they don’t want to just give everything away for free, so encryption is required.

Guess Apple just got caught between a rock and a hard place.
 
Damage control.

Apple is the most valuable company in the world or whatever, and we're supposed to think they were ordered to do something that contradicts everything they told their users, and they were like "Darn it! Oh well, guess there's nothing we can do..."

You can tell how serious it is when stuff like this emerges. Same with the Epic leaks.
 
So, the CSAM check looks for file hashes... when one is detected, a thumbnail of the photo is sent for analysis by someone at Apple... Does this make Apple complicit in sharing CSAM? And the person who is tasked with viewing these thumbnails... what if they are a pervert and enjoy that CSAM?
 
Quote:
So, the CSAM check looks for file hashes... Does this make Apple complicit in sharing CSAM? And the person who is tasked with viewing these thumbnails... what if they are a pervert and enjoy that CSAM?
Two checks are done.

The first check is done while the photo is uploading. Once 30 matches take place, only those 30 photos are run through a second check with a different hashing process (a perceptual hash) to make sure they're not false positives. If they somehow make it through THAT check, then a human looks at them. I don't know for sure, but I doubt it would be just some random Apple employee; it would probably be someone who is approved for verifying those photos. From there, that person (or persons) locks the offending account and contacts the authorities.
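To make the "perceptual hash" part concrete, here's a toy average hash (aHash) in Python. This is my stand-in, not Apple's actual algorithm (NeuralHash is a neural network, and the server-side second check reportedly uses a separate, private perceptual hash); it just shows why a visual hash can weed out false positives: a re-encoded copy of the same photo still matches, an unrelated image doesn't.

Code:
# Toy average hash. An "image" here is just a flat list of grayscale pixel
# values for a tiny thumbnail; all values below are made up.
def average_hash(pixels):
    # Each bit records whether a pixel is brighter than the mean.
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming(a, b):
    # Number of differing bits; small distance = visually similar.
    return sum(x != y for x, y in zip(a, b))

original     = [10, 200, 30, 220, 15, 210, 25, 205, 12]
recompressed = [12, 198, 33, 215, 18, 212, 22, 200, 14]  # same photo, re-encoded
unrelated    = [100, 90, 110, 95, 105, 98, 102, 97, 101]

h0, h1, h2 = map(average_hash, (original, recompressed, unrelated))
print(hamming(h0, h1))  # 0: the re-encoded copy still matches
print(hamming(h0, h2))  # 9: unrelated image, would be discarded as a false positive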
 
This doesn't make it any better, but with all of the political pressure placed on Apple lately, it does not surprise me. With that said, I would expect every other tech company is doing, or has already been doing, the same thing. It sucks, but when it comes to information technology, privacy does not realistically exist anymore...
 
Quote:
Two checks are done. The first check is done while the photo is uploading. Once 30 matches take place, only those 30 photos are run through a second check with a different hashing process (a perceptual hash)…
So, let's say a guy named Jon Binen uploads 50. Will it stop scanning after the first 30?

Will they keep a running tab? Like, Jon Binen uploads 20 today, then next month does 9, and a couple of days later does 1 more?
 
Quote:
So, let's say a guy named Jon Binen uploads 50. Will it stop scanning after the first 30? Will they keep a running tab?
No, each photo uploaded is scanned. It's a running tab: each image is scanned individually as it's uploaded. Once 30 matches pile up (over time), those 30 photos will be run through another perceptual hash on the server to verify that they are visually similar to their CSAM counterparts. If not, those photos are trashed and no humans get involved.
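In code terms, the running tab might look something like the sketch below. Purely illustrative: the threshold of 30 is from Apple's announcement, but the class and names are hypothetical, and Apple's published design does the counting cryptographically with threshold secret sharing ("safety vouchers") so the server learns nothing until the threshold is crossed. A plain counter just shows the control flow:

Code:
THRESHOLD = 30

class AccountTab:
    """Per-account running tab of first-check matches, across uploads."""

    def __init__(self):
        self.matched_photos = []

    def on_upload(self, photo_id, first_check_matched):
        if first_check_matched:
            self.matched_photos.append(photo_id)
        if len(self.matched_photos) >= THRESHOLD:
            # Only now do the matched photos move on to the second
            # (perceptual-hash) check; below 30, nothing is escalated.
            return list(self.matched_photos)
        return None

tab = AccountTab()
for i in range(29):                     # 20 today, 9 next month... still silent
    assert tab.on_upload(f"photo-{i}", True) is None
print(tab.on_upload("photo-29", True))  # the 30th match triggers escalation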
 
If you were a person in the USA who has these types of images:
1/ Would you stay with Apple?
2/ Would you actually upload them to iCloud?

My guess would be no, so does this make the process moot?
It would be nice to see if revenue drops in the US because of this.
 
Quote:
No, each photo uploaded is scanned. It's a running tab: each image is scanned individually as it's uploaded. Once 30 matches pile up (over time), those 30 photos will be run through another perceptual hash on the server…
As there is no perfect system yet, there is still a chance that Apple's algorithm will bring up false positives.

So if someone at Apple were to view those false positives, or worse, the real pictures, the legal owner of these pictures could sue Apple for millions of dollars.

It's the same as if they began to scan all Apple Pay transactions and, because you transferred 100 dollars to Mary Uhana, someone looked into all your payments.

There is a reason why this kind of thing should not lie in the hands of private companies.
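To put rough numbers on that chance: a single false match in a big library is quite plausible, even if 30 of them is not, which is presumably what the threshold is for. A quick back-of-the-envelope in Python, where the per-photo false-match rate p and library size n are my assumptions, not Apple's published figures:

Code:
# One false match in a big library is plausible; 30 independent ones are not.
from math import comb

def prob_at_least(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p), via the complement (fine for small k).
    return 1.0 - sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k))

p = 1e-6      # assumed per-photo false-match rate (illustration only)
n = 100_000   # a large photo library
print(prob_at_least(1, n, p))   # ~0.095: a single false match is quite plausible
print(prob_at_least(30, n, p))  # ~1e-63 in theory; underflows to 0.0 in floats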
 
Quote:
Yes, but it’s nice to have confirmation after all the speculation recently.

Confirmation?
Hardly. The NYT has been agenda-driven for a long time now. Even if it turns out to be factual, it still doesn't say why on-device and why this particular solution.

I actually expected an article like this, or something similar, within a couple of days of Apple's announcement.

We'll see what else shows up.
 
Quote:
You have to remember that obeying the law is even more important to Apple than customer privacy. The CSAM Detection system is Apple's pushback against potential future laws that would require scans on servers…

Still doesn't answer why this method and why on-device. No other company does this.
 