YouTuber Louis Rossmann says macOS is scanning for dangerous memes and illegal material without consent.
So it begins again...
By "it begins again", I assume you mean Rossmann's self-promotion.
I really wish there was a way to watch and debunk something without that person getting credit for the view.
So I read that article, and the person who created it honestly didn't write a very good article; he just says that a program he used, Little Snitch, found that mediaanalysisd was running even without iCloud and all the other opt-in features turned on.
Here is the source that he is citing:

Yeah, fair enough, and it may not be credible. We have to trust Apple that it is not personal info being sent, and either you do or you don't. More transparency is needed. It's being depicted as if the API access has just started, but perhaps it has been going on for some time and is trivial.
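For anyone who would rather verify than trust: here is a minimal sketch of how you could watch for mediaanalysisd network activity yourself, without Little Snitch. It only assumes the lsof tool that ships with macOS; the `+c 0` option is there because lsof otherwise truncates command names like mediaanalysisd to nine characters.

```python
#!/usr/bin/env python3
"""Minimal sketch: list open network sockets belonging to mediaanalysisd.

Uses lsof, which ships with macOS; run this while browsing images in
Finder to see whether the daemon opens connections. May need sudo to see
processes owned by other users.
"""
import subprocess

# -n/-P skip DNS and port-name lookups; -i lists internet sockets only;
# +c 0 stops lsof truncating the COMMAND column to nine characters.
result = subprocess.run(
    ["lsof", "+c", "0", "-nP", "-i"],
    capture_output=True,
    text=True,
    check=False,
)

matches = [
    line
    for line in result.stdout.splitlines()
    if line.split() and line.split()[0] == "mediaanalysisd"
]

if matches:
    print("mediaanalysisd currently has open network connections:")
    for line in matches:
        print(" ", line)
else:
    print("No network connections from mediaanalysisd right now.")
```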
"mediaanalysisd has been around since forever, a process most associated with Photos' facial recognition but later expanded to other things like text and subject recognition. There are non-nefarious reasons for mediaanalysisd to connect to the internet, like to update its models, which are run locally."

This! (And for the Rossmann crowd: !!!1111!1) The guy should stick to doing bad soldering jobs and not talk about things he doesn't understand. Isn't he still proud he dropped out of college? Well, you don't need a PhD or any other degree to understand that things need updates, such as the models used for inference in local search.
He promotes the right to repair, which is something that Apple steadfastly fought against. Even their repair kit is designed and priced in such a way that a repair by Apple is easier and cheaper than using that kit.
His followers are usually Apple-haters; it's easy to get enough viewers following his rants that way.

The guy is a predatory conman running a rip-off repair centre who has somehow managed to create a YouTube thought bubble to attract more gullible customers.
So I did some research: api.smoot.apple.com is Spotlight's search-completion service, so it's probably being used to collect synonyms / do text processing to build up a search index of some sort.

Yes, it is for searching all sorts of things on the local system and making intelligent matches for different files/documents. None of the personal data is transferred to Apple. Again, MS is doing this as well. The security research group at the other end of the hallway loves this stuff, and they would have gone nuts by now; I can hear the crickets chirping when I open my office door.
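As a toy illustration of what a search-completion backend does with harvested terms (this makes no claim about how api.smoot.apple.com actually works internally), the core data structure is just an index from prefixes to candidate completions:

```python
"""Toy prefix index, the core of any search-completion service.
Illustrative only; the internals of api.smoot.apple.com are not public."""
from collections import defaultdict


def build_completion_index(terms: list[str]) -> dict[str, list[str]]:
    # Map every prefix of every known term to the terms it completes to.
    index: dict[str, list[str]] = defaultdict(list)
    for term in terms:
        for i in range(1, len(term) + 1):
            index[term[:i].lower()].append(term)
    return index


index = build_completion_index(["mediaanalysisd", "media player", "memes"])
print(index["med"])  # ['mediaanalysisd', 'media player']
```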
First of all, that Jeffrey Paul article is full of assumptions that this particular process, mediaanalysisd, is sending actual image information to an Apple API. mediaanalysisd is part of Visual Look Up (VLU), a system Apple uses to recognise objects in photos, like buildings, animals, paintings and more, and give users information about the photo. VLU works by computing a neural hash locally that is used to distinguish photos/videos by the objects within them. During VLU, the device sends Apple that neural hash; Apple computes what the hash represents and sends the result back to the user. I'm assuming the database used to compute the hash's meaning is too massive to be kept locally and wouldn't be able to stay updated with new information. I really didn't like the article this video is based on, because it makes claims without any real evidence or research. mediaanalysisd is a process that has been around for years, and it's very easy to look up what it's used for and how to disable it.
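To make the "hash goes up, label comes down" flow concrete, here is a sketch of that shape of protocol. SHA-256 stands in for Apple's actual neural hash, and the endpoint is made up, since Apple's real API is not public; the point is only that a fixed-size fingerprint, not the image itself, is what crosses the wire.

```python
"""Sketch of a "send the hash, not the photo" lookup, matching the flow
described above. SHA-256 and the endpoint are stand-ins: the real VLU
pipeline computes a neural hash on-device, and Apple's actual API is not
public. Only a fixed-size fingerprint ever leaves the machine."""
import hashlib
import json
from urllib import request


def local_fingerprint(image_path: str) -> str:
    # Computed entirely on-device; not reversible back into the image.
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def look_up(fingerprint: str, endpoint: str) -> dict:
    # Hypothetical lookup service: POST the fingerprint, receive labels
    # such as {"labels": ["Golden Gate Bridge"]}. No pixels are sent.
    body = json.dumps({"hash": fingerprint}).encode()
    req = request.Request(
        endpoint, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Usage sketch (hypothetical endpoint, not a real Apple API):
# print(look_up(local_fingerprint("photo.jpg"), "https://example.com/vlu"))
```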
OK, but do we know what mediaanalysisd is sending over the wire? I wasn't even aware this was happening, and to be honest, as a naive user of macOS, I didn't expect it either.
Does anyone have a press release from Apple saying that this was not going to happen?
I don't think there was a press release, just a statement, offered to a couple of news agencies, originating from Wired.
Here is the article:
Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos
In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to... (www.macrumors.com)
And this is what that statement says:

"After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."
So they will not move forward with CSAM detection in iCloud Photos. The iMessage Child Protection feature is a completely different thing.

Well, either those statements were misquoted by the reporter, or Apple changed their mind and didn't tell anyone, because it's up and running now apparently. It's easy to detect network traffic on macOS, but I don't think it's possible on iOS. My expectation is that it's active on all Apple devices.
Yes, if only he read some credible sources... https://eclecticlight.co/2023/01/18/is-apple-checking-images-we-view-in-the-finder/
"it's up and running now apparently"
How do you know this is or isn't CSAM? The only information we have is that Apple is sending photo data back to their servers, or some server. We have no knowledge, other than maybe a few rare statements from Apple, as to what they are doing with this data.

Again, VLU is not CSAM, and what VLU sends to Apple is not at all the same thing as what the CSAM implementation would have sent. And again:
- VLU is a service that helps identify text and objects in photos so you can copy/paste from them easily (I suspect it now works for videos too, since iOS 16). It can be disabled and is part of Siri & Search.
- The CSAM system would have scanned all photos from the Photos Library offline, created hashes, and verified those hashes against a database on Apple's side, if and only if iCloud Photos was enabled (a grossly simplified sketch of this matching idea follows this list).
- iMessage Child Protection is a system that you can set up on your child's iPhone: Apple scans and blurs potentially unwanted material received from unknown senders before showing it to your kid inside iMessage, and it also informs you, as the parent, about that content. This one I will turn on for my kid when he has an iPhone.

So, again, Louis here made a huge mistake by not doing his research before hitting the record button. As you can see, these are different systems, with different goals.
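To make the CSAM item in the list above concrete, here is a grossly simplified sketch of the abandoned matching idea, with SHA-256 standing in for NeuralHash. The real proposal added blinded matching and threshold cryptography so Apple learned nothing below a match threshold; none of that is reproduced here.

```python
"""Grossly simplified sketch of the *abandoned* CSAM-matching idea from
the list above: fingerprint local photos, flag only those already present
in a database of known hashes. SHA-256 stands in for NeuralHash; the real
design used blinded matching and threshold cryptography on top of this."""
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    # Stand-in for NeuralHash: any locally computed fixed-size digest.
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flagged(library_dir: str, known_hashes: set[str]) -> list[Path]:
    # Only files whose fingerprint is already in the known set match;
    # in this model, nothing about unmatched photos leaves the device.
    return [
        p for p in Path(library_dir).glob("**/*.jpg")
        if fingerprint(p) in known_hashes
    ]
```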