> What about photos of "Baby's first bath"? Will those users get treated as child exploiters?

I know this has been discussed, but this was my first thought. Photos of little grandsons' bath being flagged.
> Let's also not forget this is the same company that would not help the FBI access dead, known/proven terrorists' phones to try to prevent other attacks and get information.

Helping the FBI in that context would reduce privacy for everyone. But they do help the feds when they have a valid warrant and the data is accessible without compromising everyone else's privacy. This is a great tool.
But they will scan your cloud data for these types of images.
It's also a bit hypocritical which criminals they will and won't help catch.
> APPLE YOU DO NOT HAVE THE RIGHT TO VIOLATE MY PRIVACY TO ANYTHING ON MY DEVICE.
> Typical Tim Cook BS - "all about human rights," then supports PRC. "All about privacy," then scans all images on someone's phone.

If you install the pertinent update, you will be giving consent.
> So. They access user photos without consent and sell users out? Really, Apple?

Like any iOS update, the user has to give consent before being allowed to install the update.
I know this has been discussed, but this was my first thought. Photos of little grandsons' bath being flagged.
> If you're sending them to a minor, then yes, you should be worried.

Nope. I am against child abuse and child porn. I was just concerned about having nudes of myself.
> Nope. I am against child abuse and child porn. I was just concerned about having nudes of myself.

There is nothing in these tools that remotely affects having nudes on your phone.
> That's so wrong! Hey Apple... focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many levels.

I wondered what secret agreements Apple made with the DOJ that caused the threats and PR pressure to create backdoors to suddenly disappear. I think we just saw one. How bad are the non-public ones?
Female Apple employee put on administrative leave following tweets about sexism in the workplace | AppleInsider
Apple senior engineering program manager Ashley Gjovik was placed on indefinite administrative leave this week after she posted a number of tweets chronicling alleged sexism and discrimination in the workplace. (appleinsider.com)
Isn't this a violation of privacy? I have a lot of nudes of myself on my phone. Should I start deleting them or what?
Well over 34,000 photos. You've got to be kidding me.
I’m scared 😱 Should I be worried?
> I know this has been discussed, but this was my first thought. Photos of little grandsons' bath being flagged.

If you read how the process will work, this kind of picture would never be flagged. That doesn't make the move less controversial or more acceptable, but Apple requires one of your saved pictures to match a database of known child abuse photos before anything is flagged to anyone. Nudes of you or photos of your children taking a bath would not match that database in the first place.
> You all, the ONLY images being flagged are those whose hashes match a database that is maintained by those who combat child exploitation. Your bath photos aren't in said database, so there's nothing to worry about.

Not true. They access user photos. That's a violation of privacy.
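For anyone trying to picture what "hashes match a database" means, here is a minimal sketch of the idea. It is not Apple's implementation: it uses an exact SHA-256 hash as a stand-in, while Apple describes a proprietary perceptual hash (NeuralHash) combined with private set intersection and a multi-match threshold. The database entry and photo bytes below are placeholders.

```python
# Illustrative sketch only: exact-hash lookup as a stand-in for Apple's
# proprietary perceptual-hash matching. All data here is made up.
import hashlib

# Hypothetical database of hashes of already-known abuse images.
known_image_hashes = {
    hashlib.sha256(b"bytes of a known database image").hexdigest(),
}

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for the on-device hashing step (the real system is perceptual,
    # so near-duplicates of a known image also match; a crypto hash does not).
    return hashlib.sha256(photo_bytes).hexdigest()

def is_flagged(photo_bytes: bytes) -> bool:
    # A photo only counts as a match if its hash is already in the database.
    return photo_hash(photo_bytes) in known_image_hashes

# A personal photo that was never in the database produces a hash that is not
# in the set, so under this model it is never flagged.
print(is_flagged(b"bytes of a brand-new family bath photo"))  # False
```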
> That is a lot of nudity. Are you in the business, or do you just like to check your body for health goals?

Health goals, of course. I said it myself, LOL.
The CSAM database is of known abuse images. So regular nudes shouldn't trigger them.
And the blurring of explicit photos seems to be a parental control thing.
It's still a slippery slope here. What next, will Apple disallow nudity in iCloud, because #morals?
Plus, the whole appeal of iCloud over, say, Google Photos was that there was no cloud AI meddling with or scanning your photos at all.
Now, what is the difference really?
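To make that distinction concrete, here is a rough, hypothetical sketch separating the two features this thread keeps conflating. The function names, boolean inputs, and account model are illustrative simplifications, not Apple's code; the point is only that the Messages blurring is a local parental-control check, while the iCloud Photos feature is a match against a database of known images.

```python
# Hypothetical simplification of the two separate features discussed above.

def communication_safety(image_looks_explicit: bool, is_child_account: bool) -> str:
    """Messages parental control: a local check that blurs and warns.
    Nothing is compared against any external database."""
    if is_child_account and image_looks_explicit:
        return "blur the image and show an on-device warning"
    return "show the image normally"

def csam_matching(hash_is_in_known_database: bool) -> str:
    """iCloud Photos feature: only hashes that match the known-CSAM database
    count toward a review threshold; ordinary nudity is not what is scanned for."""
    return "count toward the match threshold" if hash_is_in_known_database else "ignore"

# Example: an ordinary nude on an adult account triggers neither path.
print(communication_safety(image_looks_explicit=True, is_child_account=False))
print(csam_matching(hash_is_in_known_database=False))
```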
> The negative sentiment here is what happens when people don't read the article and don't understand the parts they do read. Learn how to read, then come back and comment.

No need to be rude.
> APPLE YOU DO NOT HAVE THE RIGHT TO VIOLATE MY PRIVACY TO ANYTHING ON MY DEVICE.
> Typical Tim Cook BS - "all about human rights," then supports PRC. "All about privacy," then scans all images on someone's phone.

Nobody is violating your privacy. You will only be affected if you are a criminal with child abuse photos, and then you should be arrested, shamed, and jailed.
> You don't know for a fact whether or not the system will trigger false positives, as Apple have said their technology is proprietary and is designed to detect changes from base CSAM images. That means some level of interpretation is involved via machine learning, which on the whole has been proven to be easily tricked into false positives. No doubt that will be weaponized somehow. We can't say it's just comparing basic hashes, because Apple have stated otherwise and have not opened their system up for public audit.
> Not to mention how this, on the whole, is a massive invasion of privacy and opens the door to scanning for images outside of what is obviously bad content.

Absolutely. False positives are a legit concern. But the early responses seemed to think that any borderline child images were going to be reported.
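Since the quoted post hinges on the difference between "basic hashes" and a perceptual, ML-derived hash, here is a toy sketch of a perceptual hash (a simple 8x8 average hash, not Apple's NeuralHash) that shows why near-duplicates still match and why false positives are at least possible in principle. The threshold value and the sample images are made up for illustration.

```python
# Toy 8x8 average-hash: NOT NeuralHash, just an illustration of the class of
# technique. Perceptual hashes tolerate small edits, which also means two
# unrelated images can occasionally land within the match threshold.

def average_hash(pixels):            # pixels: 8x8 grid of 0-255 brightness values
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # One bit per pixel: 1 if the pixel is brighter than the mean.
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

MATCH_THRESHOLD = 10                 # assumed tolerance in bits, not Apple's value

def matches(hash_a: int, hash_b: int) -> bool:
    return hamming_distance(hash_a, hash_b) <= MATCH_THRESHOLD

# A lightly edited copy (here, slightly brightened) still matches the
# original, which is the intended behavior; it also means an unlucky,
# unrelated image could in rare cases land within the same threshold.
original = [[(r * 31 + c * 17) % 256 for c in range(8)] for r in range(8)]
edited = [[min(255, p + 3) for p in row] for row in original]
print(matches(average_hash(original), average_hash(edited)))  # True
```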