hanneskoleh

macrumors newbie
Original poster
Aug 26, 2017
I just recently learned that the Photos app generates quite a lot of metadata for the images I have taken; try searching for "fruit", "book", "food", "baked", or "bottle". As a feature this is pretty neat: it makes searching for past pictures much easier. In addition, the logic can detect people in pictures and even identify them.
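(For reference: iOS 11 exposes this kind of on-device face detection to any developer through the Vision framework. Here is a minimal sketch using the public API; it is only illustrative, since Photos' internal pipeline is not documented.)

```swift
import UIKit
import Vision

// Minimal sketch of on-device face detection with the public Vision API
// (iOS 11+). Illustrative only; Photos' internal pipeline is not public.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Found \(faces.count) face(s)")
        for face in faces {
            // Bounding boxes come back in normalized coordinates (0...1).
            print("Face at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```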

However, I am concerned about the privacy factor. I have not enabled iCloud Photo Sharing. Yet when I take a new photo of an apple in Airplane mode, searching for "fruit" finds it only after I have enabled Internet access again. This suggests that Apple is sending my images (or some derivatives of them) to their servers to be processed by its computer vision technology, the research behind which recently won a CVPR best paper award: https://www.macrumors.com/2017/08/25/apple-ai-research-paper-wins-top-award/

I haven't found any source that explains exactly what happens there, so I can only guess. Whatever it is, there should be a toggle to disable it, and every user should have to opt in or out when they start using a new iOS version. I do not want any entity to have knowledge of my private images.

---

If the detection logic could work entirely on my device, with no need to send the data anywhere, then I would be 100% happy, but I assume that is not the case?
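To make concrete what "entirely on my device" would look like: with iOS 11's Core ML and Vision frameworks, any app can classify images locally with no network access at all. A rough sketch, assuming a classification model such as the freely downloadable MobileNet .mlmodel has been added to the project (Photos' own scene model is private):

```swift
import UIKit
import CoreML
import Vision

// Sketch of fully offline image classification via Core ML + Vision (iOS 11+).
// Assumes a MobileNet.mlmodel file has been added to the app target, which
// makes Xcode generate the `MobileNet` class used below.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("Best guess: \(top.identifier), confidence \(top.confidence)")
    }

    // Inference runs locally on the CPU/GPU; Airplane mode makes no difference.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```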
 
Wait! What?

So even if I want to keep my photos on my iPhone only, image recognition requires online access to work?

Didn't Craig explain in the keynote that iOS 10 photo analysis happens on the phone, which is possible because of the iPhone's powerful GPU?

So why are new images analysed only after internet access is enabled? Why is it required, and what exactly is sent to Apple's servers?

Does Apple actually know the contents of my photos, even if I want to keep them private on my phone? Is any of this explained in their Privacy Policy?

I want answers, too.
 
Not seeing this behaviour.

I put the phone into Airplane Mode and took a picture of a coffee cup. When I tapped the search icon in Photos, I saw an "Indexing" spinner at the bottom for a while (less than a minute). After that, searching for "cup" found the picture. I repeated this with a photo of a candle: same behaviour.

The OP may not be giving the phone enough time to do its indexing (I am assuming the algorithm looks at the whole photo library, so it could take some time on a large library; I only have 38 pictures on the device), and it only appears to work when online because Airplane Mode is perhaps being turned off before indexing is complete. The test is: put the phone into Airplane Mode, take a picture, then leave the phone alone for a few minutes and see if search works. And/or tap the search button and leave the phone alone for a while.
 
It doesn't find anything, or identifies things incorrectly. I took a photo of a book today. I tried to find it by searching "book", and the first photo it found was a headshot of me, but I don't feel I'm a book... I also tried "fruit", but it didn't show me any fruit (only a photo named fruitcake). Now I'm wondering what photos it will show if I search for the word "banana"...
 
I thought this image recognition only ran when the phone was charging. Also, it doesn't send your photos to Apple...
 
Apple said at WWDC 2016 that it would be using a technique known as "differential privacy" to comb through users' data while keeping them anonymous. This includes techniques like hashing, subsampling, and noise injection to scramble the data. This makes it difficult, in theory, to ever trace information back to an individual user, while still providing Apple's computer scientists with workable datasets with which to train their deep learning tools.

Source: The Verge
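To make "noise injection" concrete, here is a toy randomized-response sketch. This is not Apple's actual algorithm (which has not been published in full detail), just the general idea of how a server can learn aggregate statistics without being able to trust any single report:

```swift
import Foundation

// Toy "randomized response" noise injection. Each user sometimes reports a
// random answer instead of the truth, so no individual report is reliable.
func noisyReport(truth: Bool, flipProbability: Double = 0.25) -> Bool {
    if Double.random(in: 0..<1) < flipProbability {
        return Bool.random()
    }
    return truth
}

// With enough users, the collector can still estimate the true rate:
// observedRate = trueRate * (1 - p) + 0.5 * p, which it can solve for the
// true rate, while any single user retains plausible deniability.
let reports = (0..<10_000).map { _ in noisyReport(truth: true) }
let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
print("Observed rate: \(observed)")  // about 0.875 when every user's truth is true
```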
 
It looks like my original assumption was wrong.

In my original tests I waited maybe 60-120 seconds for the image to be processed. It was not.

Today I tested with 7 additional pictures (running shoes, a flower, a magazine, a wine bottle, cherry tomatoes, and 2 plants), waiting 4-5 minutes after each. The running shoes, flower, wine bottle, and one of the plants were recognized as categorized items even in Airplane mode. The other 3 pictures (the magazine, the cherry tomatoes, and one plant) were not detected as anything (that I could find a name for) even in online mode.

So unless iOS is sending data in Airplane mode, it does in fact seem that the metadata detection works on the device! That is good news!
 
Thanks for investigating & clearing this up.

Now I don't need to worry that Apple might know about my secret squirrel fetish!
 
I am still a bit unsure about the internals of this functionality.

I'd love to test this with a fresh iPhone that hasn't yet been linked to any Apple account.

Next time you get a new iPhone: launch it, do not enter any Apple account, go into Airplane mode (and/or do not insert a SIM or enter WLAN details), then take multiple test images of different objects and see if it finds a category for them.

It might be that my device and profile have already been "tuned into" my kind of photos with some guidance from the server side, based on data that was passed in earlier.

Anyone getting a new device soon, please test this and report here.
 
I can try this with my iPhone 7 when I reset & erase it (upon getting the new "iPhone 8"), before selling it.

The 7 will become a "fresh" iPhone at that point. I'll then take photos of some objects, with no internet connection whatsoever, and see what happens.

I need to be certain that all the squirrel stuff is not compromised or leaking anywhere!!
 