
LouisvillePharmacist

macrumors newbie
Original poster
Nov 29, 2022
Hello all,

This is about the keywords automatically generated by the Photos AI (e.g. dog, beach, food, car), not the manually added keywords.

When I search for one of those keywords, I get different results on my iMac than on my iPhone, even though iCloud Photos is turned on on both devices.

For example, when I search for "cheese" I get three results on the iPhone, but only one on the iMac.

Does iCloud not share this metadata between devices, or is it generated separately on each device?

Thanks!
 
Does iCloud not share this metadata between devices, or is it generated separately on each device?
Interesting. This got me looking at my photos, and also at images outside the macOS Photos library, which are searchable in Finder (edit: on Ventura). Like you, I get slightly different results for almost everything. I started with "bird" and "waterfall".

I prefer to think of these as search results rather than keywords (that may be splitting hairs). The index is generated on each device and, I suspect, regenerated after macOS/iOS updates. I think the differences reflect the nondeterministic nature of the visual-recognition AI.

There are also differences between searching for images in the Photos library and searching for images outside it. As a lighthearted example, macOS Photos returns this image when searching for "bird", but macOS Finder (like iOS Photos) does not:
[Attached image: IMG_3944.jpeg]
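Since Finder's image search is backed by Spotlight, you can poke at what it has indexed from the command line with the built-in `mdls` and `mdfind` tools. A rough sketch (macOS only; the file path is illustrative, and grepping for "scene" is an assumption, since the exact classification attribute names vary by macOS version):

```shell
# Dump all Spotlight metadata Spotlight has recorded for one image
# (path is an example, substitute your own file)
mdls ~/Pictures/IMG_3944.jpeg

# Look for any scene-classification labels, if the OS exposes them
# (attribute names are version-dependent; this grep is a guess)
mdls ~/Pictures/IMG_3944.jpeg | grep -i scene

# Roughly what a Finder search for "bird" among images runs
mdfind -onlyin ~/Pictures "bird kind:image"
```

Comparing `mdfind` output against a Photos search for the same term makes it easier to see that the two searches draw on separately generated indexes rather than shared keywords.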



It's probably a privacy thing where the metadata is generated on each device and not shared.
I don't think it is a privacy thing. Rather, each device is getting the best results its own recognition model can produce.
 
Last edited: