Which is funny, because Google Photos was already very similar to Apple's iPhoto for OS X from 10 years ago.
It recognized and categorized faces?
Proof please.
Open iPhoto? We've been using that feature for many years.
Yes.
Edit: Since 2010.
Like this http://www.macworld.com/article/2049356/how-to-use-iphotos-faces-and-places-features.html which states it started in 2009?
Psst, yes it did ;)
Well, I think we should be clear here: iPhoto, which at the time was a Mac-only product, had facial recognition. I don't believe the mobile product did. However, the mobile product did capture locations. iPhoto for iOS was only introduced in 2012 (http://www.macworld.com/article/1165729/iphoto_for_ios_arrives_on_the_app_store.html).
Agreed, I would like to hear from others who have experienced both apps. Given the same pictures in both applications, Google Photos groups the faces of all the different people nearly perfectly. Apple's Photos app seems to struggle with this; for example, I have to "merge" the same person as many as a dozen times because it seems unable to match them. Anyone else using both who can speak to this?
Let's leave it at this: the competition between Google and Apple is making both products better. Now Apple just needs to find a way to better handle the low-res versions of photos following iCloud library integration. Even those low-res versions (of 25,000 photos, admittedly) were enough to push my phone past its storage limit. There needs to be a way to limit the number of low-res photos stored on the device, or to somehow allow large libraries without needing huge on-device storage (though bumping the minimum to 32 GB helps). I would love to have all my photos with Apple, but I cannot, so I have to use Google Photos for my long-term storage.
That's the tradeoff for having this done on the device as opposed to on Google's servers. Obviously, server-based recognition can bring a lot more processing power to bear than the device can. That said, I don't think it will be a fair comparison until you do some merging and "teach" the on-device neural network to recognize those faces. I suspect that recognition rates will improve over time (at least that's the Apple spin!)
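The "teach it by merging" idea can be sketched roughly like this: assume each detected face is reduced by a neural network to a small embedding vector, nearby embeddings are grouped automatically, and a user-driven merge confirms that two groups are the same person. Everything below (the vectors, the threshold, the helper names) is illustrative, not Apple's or Google's actual pipeline.

```python
import math

# Hypothetical face embeddings: in a real system a neural network
# produces these vectors; here they are hand-made for illustration.
faces = {
    "img1": [0.90, 0.10],
    "img2": [0.88, 0.12],  # close to img1 -> likely the same person
    "img3": [0.10, 0.95],  # far away -> a different person
}

def cluster(faces, threshold=0.2):
    """Greedy clustering: a face joins the first cluster whose
    representative embedding is within `threshold`, else it starts
    a new cluster."""
    clusters = []  # list of (representative_embedding, [image names])
    for name, emb in faces.items():
        for rep, members in clusters:
            if math.dist(rep, emb) < threshold:
                members.append(name)
                break
        else:
            clusters.append((emb, [name]))
    return [members for _, members in clusters]

def merge(groups, i, j):
    """A user-driven 'merge': the user confirms clusters i and j are
    the same person, so their members are combined. A real system
    would also feed this label back into future matching."""
    groups[i] = groups[i] + groups[j]
    del groups[j]
    return groups

print(cluster(faces))  # [['img1', 'img2'], ['img3']]
```

The point of the sketch: a conservative (low) threshold splits one person into several groups, which is exactly the "merge a dozen times" experience described above; each merge is a training signal the matcher did not have before.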
I was surprised to see this task being processed on the device.
Apple's big thing this year (not surprising given the controversy over the San Bernardino terrorist's phone) is privacy. They want to do as much as possible, data-analysis-wise, on the device behind the Secure Enclave. If they have to do stuff on their servers, they'll anonymize the data as much as possible. Now, one could say they're interested in protecting your privacy. A cynic, however, might say they realize they are way behind in the data-analysis game and are introducing this as a way of trying to level the playing field with consumers.
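For context on the "anonymize on the server" part: around this time Apple was publicly talking about differential privacy. A classic building block of that field is randomized response, sketched below purely to illustrate the idea; this is not Apple's implementation, and all names and parameters here are made up.

```python
import random

def randomized_response(true_bit, p_truth=0.75):
    """With probability p_truth, report honestly; otherwise report a
    fair coin flip. Any single report is deniable, yet the population
    rate is still recoverable in aggregate."""
    if random.random() < p_truth:
        return true_bit
    return random.randint(0, 1)

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the noise model: observed = p_truth*true + (1-p_truth)*0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(42)
true_bits = [1] * 300 + [0] * 700  # real rate: 30%
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_rate(reports), 2))  # should land close to 0.3
```

The design tradeoff mirrors the thread: the server learns accurate aggregates without ever holding a trustworthy individual record, at the cost of needing many users and some accuracy.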
Fair enough; it just seems like they can't maintain that for much longer. Offloading tasks server-side so you never overload the device is the future for mobile computing in general, really. I get it for the fingerprint, which has its own dedicated chip, but these pictures already reside in Apple's cloud ecosystem for most people anyway, so why not just process them there?
Off the top of my head: fappening 2.0, with enhanced face recognition and video slideshows. It wasn't Apple's fault, but they got blamed (or thanked) anyway.
Absolutely not. Demand is for greater privacy, not less.
Understood; however, the photos still do reside on their cloud servers in this case. That being the case, we have to wonder why they're pushing the processing off to the client. I wonder if that's why it's not as accurate: it may take a while to properly match them up.
The photos reside on the servers, but not the identity of the person or their relationship to you. That is what I think they are pushing.