
dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
Do the same digging on their competition and I think you'll find the grass isn't even growing on the other side. Always good to approach everything with a bit of skepticism. I read their published material, I researched the organizations that deal with the content and manage the databases, and I remembered who has the most open gov't contracts (not Apple). I also read information published by security and privacy gurus, but always with the question of what they would stand to gain by supporting one side versus the other. I do think Apple botched this reveal and was counting on a much different public reaction; it was good comedy.

1. We're the user so we cannot prevent that unless we somehow gain control of those databases. If you're so inclined, become an employee to help maintain oversight.
2 & 3. I haven't seen the reasoning behind the data threshold; possibly it comes from studies of Pinterest/Instagram parents versus pedophiles and their devices. 🤷‍♂️ Maybe Apple doesn't want to overburden NCMEC with potential false positives, so they act as a middleman to limit backlogs. My guess is that when Apple implemented in-transit encryption the CSAM hits were reduced, and that raised flags with NCMEC, who probably reached out to Apple to find a middle-ground solution. NCMEC End to End Encryption Statement
4. Shared globally yes, but each organization/agency can weigh in to help ascertain validity like a second or third opinion from doctors.
5. You don't notify a criminal they're about to be caught for doing whatever illegal activity it is so they have time to destroy evidence.
6. You can't compare hashes on encrypted data at the server level; the photos are encrypted in transit, so it would have to be on-device scanning to do the comparison (see the sketch after this list). Server-side would require unencrypted data transfers to the server, and your privacy and security experts would agree that is a terrible idea. Keeping the work on the device does help with end-user privacy.
7. It looks real at face value.
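To make point 6 concrete, here is a minimal toy sketch of the idea: the hash comparison can only run where the plaintext photo exists, so it happens on the device before encryption and upload. This is purely illustrative and uses a cryptographic hash for brevity; Apple's actual design uses a perceptual NeuralHash, a blinded database, and encrypted safety vouchers, none of which are modeled here. All names and hash values below are hypothetical.

```python
# Toy sketch only -- NOT Apple's protocol. Illustrates why matching must run
# where the plaintext exists: hashing ciphertext would never match anything.
import hashlib

# Hypothetical database of known-image hashes (placeholder values).
KNOWN_HASHES = {"e3b0c44298fc1c149afbf4c8996fb924...", "a54d88e06612d820bc3be72877c74f25..."}

def photo_hash(plaintext: bytes) -> str:
    """Hash the unencrypted photo bytes."""
    return hashlib.sha256(plaintext).hexdigest()

def scan_and_upload(plaintext: bytes, encrypt) -> tuple[bytes, bool]:
    """On-device step: compare before encrypting, then upload only ciphertext."""
    matched = photo_hash(plaintext) in KNOWN_HASHES  # runs against the plaintext
    ciphertext = encrypt(plaintext)                  # the server only ever sees this
    return ciphertext, matched                       # real system: an encrypted voucher
```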

A totally acceptable choice you're currently allowed to make, and hopefully that choice is never forced for you later on down the road.


It may be just "some privacy" to you, but to others their location may be total privacy. To each their own. In terms of Apple versus the competition, I am turning more to Apple than my previous position of MS. I am just tired of MS's terrible OS. As an outsider looking in, I liked how the ecosystem all worked together versus hunting down random drivers and different versions of software. I will also never use an Android device after having had to work with them in my previous job: buggy software, poor-quality devices, and it's too open for my liking. YMMV


Yes, anything that detracts from their stovepipe of ideas and feelings. You can also suppress posts by burying them under more posts that scream counter-ideas; not everyone peruses each thread, so if you can hide the informative post with groupthink it accomplishes the same thing, albeit less effectively than removal.

I’ll start with just your first couple of sentences. I’ll do a bit more later as work allows…. Yeah, all is not green.

I went into this not knowing that CSAM scanning was happening anywhere. As I learned, what was alluded to and what was actually happening differed by a lot (in scale). I am leaving out data scanning for ads. That is another issue and varies widely.

If Apple implements this, they will jump from the least intrusive to the most intrusive (except for maybe FB) and do it in a most broken fashion based on what we know so far.

Excluding FB, all the others that scan do so when items are shared. Not on upload or download. None of these, including FB, scan on device.

In the end though, this isn’t about CSAM. It is about the tools being used.
 

Serban55

Suspended
Oct 18, 2020
2,153
4,344
No they are not. They don't scan on device.
Then you are not a developer... look at how the open-source Android smartphones work: everything is managed on the phone the moment you go "online".
At the hardware level, Huawei was the worst 2 years ago.
 

Serban55

Suspended
Oct 18, 2020
2,153
4,344
I think this topic can be closed; by now the OP has made up his mind. He left or not, his decision.
12 pages for this kind of topic is already too much; it's like people have nothing else to do :)
 
  • Angry
Reactions: Euronimus Sanchez

jz0309

Contributor
Sep 25, 2018
11,424
30,123
SoCal
Excluding FB, all the others that scan do so when items are shared. Not on upload or download. None of these, including FB, scan on device.
About 6 or so weeks ago I decided to back up my data to OneDrive, and I ended up putting ~75k photos up there, NOT sharing with anyone (though I do have some other folders that I do share) ... about 2 weeks later I started to receive emails from OneDrive, "Your memories on this day ...", with a set of photos that were taken on this particular calendar day in different years ... So, they ARE scanning content, and quite frankly I have no idea what else they are scanning for, but I assume CSAM is one of them, and I'm ok with that. But scanning my photos and then creating email alerts, that IS shady, and while it might be covered in the T&Cs, I do not appreciate it.
 
  • Like
Reactions: dk001

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,564
3,126
Then you are not a developer... look at how the open-source Android smartphones work: everything is managed on the phone the moment you go "online".
At the hardware level, Huawei was the worst 2 years ago.
You need more evidence than "I am not a dev." And Huawei is not Android.
 
  • Like
Reactions: Mendota

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
... but they do pull all your data online, and then scan.
They do not "pull" anything. It is the user's choice as to what they choose to put on public cloud servers. It is also users' choice as to who they choose to browse with. Yes, the internet is public, and there are a lot of prying and watching eyes. We know that. But when I choose to "go home" as it were to my own personal devices, I don't want watching eyes there.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
No I didn't make it up, it's literally in their terms of service when you sign up. Saying "nuh uh" without actually doing any research whatsoever is incredibly silly. I don't make things up for internet points. They are scanning all of your documents looking for things that sound like potential violence, and they are also scanning for CSAM; it's already known they've been doing that for almost a decade. They scan everything in OneDrive looking for particular things, and they will remove and report content if it is in violation of their TOS. If you want to remain blissfully ignorant, that's on you. Some of you guys need to wake up to what's actually happening. It's not conspiracy theories.
I have and do read all terms of service for all products I use, including Apple, Google, and Microsoft. Again, THEY DO NOT DO DEVICE SCANNING! If it is not on their public servers, it is unknown to them.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
About 6 or so weeks ago I decided to back up my data to OneDrive, and I ended up putting ~75k photos up there, NOT sharing with anyone (though I do have some other folders that I do share) ... about 2 weeks later I started to receive emails from OneDrive, "Your memories on this day ...", with a set of photos that were taken on this particular calendar day in different years ... So, they ARE scanning content, and quite frankly I have no idea what else they are scanning for, but I assume CSAM is one of them, and I'm ok with that. But scanning my photos and then creating email alerts, that IS shady, and while it might be covered in the T&Cs, I do not appreciate it.
There is a bit of confusion here: the term "sharing" is defined here as uploading to the cloud itself. That is what we mean by sharing. This is separate from actual sharing with others. So yes, again, anything that is uploaded to public servers, OneDrive, Dropbox, Google Drive, etc., is scanned by those companies for CSAM. There is nothing "shady" about it; they are just going by dates. They think people like these features, but you can turn it off.
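For what it's worth, the "memories" emails only require grouping photos by the calendar day of their capture date, with no content analysis at all. Here is a minimal sketch of that kind of grouping, with hypothetical field names; this is not OneDrive's actual API or data model.

```python
# Sketch of date-based "memories" grouping: bucket photos by (month, day) of
# the capture date so the same calendar day across years surfaces together.
# Field names are hypothetical; this is not OneDrive's implementation.
from collections import defaultdict
from datetime import date

library = [
    {"name": "beach.jpg", "taken": date(2019, 9, 13)},
    {"name": "hike.jpg",  "taken": date(2020, 9, 13)},
    {"name": "party.jpg", "taken": date(2021, 3, 2)},
]

def memories_for(today: date, photos: list) -> list:
    """Return photo names taken on this month/day in any year."""
    by_day = defaultdict(list)
    for p in photos:
        by_day[(p["taken"].month, p["taken"].day)].append(p["name"])
    return by_day[(today.month, today.day)]

print(memories_for(date(2021, 9, 13), library))  # ['beach.jpg', 'hike.jpg']
```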
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
You on the other hand appear to take Apple at what it says and are willing to hope for the best and wait till after this launches or Apple gives some more information.

That's the only logical thing to do unless you can prove they lie most of the time or something. Plus I see no reason for them to lie here.

1. How are you going to prevent repurposing of this system? You can’t “just say no” and “follow the law”.

Apple controls the system. Just like many of their systems, if they wanted to, they could abuse it themselves or allow others to abuse it. So I'm not sure how they can possibly prove the future to you.

2. Why such a high threshold for notification? This “30” indicates a potential serious concern with the incidence of false positives.

I've answered this elsewhere and used the analogy of backup systems. Does Boeing (or Airbus or any other aircraft company) installing backup hydraulic systems in their airplanes indicate a serious concern with their primary hydraulic systems? Of course not. It's simply a safety net. Another example I used was a seasoned climber using safety equipment. Does that mean they doubt their skills? Again, of course not.

To me, Apple setting the threshold to 30 is their way of basically saying they want to be sure that there will never be an account falsely flagged (because the odds are so incredibly small). And as I've also mentioned before, I doubt there are many people with collections of CSAM under 30 images.
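As a quick worked example of why a threshold changes the false-flag math so dramatically: the per-image false-positive rate and library size below are assumptions for illustration only, not Apple's published figures (Apple's stated account-level figure was on the order of one in a trillion per year). The point is just the shape of the curve, using a standard Poisson approximation.

```python
# Illustrative only: assumed numbers, not Apple's. Shows how requiring many
# independent matches collapses the probability of a purely accidental flag.
from math import exp, factorial

def p_accidental_flag(n_photos: int, p_false: float, threshold: int) -> float:
    """Poisson-approximate P(at least `threshold` false matches in a library)."""
    lam = n_photos * p_false  # expected number of false matches
    return sum(exp(-lam) * lam**k / factorial(k)
               for k in range(threshold, threshold + 60))

P_FALSE = 1e-6      # ASSUMED per-image false-positive rate
LIBRARY = 100_000   # ASSUMED photo library size

print(p_accidental_flag(LIBRARY, P_FALSE, 1))   # ~0.095: one stray match is plausible
print(p_accidental_flag(LIBRARY, P_FALSE, 30))  # ~3e-63: effectively impossible
```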

3. Why is Apple sticking itself in the middle (verification at 30) instead of just handing it over to NCMEC who deal with this?

Isn't that precisely what they said they will do?

[Screenshot attachment]


4. A second source for verification of a match comes from where? CSAM databases are shared globally (source ICMEC).

I'm not sure what you're asking here. But flagged images will be manually reviewed. There will be no doubt whether they're CSAM at that point.

5. Why no notification to the user regarding matches at all?

What would be the point of that? "Hey, we just wanted to let you know that we suspect you may be in possession of child porn, so hurry up and delete everything before the cops see it!" Again, if it's false positives you're concerned about, they will easily be dismissed under manual review - and it's nearly unthinkable that someone could have 30 false positives to begin with.

6. Why on the device instead of server side when server side is a better option to clean the iCloud up?

This has been covered like 10 billion times already. On device scanning is hidden from Apple's eyes; server-side is not. As far as I can tell, their goal is not to drop a nuclear bomb to eradicate every last trace of CSAM by scanning the billions of photos currently on iCloud, but to thwart the further spread of CSAM on iCloud.

7. Is this a real system to be launched or is this just a pr move?

I'm sorry, but this is a silly question. If it were the latter, do you honestly think they're going to give you a straight answer?

Lots more.

Really? Or is that just what people say when they run out of questions but want to give the impression that the situation is more confusing than it is? LOL!

Apple half-assed answered a couple of these, but even those answers have gaps.

From reading your posts, I get the feeling that no matter what Apple says for any of these questions, you'll tear it apart - even if all you can say is, "Well, I think they're lying." It seems they simply can't win with some of you. You've already convicted and hanged them in your mind without a proper trial.
 
Last edited:

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Once again you read into it in an effort to rationalize your viewpoint.
You do that in an effort to claim it false when you know it isn’t.
You just don’t like it.

Are you kidding me? I'm not reading into anything - I'm simply going with what Apple has said in black and white. YOU'RE the one reading into things (or perhaps haven't even read, or have forgotten, what Apple has actually stated on this topic).

To call this spyware is absurd on every level.
 
  • Like
Reactions: artfossil

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
About 6 or so weeks ago I decided to back up my data to OneDrive, and I ended up putting ~75k photos up there, NOT sharing with anyone (though I do have some other folders that I do share) ... about 2 weeks later I started to receive emails from OneDrive, "Your memories on this day ...", with a set of photos that were taken on this particular calendar day in different years ... So, they ARE scanning content, and quite frankly I have no idea what else they are scanning for, but I assume CSAM is one of them, and I'm ok with that. But scanning my photos and then creating email alerts, that IS shady, and while it might be covered in the T&Cs, I do not appreciate it.
Apples and Biscuits. IMO.

Scanning using AI to lump together faces, based on backgrounds or photo metadata, is far different from scanning for illegal content against a known database in order to notify authorities. With that first kind of scanning the clouds do, the output is sent back to you as the "See what we did? Vacation Memories!!" feature. But not the CSAM notifications.

This function is in the T&C/EULA. I remember seeing it for Google. I know in Google you can turn it off - I did. I do not know about OneDrive. I only store documents / work stuff there. Much of it encrypted.

It is because of functions like this, I suspect, that many are not concerned about CSAM scanning in the cloud. Or rather, they accept it. Now that I know these sites are doing it, I am wondering what else they scan for that I am not aware of. Happy with it? Not really, and I am looking at my options; it's something I will pay more attention to going forward. I have recently started using iDrive and will likely move much of my stuff there. Also looking at Proton Drive.

But scan on device with all results hidden from me AND the ability to notify authorities? I am liking that less and less the more I learn.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
That's the only logical thing to do unless you can prove they lie most of the time or something. Plus I see no reason for them to lie here.



Apple controls the system. Just like many of their systems, if they wanted to, they could abuse it themselves or allow others to abuse it. So I'm not sure how they can possibly prove the future to you.



I've answered this elsewhere and used the analogy of backup systems. Does Boeing (or Airbus or any other aircraft company) installing backup hydraulic systems in their airplanes indicate a serious concern with their primary hydraulic systems? Of course not. It's simply a safety net. To me, Apple setting the threshold to 30 is their way of basically saying they want to be sure that there will never be an account falsely flagged (because the odds are so incredibly small). And as I've also mentioned before, I doubt there are many people with collections of CSAM under 30 images.



Isn't that precisely what they said they will do?

[Screenshot attachment]



I'm not sure what you're asking here. But images will be manually reviewed. There will be no doubt whether they're CSAM at that point.



What would be the point of that? "Hey, we just wanted to let you know that we suspect you may be in possession of child porn, so hurry up and delete everything before the cops see it!" Again, if it's false positives you're concerned about, they will easily be dismissed under manual review - and it's nearly unthinkable that someone could have 30 false positives to begin with.



This has been covered like 10 billion times already. On device scanning is hidden from Apple's eyes; server-side is not. As far as I can tell, their goal is not to drop a nuclear bomb to eradicate every last trace of CSAM by scanning the billions of photos currently on iCloud, but to thwart the further spread of CSAM on iCloud.



I'm sorry, but this is a silly question. If it were the latter, do you honestly think they're going to give you a straight answer?



Really? Or is that just what people say when they run out of questions but want to give the impression that the situation is more confusing than it is? LOL!



From reading your posts, I get the feeling that no matter what Apple says for any of these questions, you'll tear it apart - even if all you can say is, "Well, I think they're lying." It seems they simply can't win with some of you. You've already convicted and hanged them in your mind without a proper trial.

Busy day so I’ll get back to you later.
Thanks for taking the time to address each question.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
Are you kidding me? I'm not reading into anything - I'm simply going with what Apple has said in black and white. YOU'RE the one reading into things (or perhaps haven't even read, or have forgotten, what Apple has actually stated on this topic).

To call this spyware is absurd on every level.

Read it a couple of times, keeping up with @Jayson A's posts.

You are selectively placing the term spy/spyware in a context that supports your argument. You are taking what Apple says as gospel, glossing over the gaps, the contradictions, and claiming “this is the proof”.

More power to you. You can do it. For many of us (including security/privacy/academic/technical/users) we want the complete picture. Not a marketing play mixed up with bits and pieces of “supportive” semi-technical papers.
 

Sciomar

macrumors 6502a
Nov 8, 2017
559
1,737
As I said in, I think, the 2nd or 3rd thread on this topic, the social court has made its ruling based on emotion, influencer hype, and hypothetical what-ifs. Apple publicly stated they're going to implement client-side CSAM scanning rather than burying it deep in the terms of service like most other companies that do it, or hiding it and hoping we never find out like big brother likes to do; that was wonderful transparency. The moment end-to-end encryption became a thing, the advocates for groups like NCMEC started screaming because that meant there would be less reporting from the CSAM scanning companies had implemented. Would you rather deal with a company that tells you they're going to do it or a company that hopes you never find out they're doing it?
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
There is a bit of confusion here: the term "sharing" is defined here as uploading to the cloud itself. That is what we mean by sharing. This is separate from actual sharing with others. So yes, again, anything that is uploaded to public servers, OneDrive, Dropbox, Google Drive, etc., is scanned by those companies for CSAM. There is nothing "shady" about it; they are just going by dates. They think people like these features, but you can turn it off.

One item that I am unsure of: scanning for CSAM.
Think about how much is uploaded each minute onto Amazon servers. Would Amazon be able to actually scan this for CSAM? Or would they have to scan a much smaller subset?

Looking at this, I suspect the latter.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
As I said in, I think, the 2nd or 3rd thread on this topic, the social court has made its ruling based on emotion, influencer hype, and hypothetical what-ifs. Apple publicly stated they're going to implement client-side CSAM scanning rather than burying it deep in the terms of service like most other companies that do it, or hiding it and hoping we never find out like big brother likes to do; that was wonderful transparency. The moment end-to-end encryption became a thing, the advocates for groups like NCMEC started screaming because that meant there would be less reporting from the CSAM scanning companies had implemented. Would you rather deal with a company that tells you they're going to do it or a company that hopes you never find out they're doing it?
I don't know about you, but I was always aware that companies scanned for illegal content on their servers. And all companies have a EULA around their products and services; it is just that many people don't bother to read them. So no, it is not just Apple communicating; everyone does. I don't see what Apple is doing here as communication, I see it as rationalization. And anytime someone begins to rationalize, I take a second look.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
That's the only logical thing to do unless you can prove they lie most of the time or something. Plus I see no reason for them to lie here.
What? That makes no sense in any fashion. Apple states A/B/C. I have questions on the information regarding B/C and you say just accept it all as gospel? LMAO!!!! Not until I can get answers that satisfy me and settle the claims from all the professionals who say "this is a BAD thing!".

Apple controls the system. Just like many of their systems, if they wanted to, they could abuse it themselves or allow others to abuse it. So I'm not sure how they can possibly prove the future to you.
Where am I looking at the future except in response to an Apple answer?

I've answered this elsewhere and used the analogy of backup systems. Does Boeing (or Airbus or any other aircraft company) installing backup hydraulic systems in their airplanes indicate a serious concern with their primary hydraulic systems? Of course not. It's simply a safety net. To me, Apple setting the threshold to 30 is their way of basically saying they want to be sure that there will never be an account falsely flagged (because the odds are so incredibly small). And as I've also mentioned before, I doubt there are many people with collections of CSAM under 30 images.
Still doesn’t answer my question. Apple even claimed this would be the initial setting and would change as the system proved itself.

Yes and it makes little sense from a legal perspective. This reads more like a cya. That is why the question.

I'm not sure what you're asking here. But images will be manually reviewed. There will be no doubt whether they're CSAM at that point.
This is something @Jayson A pointed out when the topic first came up. Apple will use a second check to make sure something was not seeded into the original database. When I tried looking into who has the databases and their sources, this piece doesn't fit well.

What would be the point of that? "Hey, we just wanted to let you know that we suspect you may be in possession of child porn, so hurry up and delete everything before the cops see it!" Again, if it's false positives you're concerned about, they will easily be dismissed under manual review - and it's nearly unthinkable that someone could have 30 false positives to begin with.
Potential false positives have the ability to easily destroy someone's life even if they are innocent. If a system is generating false positives, I know I would like to be notified. Not just "surprise!". Apple feels it can happen. They even put an appeals process in place.

This has been covered like 10 billion times already. On device scanning is hidden from Apple's eyes; server-side is not. As far as I can tell, their goal is not to drop a nuclear bomb to eradicate every last trace of CSAM by scanning the billions of photos currently on iCloud, but to thwart the further spread of CSAM on iCloud.
BS. You can run the exact same process server side and catch ALL THE CSAM. On device you only catch the small bit in the event they leave the iCloud Photos feature on, and catch none of the items already in iCloud.

I'm sorry, but this is a silly question. If it were the latter, do you honestly think they're going to give you a straight answer?
You may think so but quite a few think otherwise. If this had not tanked from a marketing announcement perspective it would have been a great argument to “stay in the Apple Environment to be private”.

Really? Or is that just what people say when they run out of questions but want to give the impression that the situation is more confusing than it is? LOL!
Sad you feel that way. You asked for 2-3 and I served up a lot more, and you have done nothing other than play the same silly game of "but Apple says" and "that's silly/dumb/nobody asks that". Based on your answers, why would I put up more?

From reading your posts, I get the feeling that no matter what Apple says for any of these questions, you'll tear it apart - even if all you can say is, "Well, I think they're lying." It seems they simply can't win with some of you. You've already convicted and hanged them in your mind without a proper trial.
Feel what you want. You are entitled to your own opinion. Rather than really try to answer you just point and claim “you’ll never …”. I had hoped you could at least come up with something I had missed and possibly answer at least one item or add to my knowledge base. You have done neither.
 
Last edited:

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
As I said in, I think, the 2nd or 3rd thread on this topic, the social court has made its ruling based on emotion, influencer hype, and hypothetical what-ifs. Apple publicly stated they're going to implement client-side CSAM scanning rather than burying it deep in the terms of service like most other companies that do it, or hiding it and hoping we never find out like big brother likes to do; that was wonderful transparency. The moment end-to-end encryption became a thing, the advocates for groups like NCMEC started screaming because that meant there would be less reporting from the CSAM scanning companies had implemented. Would you rather deal with a company that tells you they're going to do it or a company that hopes you never find out they're doing it?

I wonder what changes will be made to the T&C/EULA for iOS/iPadOS/macOS to incorporate these new features. It will be there, and I am wondering how much legalese it will be shrouded in.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Read it a couple of times, keeping up with @Jayson A's posts.

You are selectively placing the term spy/spyware in a context that supports your argument. You are taking what Apple says as gospel, glossing over the gaps, the contradictions, and claiming “this is the proof”.

More power to you. You can do it. For many of us (including security/privacy/academic/technical/users) we want the complete picture. Not a marketing play mixed up with bits and pieces of “supportive” semi-technical papers.

The only definition I can think of the word "spy" that doesn't involve secrecy is from the children's game. "I spy with my little eye" where it's being used in the poetic sense of simply "noticing" or "seeing." That is so obviously NOT the niche definition you're intending. You're going for the emotional impact of the word "spying" or "spyware." It's so obvious and dishonest.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
The only definition I can think of the word "spy" that doesn't involve secrecy is from the children's game. "I spy with my little eye" where it's being used in the poetic sense of simply "noticing" or "seeing." That is so obviously NOT the niche definition you're intending. You're going for the emotional impact of the word "spying" or "spyware." It's so obvious and dishonest.

I posted it before.
I even recommended trying Webster's.
It can be used a number of ways.

 
  • Like
  • Love
Reactions: M5RahuL and Mendota

keeper

macrumors 6502a
Apr 23, 2008
520
303
So it seems people are unhappy with Apple scanning on device before data is encrypted for upload to iCloud.

But the same people are happy for it to be scanned on server.
But surely for that to happen Microsoft etc. must be decrypting your data on the server to perform the scan; that seems wrong to me… and worse than Apple.
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
This is something @Jayson A pointed out when the topic first came up. Apple will use a second check to make sure something was not seeded into the original database. When I tried looking into who has databases and the sources this piece doesn’t fit well.
The secondary on-server scan doesn't check whether a non-CSAM hash was injected into the system; it simply uses the actual image and compares it visually, using a perceptual hashing process, against a separate perceptual hash database. It's basically to rule out false positives that may have accidentally triggered a match (like the images of the dog that were forced collisions, using software to generate an image with the same hash). The image wouldn't visually match its CSAM counterpart, so it would be discarded before humans even step in.
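For anyone unfamiliar with perceptual hashing, here is a minimal sketch of a generic "average hash" and a Hamming-distance comparison. It is not Apple's NeuralHash or their second independent hash, just an illustration of why an image that merely collides on one hash function but looks nothing like the target tends to fail a second, different perceptual comparison. The file names and the distance tolerance are hypothetical.

```python
# Generic average-hash sketch (illustrative; not Apple's algorithm): visually
# similar images produce hashes with a small Hamming distance, so an artificial
# collision on one hash generally won't also be close under an independent one.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """64-bit hash: shrink to 8x8 grayscale, set a bit per above-average pixel."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > avg)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits; smaller means more visually similar."""
    return bin(a ^ b).count("1")

# Hypothetical usage (file names invented for illustration):
# dist = hamming(average_hash("flagged_upload.jpg"), average_hash("reference.jpg"))
# plausible_match = dist <= 10   # assumed tolerance, purely illustrative
```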
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
So it seems people are unhappy with Apple scanning on device before data is encrypted for upload to iCloud.

But the same people are happy for it to be scanned on server.
But surely for that to happen Microsoft etc. must be decrypting your data on the server to perform the scan; that seems wrong to me… and worse than Apple.
It only seems worse to you because you want to defend and justify what Apple is doing. If I go to the airport to get on a plane, I accept the security procedures. It is a public place; however, I do not accept someone coming into my home to look for contraband. This is fundamental to personal rights, and that is why we place such a premium on the sanctity of our homes and possessions. It is why we require our police to have warrants to come into our homes. I am not interested in giving Apple a pass that is not even afforded to the police and national security.
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
It only seems worse to you because you want to defend and justify what Apple is doing. If I go to the airport to get on a plane, I accept the security procedures. It is a public place; however, I do not accept someone coming into my home to look for contraband. This is fundamental to personal rights, and that is why we place such a premium on the sanctity of our homes and possessions. It is why we require our police to have warrants to come into our homes. I am not interested in giving Apple a pass that is not even afforded to the police and national security.
Bad analogy. Your device literally does nothing until you try to upload pictures to the cloud. The on-device hashing system is completely blind, and your device knows nothing about whether a match has been found or not; those vouchers are then uploaded to Apple with your photo upload, and only THEN does Apple know if it's a match. So literally, this system REQUIRES iCloud to work. No iCloud, no upload, no scan, no vouchers. It's really that simple.

So to update your analogy: it's like arriving at the airport and putting your bag in the scanner before you get on the plane. Your iPhone is the bag, the scanner is... well, you know, and the plane is iCloud.
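To put the described flow into a toy sketch (this is not Apple's implementation, just an illustration of the gating being claimed): the voucher-generating step lives inside the upload pipeline, so with iCloud Photos disabled that code path never runs and nothing is produced. All names below are hypothetical.

```python
# Toy illustration of the described gating, not Apple's code: vouchers are only
# generated as part of an iCloud Photos upload; no upload, no vouchers.
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    payload: bytes  # opaque to the device; only the server can interpret it

def make_voucher(photo: bytes) -> SafetyVoucher:
    """Blind matching step: the device cannot read the result it produces."""
    return SafetyVoucher(payload=b"<encrypted match metadata>")

def sync_library(photos: list, icloud_photos_enabled: bool) -> list:
    if not icloud_photos_enabled:
        return []  # scanning path never executes
    uploads = []
    for photo in photos:
        voucher = make_voucher(photo)               # generated only at upload time
        uploads.append((b"<ciphertext>", voucher))  # photo encrypted for transit
    return uploads
```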
 
  • Like
Reactions: usagora and keeper