What if, with the glasses, Apple's main long-term goal is to ditch support for monitors/screens altogether and force users to buy the glasses as the only supported video output device?
The glasses are going to be a FaceTime version of the Jedi Council meetings from those awful Star Wars prequels: transparent people sitting on empty couches that you can FaceTime with. Even worse, they'll be infantilised Memoji versions of people, because sending animation data for a model takes less bandwidth than sending video.
There's no technology path on the horizon for opacity-capable eyescreens that aren't based on passthrough video, and retina-quality passthrough video isn't going to fit in anything but a bulky specialist device any time soon.
Welders can wear a welding helmet all day to do welding; office workers won't wear one to do office work. Likewise, 3D modellers and specialist media producers can wear a bulky HMD. General computing users won't.
I personally thought it was more likely that Apple's "glasses" would put extra (translucent) screens around your existing device screens (the physical screen can hide a keying pattern at a subliminal refresh rate). So you look at your Apple Watch, and the glasses put a grid of eight AR watch screens around the real one, letting you swipe inwards to switch which app is active in the middle. Same for the phone, and for the Mac: palettes for your media apps becoming floating, hand-addressable windows outside your screen, for example. It wouldn't work for a colour picker, which needs opacity, but that could be virtually attached to your phone: when you want to pick a colour, you look at your phone, whose screen becomes a colour panel, and the glasses bloom controls off to the side for interaction.
It's going to be a thing to increase the value of your existing screen, not replace it.