For years I've been following a set of disparate technologies that, combined with the direction Apple has taken on public products like iPhone (Face ID, LiDAR), Apple Watch (miniaturization, display, battery efficiency), Mac (Apple silicon) and iOS (AR), are finally coming together into a product unlike anything on the market today. Very much like iPhone did in 2007.
Predictions:
These are going to be smaller than most people here are expecting. Apple will do away with traditional optical approaches and deliver sharpness in software.
TrueDepth sensors, made commonplace by Face ID, will measure the distance from the display to each eye and the shape of the eye. An infrared camera will track the direction and dilation of the pupils to know where the user is looking, including in depth, since pupil dilation and convergence change with where the eyes are focusing. Knowing the exact distance enables the software to offset distortion in the opposite direction of a user's vision.
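For a sense of how much of this measurement stack already ships, here's a minimal sketch using ARKit's face tracking on a TrueDepth iPhone. The per-eye transforms and gaze point are real API; the `applyCorrection` hook at the end is my own hypothetical placeholder, and note that ARKit does not currently expose pupil dilation, only per-eye pose and gaze.

```swift
import ARKit
import simd

/// Minimal sketch: read eye distance and focus depth from ARKit face tracking.
final class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Distance from the device (TrueDepth camera) to the left eye, in metres.
        let eyeWorld = face.transform * face.leftEyeTransform
        let eyePosition = SIMD3(eyeWorld.columns.3.x, eyeWorld.columns.3.y, eyeWorld.columns.3.z)
        let cameraPosition = SIMD3(frame.camera.transform.columns.3.x,
                                   frame.camera.transform.columns.3.y,
                                   frame.camera.transform.columns.3.z)
        let eyeToDisplay = simd_distance(cameraPosition, eyePosition)

        // Point the eyes converge on, in face space; its length is a rough proxy
        // for how far away the user is focusing.
        let focusDistance = simd_length(face.lookAtPoint)

        applyCorrection(eyeToDisplay: eyeToDisplay, focusDistance: focusDistance)
    }

    /// Hypothetical hook where a headset's renderer would re-derive its software
    /// sharpness correction from the latest measurements.
    func applyCorrection(eyeToDisplay: Float, focusDistance: Float) { /* renderer-specific */ }
}
```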
There'll be a standard eye test at setup: a slider, perhaps controlled by an Apple Watch-like Digital Crown, will be adjusted until the user can see every line sharply. I expect this technology to appear on iPhones and iPads (and future Macs with a TrueDepth array).
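A purely hypothetical sketch of what that setup test could look like in SwiftUI: the user drags a slider (or turns a Digital Crown) until the chart looks sharp, and the resulting value is stored as a software correction. `EyeCalibrationView`, `TestPatternView`, the value range, and the storage key are all my inventions, not anything Apple has shipped.

```swift
import SwiftUI

/// Hypothetical setup eye test: adjust until every line is sharp, then store the value.
struct EyeCalibrationView: View {
    // Diopter-like correction value; the range and step are placeholders.
    @State private var correction: Double = 0.0

    var body: some View {
        VStack(spacing: 24) {
            TestPatternView()
            Slider(value: $correction, in: -4...4, step: 0.25)
            Text("Adjust until every line is sharp")
            Button("Done") {
                // A real device would feed this into the render pipeline,
                // not just persist it locally.
                UserDefaults.standard.set(correction, forKey: "eyeCorrection")
            }
        }
        .padding()
    }
}

/// Stand-in for the test chart: rows of letters in decreasing sizes.
struct TestPatternView: View {
    var body: some View {
        VStack(spacing: 12) {
            ForEach(0..<5) { row in
                Text("E F P T O Z")
                    .font(.system(size: CGFloat(40 - row * 6), weight: .bold, design: .monospaced))
            }
        }
    }
}
```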
This digital optical variability will enable dynamic adjustments many times per second, solving the edge blur that existing VR headsets suffer as the set shifts on your face. It may also address the motion sickness some users experience with VR headsets. Both are important problems to solve if Apple is going to bring this segment into the mainstream.
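As a sketch of what "adjusting in software" could mean per frame: VR runtimes already pre-warp the rendered image to cancel lens distortion, so the speculative part is only that the coefficients would be re-derived continuously from the measured eye distance and the stored correction. The radial polynomial below is the standard form of that pre-warp; the `forFrame` mapping is a placeholder of my own, not real optics.

```swift
import simd

/// Per-frame radial pre-warp, the same family of correction VR compositors apply today.
struct PreWarp {
    /// Radial distortion coefficients; in this sketch they would be recomputed each
    /// frame from the latest eye measurements.
    var k1: Float
    var k2: Float

    /// Pre-distorts a normalized render coordinate (-1...1, origin at lens center).
    func apply(_ uv: SIMD2<Float>) -> SIMD2<Float> {
        let r2 = simd_length_squared(uv)
        let scale = 1 + k1 * r2 + k2 * r2 * r2
        return uv * scale
    }

    /// Hypothetical mapping from eye distance (metres) and a diopter-like correction
    /// value to coefficients. The constants are placeholders, not derived optics.
    static func forFrame(eyeDistance: Float, correction: Float) -> PreWarp {
        PreWarp(k1: correction * 0.01 * eyeDistance,
                k2: correction * 0.002 * eyeDistance)
    }
}
```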
Removing the unnecessary glass and mechanical adjustments traditionally needed to account for vision differences will enable a shorter profile in front of the user and a much lighter assembly. The fact that an external power source is expected reinforces this: they're going for small and lightweight. This leads me to believe it won't have a strap, but typical stems that rest on your ears instead.
All of the processing and a short-term battery will reside in the stems to keep the forward frame as streamlined as possible. For context on how this is possible: an Apple Watch's entire System in Package would fit along a single stem, leaving room for batteries in the tips of the stems, the farthest point from the face, for better weight distribution. Multiply that by two stems, each running an M-class chip, and you have a powerful computer on your face. Notably, Apple silicon has been designed to work in modules; an M-series Ultra is simply two Max dies connected to one another.
I'm not confident Apple is going to introduce external cameras, given the "Glasshole" trap Google stepped into. The same spatial awareness required for AR can be accomplished with LiDAR, already proven on iPad and iPhone, which maps the external world as geometry rather than capturing imagery. LiDAR can map hands in real time, doing away with the need for controllers. If they decide to include an external camera for photography and video, it'll be hidden behind mirrored or variable-opacity glass.
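Both building blocks already ship on LiDAR iPhones and iPads: ARKit can stream a live mesh of the room, and the Vision framework can track hand joints without any controller. Here's a sketch under those assumptions; note that today's hand-pose API runs on the camera image, so a truly camera-less headset would have to derive the same joints from depth data instead, which is my assumption rather than anything announced.

```swift
import ARKit
import Vision

/// Sketch: LiDAR scene reconstruction plus controller-free hand tracking,
/// built entirely from shipping iPhone/iPad APIs.
final class SpatialInput: NSObject, ARSessionDelegate {
    let session = ARSession()
    private let handRequest = VNDetectHumanHandPoseRequest()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // LiDAR-equipped devices can stream a live mesh of the surroundings.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Mesh anchors describe the reconstructed geometry of the room.
        let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }
        _ = meshAnchors // hand off to a renderer or physics system here

        // Hand pose from the camera image: 21 joints per detected hand, no controller.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, orientation: .right)
        try? handler.perform([handRequest])
        if let hand = handRequest.results?.first,
           let indexTip = try? hand.recognizedPoint(.indexTip),
           indexTip.confidence > 0.5 {
            // indexTip.location is a normalized image coordinate; a headset would
            // combine it with depth to place the fingertip in 3D.
            _ = indexTip.location
        }
    }
}
```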
Because I don't think Apple is all that invested in full immersion, and Extended/Mixed Reality is the future they're headed towards, these glasses will have variable opacity, from fully opaque to tinted like sunglasses. To enable this, either the displays will have to be transparent OLEDs you can see through when the front glass is clear, or Apple has made advances in retinal projection: projectors shooting light directly onto your retina to produce images.
This technology is going to disrupt opticians for the most common eye tests. Apple is getting into this ahead of Extended Reality glasses that look like standard glasses and use retinal projectors instead of displays. When this first generation morphs into full XR you can wear around in public, every device will be custom made for each user's eyesight. Warby Parker has those logistics figured out; so will Apple. Having your iPhone's TrueDepth system perform the eye test would simplify the process, enabling mass-market scaling.
I've gone into the software (Memoji, ARKit) and real-world use cases in other threads, and those too will have been developed in secret, out in the open, as Apple has done for decades now. New devices are unveiled, yet they feel familiar because we've already seen their components spread out amongst different features and technologies on existing devices, before they're brought together in a single device that makes the experience cohesive.