That's a good point. On the other hand, even normal binocular vision doesn't work very well at the edges of the field of view, so it might not be a showstopper.
Another thing that occurred to me is that the Tech Specs page lists 51–75 mm for interpupillary distance, so if it can be set to something different from a person's actual IPD, that might produce the effect of mild horizontal prism when used in conjunction with a sphere prescription.
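For what it's worth, the amount of prism you'd get from looking through a decentered sphere lens is given by Prentice's rule: prism (in prism dioptres) = decentration (in cm) × lens power (in dioptres). A quick back-of-the-envelope sketch, with purely illustrative numbers of my own choosing:

```python
# Prentice's rule: P = c * F
#   P = induced prism in prism dioptres
#   c = decentration of the line of sight from the lens's optical
#       center, in centimetres
#   F = lens power in dioptres

def prentice_prism(decentration_mm, lens_power_d):
    """Prism (in prism dioptres) induced by viewing off-center."""
    return (decentration_mm / 10.0) * lens_power_d

# Illustrative example (my numbers, not from any spec): a +4 D sphere
# insert with the headset IPD set 5 mm wider than the wearer's true
# IPD, i.e. 2.5 mm of decentration per eye:
per_eye = prentice_prism(2.5, 4.0)
print(per_eye)  # 1.0 prism dioptre per eye
```

So even a few millimetres of deliberate IPD mismatch could plausibly yield the kind of mild prism being discussed, if the optics behave like an ordinary decentered lens.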
I hope one of the hands-on reviews or in-store demos can determine whether that has any chance of working.
Failing that, maybe positioning a window at the wrong distance could help. For divergence it probably can't go past infinite distance, but for convergence the distance could in principle go past zero to negative. The software might blank out the image if it decides a window is too close, though; if so, knowing that cutoff distance would be useful information.
(knowing the optical focal distance that the AVP display uses when there are no optical inserts would also be useful information)
It's been quite a while since I took an optics class, and my scientific understanding of human vision comes entirely from Wikipedia, but mostly out of geek/engineer-curiosity I've been kind of pondering how this works.
I'm assuming all your speculation is about software correction of pupil offset rather than doing it with glass, and it seems like it might work. I should note that my comment about the edges of the field of view, if it's software-corrected, has more to do with essentially letterboxing your vision when you look away from center than with peripheral vision. It would be less jarring if the other eye's display were cropped to match, so that the edge of your field of view (which is no longer the edge when you look away from center along the axis of the offset) isn't different between your two eyes. Looking through a single narrower "window" would presumably be much less jarring than looking through a different window with each eye.
What I'm curious about is whether the eye tracking system operates through the lenses or sits inside them, pointed directly at your eyes. I'd assume the latter, since corrective lenses calibrated to work with an eyeball would inherently blur everything on their far side -- but maybe it's looking at your retina, in which case the tracker's optics would need to match the lens in your eye in order to focus there.
If--big if--it is indeed looking directly at your eyes rather than through the lenses, then this seems very correctable. As long as the lenses make the images look right to you, all you should theoretically need to do is calibrate the eye tracking system so it knows that when the right eye is looking at X,Y,Z coordinates, it's pointed at A,B,C coordinates on the screen. I don't want to say that's easy to correct for, because I'm sure there are many complicated nuances to how it works, but it seems like a fairly straightforward software correction that could be either pre-programmed based on the prescription or manually calibrated by the user ("Look here and tap. Now look here and tap." Repeat 50 times around the field of vision and at various distances).
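That kind of per-point calibration could, in principle, boil down to fitting a simple map from tracker-reported gaze directions to display coordinates. Here's a toy sketch -- the affine-map assumption and every name in it are mine, not anything Apple has documented:

```python
import numpy as np

def fit_gaze_map(gaze_dirs, screen_pts):
    """Least-squares affine fit: screen ~= gaze @ A + b.

    gaze_dirs:  (n, 3) gaze direction vectors from the tracker
    screen_pts: (n, 2) known on-display tap targets ("look here and tap")
    Returns a (4, 2) matrix whose last row is the bias term.
    """
    G = np.asarray(gaze_dirs, dtype=float)
    S = np.asarray(screen_pts, dtype=float)
    X = np.hstack([G, np.ones((len(G), 1))])      # append bias column
    coef, *_ = np.linalg.lstsq(X, S, rcond=None)
    return coef

def apply_gaze_map(coef, gaze_dir):
    """Map one gaze direction to estimated display coordinates."""
    return np.append(np.asarray(gaze_dir, dtype=float), 1.0) @ coef

# Synthetic check: invent a "true" distortion (e.g. what a prescription
# lens might induce), generate calibration pairs, and recover it.
rng = np.random.default_rng(0)
true_map = np.array([[900.0,   0.0],
                     [  0.0, 900.0],
                     [ 40.0, -25.0],
                     [960.0, 540.0]])
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.hstack([dirs, np.ones((50, 1))]) @ true_map
coef = fit_gaze_map(dirs, pts)
```

A real system would surely use a fancier model than an affine map (lens distortion isn't linear), but the calibrate-then-remap structure would be the same.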
An interesting aside: 51–75 mm IPD is much wider than what most VR headsets offer; the Quest 3 only covers 58–70 mm. That's a significant difference. According to the US military statistics cited in Wikipedia, the Vision Pro's range should exclude less than 0.1% of the population at either end, probably less. For the Quest 3, in contrast, something like 10% of women and 4% of men are below 58 mm, and 5% of men and over 1% of women are above 70 mm, so overall right around 10% of the population is "out of spec".
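To put rough numbers on that, here's a back-of-the-envelope check assuming IPD is normally distributed. The means and standard deviations below are illustrative values in the ballpark of the military anthropometric figures -- my assumptions, not exact citations:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def frac_outside(mean, sd, lo, hi):
    """Fractions of a normal population below lo and above hi (in mm)."""
    below = phi((lo - mean) / sd)
    above = 1.0 - phi((hi - mean) / sd)
    return below, above

# Illustrative IPD distributions (assumed, roughly ANSUR-like):
groups = {"men": (64.0, 3.6), "women": (61.7, 3.0)}  # mean, sd in mm

for label, (m, s) in groups.items():
    b_q, a_q = frac_outside(m, s, 58, 70)  # Quest 3 range
    b_v, a_v = frac_outside(m, s, 51, 75)  # Vision Pro range
    print(f"{label}: Quest 3 excludes ~{100 * (b_q + a_q):.1f}%, "
          f"Vision Pro excludes ~{100 * (b_v + a_v):.2f}%")
```

With those assumed parameters, the Quest 3 range excludes several percent of each group while the Vision Pro range excludes a tiny fraction of a percent, which is at least consistent with the figures quoted above.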
Perhaps the way the Vision Pro works makes it easier to compensate than other headsets, or maybe Apple is just using some of that huge price tag to cover a bigger fraction of the population.