
one more

macrumors 603
Aug 6, 2015
5,159
6,577
Earth
Do you actually know how current VR headsets work? These headsets have been around for ages. Your eyes focus further away in a VR headset, so much so that if you need prescription glasses for myopia, you will need prescription lenses for the VR headset.

I have no idea how these headsets work, as I have never tried one, but I have a basic understanding of how my eyes work, so am just pondering out loud. ;)
 

Abazigal

Contributor
Jul 18, 2011
20,399
23,909
Singapore
In the keynote, they introduced a new feature for the next iPadOS which tells you when your eyes are too close to the screen, because of the impact on eye health. And then almost immediately introduced the apple vision pro where you put screens a few centimetres from your eyes.

Why is no one talking about this? Or have I missed it?

VR headsets have been out for some time now. If they weren’t an issue then, I don’t see why they would be an issue now.
 

Absrnd

macrumors 6502a
Apr 15, 2010
915
1,671
Flatland
I have no idea how these headsets work, as I have never tried one, but I have a basic understanding of how my eyes work, so am just pondering out loud. ;)
That's OK.
That is the great thing about AR and VR headsets: your eyes focus exactly the same as when looking into the distance, as you normally would, and there is no strain on the eye muscles in any way.

With an iPhone/iPad close up, both eyes have to strain a bit to converge on the same spot (going cross-eyed), but with two screens each eye looks at a separate display and reacts as if seeing into the distance, without going cross-eyed.
 

Luna Murasaki

macrumors regular
Jun 24, 2020
121
287
Purple Hell
I think looking at close objects all day is certainly going to have a permanent effect, especially when you're a child and your eyes are still developing, but that would be because you are looking close at something, not because that something is a screen. You could spend all day as a child looking up close at printed stuff like comic books and end up nearsighted for the same reason. It's probably where the stereotype of the glasses-wearing nerd comes from in the first place - a stereotype that predates widespread computer use.

Given that the VR headsets encourage your eyes to focus on non-existent objects that are far away, I'd imagine spending all day in a VR headset is considerably less likely to cause distance-related eyestrain and nearsightedness than spending all day glued to an iPad screen. I also think an iPad, in turn, is less likely to cause this than an iPhone as you are less tempted to look up close at larger screens. That said, I look forward to a study 20 years from now that examines the differences in vision between adults who grew up wearing VR headsets and adults that didn't. There will probably be differences - not necessarily bad ones.

I myself am nearsighted probably because I was an extreme shut-in growing up spending all day on my parents' MS-DOS computer. I don't see this as damage, I see this as my eyes developing to be optimized for what I use them for. I am thinking of getting a second pair of glasses that does not correct my nearsightedness so I can sit at my computer for long periods more comfortably. I'm certainly grateful for the protection I will have against age-related far-sightedness when I am older.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Think of it this way... do people talk about eye health in relation to screen time when it comes to smartphones, tablets, watches, TVs, desktops, laptops, etc.?

How about reading on paper?

People who can afford eye healthcare will be the ones buying a $3,499 device. The bottom 99.9% will not be impacted until prices drop to $429–$1,599. When that occurs, the top 20% will be impacted by it.
 

Alex Cai

macrumors 6502
Jun 21, 2021
431
387
How convex lenses work
[Attached screenshot: convex lens diagram]
Vision Pro strains your eyes about as much as looking at a 75″ OLED TV from your couch.
Your eyes rest at a focal distance far past 3 cm, around 3 m away.
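
A quick sketch with the thin-lens equation shows how this works. The numbers below are illustrative assumptions (Apple hasn't published the Vision Pro optics), but any display placed just inside the focal length of a convex lens produces a virtual image metres away:

Code:
# Thin-lens equation: 1/f = 1/d_o + 1/d_i
# A display just inside the focal length of a convex lens produces a
# virtual image far behind the display. All values here are assumptions
# for illustration, not Apple's actual optics.

f = 4.1    # lens focal length, cm (assumed)
d_o = 4.0  # display-to-lens distance, cm (assumed)

d_i = 1 / (1 / f - 1 / d_o)  # negative result = virtual image
print(f"virtual image {abs(d_i):.0f} cm away")  # ~164 cm, i.e. ~1.6 m

So even though the panel sits a few centimetres from your eye, the light reaching your retina is shaped as if it came from an object a metre or two away, which is why the eye relaxes as it would looking into the distance.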
 

zapmymac

macrumors 6502a
Aug 24, 2016
941
1,098
SoCal ☀️
Tens of millions of people use VR headsets. I have not heard extensive complaints about eyestrain in any of the VR related forums I've frequented.

This is stupid. The only thing that holding it off-axis does is reduce the light reaching your eyes by a small amount, because most displays have some amount of falloff in brightness off-axis. If you want that, you're better off just turning down the brightness of the display.
It's not stupid according to my USA board-certified eye doctor. Tell me, what does your eye doctor say when you ask?
 
  • Like
Reactions: airbusking

Lounge vibes 05

macrumors 68040
May 30, 2016
3,885
11,153
I do have this concern. With constant eye tracking, it may not be good to use it continuously.
If you are seriously concerned about this (and you shouldn’t be, but if you are) Apple did announce that you can turn eye tracking off as an accessibility feature for the headset.
Then it just uses hand tracking to control it
 

dasmb

macrumors 6502
Jul 12, 2007
422
462
No, the focal distance is probably about 5' to 10' or more away. The displays are going to be an inch or so away from your eyes. Hold something that close to your eyes and you will see that it is impossible to focus on it.
Great, so I'll need progressive lenses even as a cyborg.
 

johnsterdam

macrumors member
Original poster
May 2, 2021
38
61
How convex lenses work
[Attached screenshot: convex lens diagram]
Vision Pro strains your eyes about as much as looking at a 75″ OLED TV from your couch.
Your eyes rest at a focal distance far past 3 cm, around 3 m away.
From this, and others' replies, it sounds like it's not as bad as I thought - that counterintuitively, you're not looking at (or at least focusing on) something 'right in front of your eyes'. I still find it surprising that Apple don't talk about this at all, given their increasing focus on health.
 

NT1440

macrumors Pentium
May 18, 2008
15,094
22,161
From this, and others' replies, it sounds like it's not as bad as I thought - that counterintuitively, you're not looking at (or at least focusing on) something 'right in front of your eyes'. I still find it surprising that Apple don't talk about this at all, given their increasing focus on health.
Honestly I bet they wanted to, but in a 2+ hour presentation some things end up getting cut.
 
  • Like
Reactions: Tagbert

rumz

macrumors 65816
Feb 11, 2006
1,226
635
Utah
It shouldn't be on them to convince you. Apple hasn't just invented the VR headset. This tech has been around for ages and you could look up any VR headset to understand this. The onus is on you to look up the very easily accessible info.
It is 100% on Apple to convince people to spend their money on their devices ;) It probably won't be many people until the price comes down to a consumer level (if it does).

I am curious about how it works though-- I initially wondered if people who need reading glasses (presbyopia?) would be able to use this at all (let alone with special lenses -- since reading glasses aren't necessarily prescription but generally just a certain strength or magnification). (Yes. I've fallen into that category in the last year. Woke up one day and couldn't read my Apple Watch without moving it a little further from my eyes than usual.)
 

Jensend

macrumors 65816
Dec 19, 2008
1,454
1,667
If you are seriously concerned about this (and you shouldn’t be, but if you are) Apple did announce that you can turn eye tracking off as an accessibility feature for the headset.
Then it just uses hand tracking to control it
The eye tracking is used for more than just pointing. It's used to render your eyes on the outward-facing display and in your “persona”. But more importantly, it is used for foveated rendering: the image is rendered at higher quality where your eyes are looking. It may also be used to correct distortion from the lenses: as you rotate your eyes, your pupils move, which puts them in a different place relative to the lenses of the device, which will affect the shape of the image that is projected into your eyes.
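
To make "foveated rendering" concrete, here's a minimal sketch (illustrative only, not Apple's renderer; the thresholds and scale factors are invented): each tile of the frame gets a resolution scale based on its angular distance from the gaze point.

Code:
import math

# Illustrative foveated-rendering policy: full resolution near the gaze
# point, progressively lower resolution with angular eccentricity.
# Thresholds and scales are invented for this example; real systems
# tune them to the optics and the eye tracker's latency and accuracy.

def resolution_scale(tile_center, gaze, degrees_per_pixel=0.02):
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    eccentricity = math.hypot(dx, dy) * degrees_per_pixel  # degrees
    if eccentricity < 5:     # fovea: full detail
        return 1.0
    if eccentricity < 20:    # near periphery: half resolution
        return 0.5
    return 0.25              # far periphery: quarter resolution

print(resolution_scale((960, 540), (1000, 500)))  # near gaze -> 1.0
print(resolution_scale((0, 0), (1000, 500)))      # far periphery -> 0.25

The payoff is that the GPU only renders full detail where the fovea is actually pointed, which is why accurate, low-latency eye tracking matters so much more here than for pointing alone.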
 
  • Like
Reactions: Tagbert

0339327

Cancelled
Jun 14, 2007
634
1,936
In the keynote, they introduced a new feature for the next iPadOS which tells you when your eyes are too close to the screen, because of the impact on eye health. And then almost immediately introduced the apple vision pro where you put screens a few centimetres from your eyes.

Why is no one talking about this? Or have I missed it?

Dan from MR did a very in-depth review. Apparently there are glass lenses that make the image appear further away, allowing the eyes to focus properly.
 

teh_hunterer

macrumors 65816
Jul 1, 2021
1,231
1,673
It is 100% on Apple to convince people to spend their money on their devices ;)

That's 100% irrelevant to what I said. If someone isn't "convinced" by a correct explanation about how focal distance works with VR headsets, VR headsets have been out long enough for the onus to be on them to take 1 or 2 minutes to research it.
 

Zest28

macrumors 68030
Jul 11, 2022
2,589
3,954
In the keynote, they introduced a new feature for the next iPadOS which tells you when your eyes are too close to the screen, because of the impact on eye health. And then almost immediately introduced the apple vision pro where you put screens a few centimetres from your eyes.

Why is no one talking about this? Or have I missed it?

That's why Apple restricted the battery life to only 2 hours :p
 

Lioness~

macrumors 68040
Apr 26, 2017
3,408
4,247
I don't mind waiting at all to see how this evolves, especially for the eyes.
I use my built-in long-distance vision anyway ☺️
Then we'll see what teething problems crop up, and how the eyes react after a lot of use.
Others can be the test pilots for this; that suits me just fine.
Then I'll see IF I will be interested in it at all at a later point.
Not all Apple products are for everyone. For now, it seems far too big and bulky to me, but in time it might get a better shape. An Apple Vision Mini might be more interesting.
 

Analog Kid

macrumors G3
Mar 4, 2003
9,364
12,621
Physically, our eyes do two things when we focus at different distances:
  • They use muscles to reshape their biological lenses to optically focus the light from a point at a certain distance onto our retinas-- this is how we make images sharp and clear at different distances and it's the part that glasses and contact lenses help us correct when our eyes are imperfect.
  • They pivot relative to each other so both eyes are looking at the same thing; this is the dominant method (among others) that our brain uses to figure out how far away something is. (Our brain is heavily involved here though, so it's more than just simple triangulation. We also use scale, lighting, parallax and other cues to infer distance, which is why we can get a sense of 3D space in a 2D image.)
The cameras capturing live action 3D movies have to make similar choices-- they're spaced apart a certain amount to give a sense of depth, but they have to choose a focal point which is why even though you can look around a 3D movie and get a sense of depth, only part of it is in sharp focus.

The independent displays to each eye give the sense of depth by getting our eyes to pivot relative to each other. But that display is at one apparent distance, and that's where our eyes will focus their lenses.
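
To put numbers on that mismatch (the vergence-accommodation conflict): vergence depends only on where the virtual object appears, while accommodation stays locked to the headset's fixed focal plane. A small sketch, assuming a typical 63 mm interpupillary distance and an assumed 1.5 m focal plane:

Code:
import math

# Vergence-accommodation conflict with assumed numbers: the eyes
# converge on the virtual object's apparent distance, while the lens
# must stay accommodated to the headset's fixed focal plane.

IPD = 0.063        # interpupillary distance, metres (typical adult)
FOCAL_PLANE = 1.5  # headset's apparent focal distance, metres (assumed)

def vergence_deg(distance_m):
    # Angle between the two eyes' lines of sight for a target there.
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

for d in (0.5, 1.5, 10.0):
    print(f"object at {d:>4} m: converge {vergence_deg(d):.2f} deg, "
          f"accommodate to {FOCAL_PLANE} m")

Only the row matching the focal plane is conflict-free; everywhere else the eyes aim at one distance while focusing at another, which is exactly the mismatch described above.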

I say apparent distance because the optics between the eyes and the display change the optical light field and put the focal point further away than the actual display. We're not focusing an inch in front of our eyes even if that's where the display is, because the optics extend the focal distance to some further point. This is why, even if you're nearsighted (see well close but poorly far away), you need corrective lenses in AVP to see displays that are physically close to your eyes-- the optics in between changed the focus.

The focus optics are a 3D phenomenon, so there's no way to adjust focus by adjusting the image on the display itself. There's been some work done on light field imaging that captures vector light fields so you can refocus a captured image later (or extend the depth of field)-- presumably a similar technique could be used at the display to change the apparent point of focus, but those trade pixel density to get that effect, and the AVP displays are already remarkably dense to get retina resolution at that distance. This is why some folks are looking at raster lasers that scan the image directly onto your retina, because there's no focal point in that case-- though that just makes me uncomfortable, for probably irrational reasons.

Unless the optics in the AVP are motorized, there's no real way to change the focus point for each eye. I'd guess they picked a neutral distance to minimize the amount of muscle strain needed to focus there-- it will be different from looking at a display close to your face, where you have to constantly exercise your eye muscles just to stay focused.

So, I'm not an ophthalmologist, but all the above leads me to guess that there won't be eye strain in the same way we have it reading too close to our faces. What I don't know is whether the brain is sensitive to the fact that your eyes are focusing at one distance and triangulating to another. It wouldn't surprise me if it is-- we use all sorts of hints in the brain to measure our surroundings. Some of these things our brain just seems to quickly adapt to; some of them lead to distress that takes the form of nausea or disorientation.

And since I've already written too much, I'll throw this in because I think it's an interesting theory about why we get nauseous in these situations: our brains are highly evolved for the natural world and know how to expect it to behave. When we sit in the back of a car reading a book, or a headset lags our head motion, our eyes and the fluid in our ears disagree about how we're moving-- but we didn't evolve for cars or head-tracking displays. What in prehistoric nature would lead to mismatched sensor inputs? Neurotoxins. So our body's response is to make itself puke up the poison.
 

Spaceboi Scaphandre

macrumors 68040
Jun 8, 2022
3,414
8,107
In the keynote, they introduced a new feature for the next iPadOS which tells you when your eyes are too close to the screen, because of the impact on eye health. And then almost immediately introduced the apple vision pro where you put screens a few centimetres from your eyes.

Why is no one talking about this? Or have I missed it?

Because it's a non-issue. There's a misconception that VR will damage your vision because it uses stereoscopic imaging to create a 3D effect without the need for 3D glasses, popularized by the Nintendo 3DS, which later laid the groundwork for VR as we know it today. The thing is, it does damage your vision...if you are under the age of seven. There's a reason the Nintendo 3DS always had a health warning on the box saying that 3D mode should not be used by children six or younger.

[Attached image: Nintendo 3DS box with the 3D health warning]


So the only effect on your eyes you'll have wearing the Vision Pro is the same as if you were staring at your phone's screen. I've been using VR regularly for 3 years, and my vision's still fine. Hell, better than fine actually, as playing in VR has helped me with depth perception and reaction time.

 

Shhhh

Suspended
Jun 10, 2023
22
29
Staring at computer screens, TVs, phones, and anything else is bad for eye health. Why not add one more device to the mix?

 

LIVEFRMNYC

macrumors G3
Oct 27, 2009
8,881
10,990
I've been using VR headsets for about a decade now, never had any issues, and my sight is still 20/20.
 

kkee

macrumors 6502a
May 29, 2023
525
665
Sydney
The high-performance eye tracking system in Apple Vision Pro uses high-speed cameras and a ring of LEDs that project invisible light patterns onto the user’s eyes for responsive, intuitive input. These groundbreaking innovations are powered by Apple silicon in a unique dual-chip design. M2 delivers unparalleled standalone performance, while the brand-new R1 chip processes input from 12 cameras, five sensors, and six microphones to ensure that content feels like it is appearing right in front of the user’s eyes, in real time.

R1 streams new images to the displays within 12 milliseconds — 8x faster than the blink of an eye. When things appear in real time, your senses stay in tune and you don't get that slight feeling of nausea. Apple Vision Pro is designed for all-day use when plugged in.
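
The "8x" figure checks out if you take a blink to last on the order of 100 ms (typical estimates run roughly 100–400 ms):

Code:
# Sanity check on the "8x faster than a blink" claim.
# A blink is commonly estimated at ~100-400 ms; Apple quotes 12 ms.
blink_ms = 100    # assumed lower-end blink duration
latency_ms = 12   # Apple's quoted display latency
print(f"{blink_ms / latency_ms:.1f}x faster")  # ~8.3x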
 

dante_mr

macrumors regular
Jun 13, 2023
146
190
Hmm, are you sure about that? I'm not convinced. Surely your eyes are focused on the screens in front of them. If you look at a picture of a mountain on your laptop, your eyes are focused on the laptop screen, not the hypothetical mountain. And the same for the Vision Pro - it's not real passthrough - you're just seeing a display of what the cameras at the front are showing.
Yes, but the Vision Pro has lenses that are at a given focal length.

It's the same reason you need prescription lenses to use the headset (if you have bad vision), because you're using your eyes the same way you use your eyes to see something far away.

For example, the Quest 2 has a focal length of about 1.5 meters, so it's exactly as if you were looking at an object 1.5 meters away. That's why people with severe myopia need to use their glasses.

Whereas with a laptop or phone, your eyes focus at however far away the screen actually is from your face.
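
One way to see why severe myopia still needs correction (a sketch with assumed numbers; the 1.5 m focal plane is the Quest 2 estimate above, not a Vision Pro spec): convert distances to diopters and compare the focal plane against the wearer's far point.

Code:
# An uncorrected myope's far point (farthest sharp distance) is
# 1 / |prescription in diopters| metres. If the headset's focal plane
# lies beyond that far point, it will be blurry without correction.
# Numbers are illustrative; the 1.5 m figure is the Quest 2 estimate.

prescription_d = -3.00                 # example spherical correction
far_point_m = 1 / abs(prescription_d)  # ~0.33 m
focal_plane_m = 1.5                    # apparent focal distance (assumed)

needs_correction = focal_plane_m > far_point_m
print(f"far point {far_point_m:.2f} m, focal plane {focal_plane_m} m, "
      f"needs correction: {needs_correction}")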
 