
Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
Would an M2 iMac 24" change the value proposition? Putting an M2 into the 24" iMac, and possibly into a 27" option as well, might make it interesting.

The M2 brings a CPU spec bump, a significantly faster GPU, and more video encode/decode options. It’s a better gaming machine for the OP’s money, and it’s a year further along the software curve, so you get longer support. My M1 iMac shipped with Big Sur, it’s had a year of Monterey and is now on Ventura.

I get that people are tempted by the refurb 27” Intel iMac on price / screen area, but for me it’s a big no: you’re looking at 2-3 years of software support at the most before the lights start going out. It is not worth it in the long term, unless you need Windows and Boot Camp.
 
  • Like
Reactions: mr_jomo and maflynn

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
I am still impressed that my 2011 27” iMac is still a great machine. My basic computer needs are still being met with this machine. With everyone’s input, the potential new Mini sounds interesting.

A 2011 iMac is getting long in the tooth though. You can run what, High Sierra without resorting to patchers? You’ll notice the difference in aesthetics and software features when you finally do upgrade to an up-to-date OS, especially if you also have iPhones. An M1 machine is also very much quicker by comparison… no more making a cup of tea while it boots; it takes 5 seconds tops if you leave the software as factory installed. Whatever option you go for, I think you’ll be pleasantly surprised once you get it home.
 

wilberforce

macrumors 68030
Aug 15, 2020
2,930
3,207
SF Bay Area
... you’re looking at 2-3 years of software support at the most before the lights start going out.
I'm not so sure it will have 2-3 years software support "at the most." Apple are still selling brand new Intel machines right now, and Apple has a history (and policy) of providing support for a minimum of 5 years since a computer was last sold (not first sold) by Apple.
Sure, one may not get the latest software features, but that applies for any product. Apple has a history of providing at least two MacOS major (yearly) version upgrades for each product, and supporting each version of MacOS for 3 years.

Of course, this may change, but for someone buying a new Intel Mac mini or a new Intel Mac Pro today, they would be pretty annoyed to find the "lights start going out" on their brand new machines in 2 years.

However, your point is taken that there is a higher risk of support being dropped earlier on an Intel machine.

We shall see!
 
Last edited:

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
Apple are still selling brand new Intel machines right now, and Apple has a history (and policy) of providing support for a minimum of 5 years since a computer was last sold (not first sold) by Apple.

I would have agreed with that not so long ago, but with macOS Ventura Apple cut pretty deeply into the selection of Intel Macs that they still support. Another year of that, and we will find Intel support ending rather sooner than people might have expected. Of course, that is another attempt to move software developers over from building applications for Intel to building them for Apple Silicon.


As a proud owner of a 2011 edition iMac, I am sure the OP is well aware of how Apple support for older OSs works ;)
 
  • Like
Reactions: wilberforce

sublunar

macrumors 68020
Jun 23, 2007
2,311
1,680
I'm not so sure it will have 2-3 years software support "at the most." Apple are still selling brand new Intel machines right now, and Apple has a history (and policy) of providing support for a minimum of 5 years since a computer was last sold (not first sold) by Apple.
Sure, one may not get the latest software features, but that applies for any product. Apple has a history of providing at least two MacOS major (yearly) version upgrades for each product, and supporting each version of MacOS for 3 years.

Of course, this may change, but for someone buying a new Intel Mac mini or a new Intel Mac Pro today, they would be pretty annoyed to find the "lights start going out" on their brand new machines in 2 years.

However, your point is taken that there is a higher risk of support being dropped earlier on an Intel machine.

We shall see!
For me the clock started on Intel iMacs when the 27" was discontinued in 2021. I believe the 2020 models will have until 2026, and then another 2 years of security updates, if Apple continue software support until the model range itself is marked as vintage (which takes 5 years).

This is not completely set in stone, as there are exceptions in Apple's support history, but any model getting short-changed has typically been on sale for an exceptionally long time (e.g. the 2014 Mini or the 2012 non-Retina MacBook Pro), and there may genuinely be hardware reasons for dropping software support.

Having said that, at least Intel Macs like these should keep Boot Camp as a route to Windows 11 going forward - you only need to worry about having a Coffee Lake or later machine for full support there.
 
  • Like
Reactions: wilberforce

jtrainor56

macrumors regular
Oct 23, 2010
122
10
Ephrata, Pennsylvania
I am looking at this monitor for my Studio purchase. I am looking at a higher-end configuration to replace my 2012 27” i7. Its 3TB Fusion Drive has gotten really tired; I had to wipe the drive and now only run Catalina stripped down to a basic install. Right now the monitor is $419 on B&H Photo; after the 31st it goes for $580. I'm trying to hold off on a Studio purchase for a couple more months, since I configured it at just under $4000 with a new keyboard and trackpad.

LG 27BN85UN-B 27" 4K HDR Monitor​

 

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
300 mm is 59% of 20", so those correspond to being able to distinguish 200 ppi vs. 300 ppi, and 300 ppi vs 600 ppi, at 20", respectively (same angular resolutions). And for those like me who often lean in closer, to a 15" viewing distance, it's 270 ppi vs. 400 ppi, and 400 ppi vs. 800 ppi.
I don't doubt that - although it's ridiculously hard to test it properly and objectively (the article you linked to is paywalled, so I can't see the detail but unless it's double-blind with a proper control and a clear explanation of what they meant by "simulated" or "discriminate" then forget it - and 50 subjects is a bit small for that).

As I said, the "retina" theory itself is a massive hand-wave and certainly isn't a magic threshold beyond which the display becomes "perfect". Since optical resolution is usually measured as the ability to separate two objects separated by distance x you could easily argue that you therefore need 1/x separate dots or lines per inch and that corresponds to 2/x pixels per inch - and, of course, in reality the viewing distance is as long as a piece of string.

I think the best interpretation of "retina"/"the 1 arc minute rule" is as a point beyond which you'll get rapidly diminishing returns from (expensively) increasing the resolution further.
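
If anyone wants to put rough numbers on that, here's a back-of-the-envelope sketch in Python - my own arithmetic, not from the study - that converts an angular cutoff into a ppi figure for a given viewing distance. Double the ppi figures if you prefer the "two pixels per resolvable line pair" reading mentioned above:

import math

def threshold_ppi(viewing_distance_in, arcmin=1.0):
    # ppi at which one pixel subtends `arcmin` arc-minutes at this distance
    pixel_size_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_size_in

for d in (12, 15, 20):  # phone-ish, lean-in and desktop distances, in inches
    print(f'{d}": ~{threshold_ppi(d):.0f} ppi at 1 arcmin, '
          f'~{threshold_ppi(d, 0.7):.0f} ppi at 0.7 arcmin')
# 12": ~286 / ~409 ppi, 15": ~229 / ~327 ppi, 20": ~172 / ~246 ppi

Treat the exact figures with a pinch of salt, for all the reasons above - the point is how strongly the answer depends on the distance you assume.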

I can certainly see that the display on my old 5k iMac display is somehow "crisper" than my 4k+ display - but then the brightness, contrast and gamut and screen texture are all completely different, too. Apple's screen coating really whacks up the contrast c.f. the Mateview's matte finish in ways that have nothing to do with PPI or "retina". That doesn't mean I'm conscious of pixellation on the 4k display. The question is whether it's worth paying 3-4 times the price for that extra bit of quality. Unless you're a YouTuber you don't buy displays to do A/B comparisons with competing models.

Aside: on the sub-pixel anti-aliasing thing - that only works if the OS knows the actual layout of the RGB sub-pixels on the display and can reliably control them by tweaking the colour of pixels. There are 101 things that can break that - I had an old display that older MacOS versions would mis-identify as YPbPr rather than RGB which only really mattered because it broke SPAA and gave all the text a rainbow halo. OLED and other displays have different sub-pixel (not even RGB) layouts and I guess any sort of colour correction in the monitor, compression (AirPlay, DisplayLink devices, maybe even DP 1.4 DSC), scaling or local dimming could mess up SPAA. The only systems that Apple could ever guarantee that SPAA would work on would be Apple Displays - but all of Apple's supported Macs, iDevices and displays now have "retina" screens that don't need SPAA, the timing of which seems to correspond with dropping SPAA from MacOS. Sure, Apple could have left the option in with a 'if it breaks, tough' caveat - but, hey, this is Apple we're talking about.
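
(If it helps to see what SPAA actually does, here's a toy Python sketch of the core idea - my own illustration, assuming a plain left-to-right RGB stripe layout and skipping the filtering that real implementations apply to tame colour fringing:)

def subpixel_aa_row(coverage_3x):
    # coverage_3x: greyscale glyph coverage sampled at 3x the horizontal pixel
    # resolution; each output pixel's R, G, B channels carry their own sample,
    # giving triple the horizontal detail of plain greyscale anti-aliasing.
    assert len(coverage_3x) % 3 == 0
    return [tuple(coverage_3x[i:i + 3]) for i in range(0, len(coverage_3x), 3)]

# a hard glyph edge landing two-thirds of the way through the middle pixel:
print(subpixel_aa_row([1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]))
# -> [(1.0, 1.0, 1.0), (1.0, 1.0, 0.0), (0.0, 0.0, 0.0)]
# the middle pixel's blue stripe goes dark first - which is exactly why BGR
# panels (or anything that scrambles the assumed layout) get colour halos.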
 

i486dx2-66

macrumors 6502
Feb 25, 2013
373
417
Aside: on the sub-pixel anti-aliasing thing - that only works if the OS knows the actual layout of the RGB sub-pixels on the display and can reliably control them by tweaking the colour of pixels.
Like how people were rotating their monitors 180 degrees to fix broken sub-pixel antialiasing for BGR monitors? 🙃
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
I don't doubt that - although it's ridiculously hard to test it properly and objectively (the article you linked to is paywalled, so I can't see the detail but unless it's double-blind with a proper control and a clear explanation of what they meant by "simulated" or "discriminate" then forget it - and 50 subjects is a bit small for that).
Sorry about the paywall. The test was double-blind. Since this is a simple binary test—'do you see the difference or do you not?'—50 subjects are enough to get good statistical significance if the results are sufficiently clear-cut. The real issues are:

(a) It's just one study. You need more, done independently.
(b) The simulated images, as you mentioned. They don't have 1000 ppi displays, so they simulated them by creating images at the needed ppi values on photographic film that had an underlying resolution of 3000 ppi, and backlit them to the required luminance to simulate a display. They spent much of the paper showing that the simulated images were valid substitutes, including measuring the MTF (modulation transfer function) of the illumination vertically and horizontally across actual screen pixels vs the pixels in their backlit photographic film.

However, they could have directly tested whether the simulated images are an adequate substitute for real displays by comparing results from, say, 160 ppi vs. 220 ppi simulated images with results from actual 160 ppi vs 220 ppi displays, which they didn't do.
As I said, the "retina" theory itself is a massive hand-wave and certainly isn't a magic threshold beyond which the display becomes "perfect". Since optical resolution is usually measured as the ability to separate two objects separated by distance x you could easily argue that you therefore need 1/x separate dots or lines per inch and that corresponds to 2/x pixels per inch - and, of course, in reality the viewing distance is as long as a piece of string.
I think most understand that the cutoff isn't a magic threshold, and that it should instead be considered more as a person-dependent cutoff band. I'm thus inclined to criticize the 1 arc min cutoff on a different basis—not that it's hand-waving, but that it's probably wrong.

I think the most directly damning piece of evidence against the 1 arc min cutoff, which is based on 20/20 acuity, was this chart they presented of the acuity of their subjects, which they further say is "in line with expected values for such an age distribution". So right off the bat you can see that, if you do want to use that approach, you should be using 0.75 arc min (corresponding to 20/15 vision)—or 0.65 arc min, since many of those in the 20/15 category may actually be capable of seeing 20/13 (like me), which they didn't test for:

[Attached chart: distribution of the subjects' visual acuity]


The authors add:

"However, recent studies show that foveal resolution, although varied amongst individuals, can regularly be 2–2.3 µm equivalent to 0.5 arcmin". For this, they cite the following (which I haven't read): E. Rossi, “Relationship between visual resolution and cone spacing,” Nat. Neurosci.12, No. 2, 156–157 (2010).

And also:

"An additional and less widely used metric for the human visual system is called minimum discriminable acuity (Vernier or hyperacuity). Vernier acuity is defined as the ability to determine the alignment of two line segments and is the functional mechanism by which we can read a Vernier scale and may be 5–10 times greater than visual acuity." For this, they cite: G. Westheimer, “Visual acuity and hyperacuity,” Invest. Ophthalmol. Vis. Sci. 14, No. 8, 570–572 (1975).

So there's a lot here that's still not understood.
I think the best interpretation of "retina"/"the 1 arc minute rule" is as a point beyond which you'll get rapidly diminishing returns from (expensively) increasing the resolution further.
I don't think you can interpret it that way, since we don't yet know where the "knee" is. It could be that, for most people, 0.7 arc min is the point beyond which you'll get rapidly diminishing returns. And of course each individual needs to decide for themselves.
I can certainly see that the display on my old 5k iMac display is somehow "crisper" than my 4k+ display - but then the brightness, contrast and gamut and screen texture are all completely different, too. Apple's screen coating really whacks up the contrast c.f. the Mateview's matte finish in ways that have nothing to do with PPI or "retina". That doesn't mean I'm conscious of pixellation on the 4k display. The question is whether it's worth paying 3-4 times the price for that extra bit of quality. Unless you're a YouTuber you don't buy displays to do A/B comparisons with competing models. [emphasis mine.]
Calling it merely an "extra bit" is begging the question of whether it really is just an extra bit. For me, it isn't. As soon as I upgraded to Mojave on my 4k 27", I was immediately struck by how much I didn't like how text was displayed, and I couldn't understand why it was worse until I later learned about subpixel AA. So you could call that an anecdotal blind test. I had no anticipation that there would be a difference, yet I noticed it—clearly.
Aside: on the sub-pixel anti-aliasing thing - that only works if the OS knows the actual layout of the RGB sub-pixels on the display and can reliably control them by tweaking the colour of pixels. There are 101 things that can break that - I had an old display that older MacOS versions would mis-identify as YPbPr rather than RGB which only really mattered because it broke SPAA and gave all the text a rainbow halo. OLED and other displays have different sub-pixel (not even RGB) layouts and I guess any sort of colour correction in the monitor, compression (AirPlay, DisplayLink devices, maybe even DP 1.4 DSC), scaling or local dimming could mess up SPAA. The only systems that Apple could ever guarantee that SPAA would work on would be Apple Displays - but all of Apple's supported Macs, iDevices and displays now have "retina" screens that don't need SPAA, the timing of which seems to correspond with dropping SPAA from MacOS. Sure, Apple could have left the option in with a 'if it breaks, tough' caveat - but, hey, this is Apple we're talking about.
It's that plus at least one other thing: It also doesn't work if you're using transparency. Some say that's part of the reason they got rid of it. But I think there are answers to both the RGB and transparency issues.

For transparency, just have subpixel AA be turned on only when you select "Increase Contrast" in Accessibility, since that defeats transparency anyway. It would make sense that those who choose that option (like I do) are those most interested in the sharpest possible text, and thus would welcome the subpixel AA.

And for RGB, the displays do send info back to the Mac. If that info includes the subpixel structure, then the Mac could choose to implement it based on that. If not, the Mac could compare the display identity against a database to determine whether it has an RGB structure, and activate accordingly. If it's not found in the database, leave it off, but give more sophisticated users the option to activate it manually with a Terminal command if they can confirm the structure themselves, which is SOP for many options on the Mac.
 
Last edited:

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
That said - on a 27" screen 5k is better than 4k, but it's a lot of money for a small improvement. I think people who describe 4k as unusably blurry are being a little bit precious - but it's their money.
I agree that many will do fine with a 4K 27", but I would quibble with describing 5K as only a small improvement. Apple's 5K res is 5120x2880 = 14.75 MP, while the standard "4K" resolution is 3840x2160 = 8.29 MP. 1.78x as many pixels is substantial, IMO.
 

wilberforce

macrumors 68030
Aug 15, 2020
2,930
3,207
SF Bay Area
I agree that many will do fine with a 4K 27", but I would quibble with describing 5K as only a small improvement. Apple's 5K res is 5120x2880 = 14.75 MP, while the standard "4K" resolution is 3840x2160 = 8.29 MP. 1.78x as many pixels is substantial, IMO.
Yes, I think it odd that people compare camera sensors by the total number of pixels, but compare display panels by the number of pixels along one edge.
5K vs 4K display is roughly equivalent to a 21 MP camera vs a 12 MP camera (same ratio of pixels), which many people would consider a significant step up.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Yes, I think it odd that people compare camera sensors by the total number of pixels, but compare display panels by the number of pixels along one edge.
5K vs 4K display is roughly equivalent to a 21 MP camera vs a 12 MP camera (same ratio of pixels), which many people would consider a significant step up.
For displays, I personally prefer using the horizontal (1D) number of pixels, because (for the same size and aspect ratio) that's what the resolution (by which I mean the ppi) is proportional to.

For example, comparing 4k (3840 × 2160) and 5k (5120 × 2880) 27" 16:9 displays, the 5k has 5120/3840 = 2880/2160 = 1.33x the resolution.

It's only because displays are 2D that you need (5120*2880)/(3840*2160) = 14.7 MP/8.29 MP = 1.77x the number of pixels (i.e., 1.33^2) to achieve a 1.33-fold increase in resolution in all directions.

Though ppi is better still, since it allows a simple and direct comparison across displays of different sizes and aspect ratios.
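
A quick Python scratchpad with the same figures, for anyone who wants to check the arithmetic (my own, nothing clever):

import math

def ppi(w, h, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(w, h) / diagonal_in

linear = 5120 / 3840                      # 1D resolution ratio
total = (5120 * 2880) / (3840 * 2160)     # pixel-count ratio = linear squared
print(f"linear: {linear:.2f}x, pixels: {total:.2f}x, "
      f"{ppi(3840, 2160, 27):.0f} ppi -> {ppi(5120, 2880, 27):.0f} ppi")
# linear: 1.33x, pixels: 1.78x, 163 ppi -> 218 ppi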
 
Last edited:

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
I agree that many will do fine with a 4K 27", but I would quibble with describing 5K as only a small improvement. Apple's 5K res is 5120x2880 = 14.75 MP, while the standard "4K" resolution is 3840x2160 = 8.29 MP. 1.78x as many pixels is substantial, IMO.
C.f. the iMac's move from 2560x1440 to 5120x2880 - or the general industry move from "Full HD" 1920x1080 to "4k UHD" 3840x2160 - both of which were 4x increases in the number of megapixels; next to those, 1.78x is a (relatively) small improvement. Also, for many people, 4k is already into the realm of not being directly aware of 1 pixel artefacts, whereas 1080p/1440p are still clearly pixellated at computer monitor viewing distances.

Yes, I think it odd that people compare camera sensors by the total number of pixels, but compare display panels by the number of pixels along one edge.
Total pixels is a good way of exaggerating the difference between two cameras - especially in photography where resolution would normally be measured in lines per inch.

I'm guessing that the justification for this was that there was no single, standard size or aspect ratio for camera sensors - whereas, when "Full HD" launched, TVs had pretty much standardised on 16:9 so you only needed one dimension.

I think the most directly damning piece of evidence against the 1 arc min cutoff, which is based on 20/20 acuity, was this chart they presented of the acuity of their subjects, which they further say is "in line with expected values for such an age distribution".
...but their sample isn't necessarily representative age-wise (which you'd expect to correlate highly with visual acuity). Only 5 people in the sample were over 50. That's the problem with saying "50 is big enough for a sample" - as soon as you start to analyse by age/gender/ethnicity/etc. you end up with tiny sub-samples and the significance goes out of the window. Whether their main claim is valid, I don't think you can use their data to say anything about "typical" visual acuity.

Anyhow, I think you're getting into spurious precision - the difference between 0.7 arc minutes and 1 arc minute is equivalent to moving your head 6" closer to a desktop display 20" away. When Apple talk about retina vs non-retina they're usually talking about a doubling of ppi (e.g. iPhone 3GS to iPhone 4) vs. a 25% difference between 4k and 5k. "Retina" really is a ball-park/order of magnitude thing.
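
(The 6" figure is just small-angle proportionality - the same-sized detail that subtends 0.7 arcmin at 20" subtends 1 arcmin at 14". A one-liner if anyone wants to fiddle with other numbers:)

# distance at which detail seen at 0.7 arcmin from 20" would subtend 1 arcmin
print(20 * 0.7 / 1.0)   # -> 14.0 inches, i.e. 6" closer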

As soon as I upgraded to Mojave on my 4k 27", I was immediately struck by how much I didn't like how text was displayed
...and would you have noticed the change on a 5k display?
 

meson

macrumors 6502a
Apr 29, 2014
516
511
I am still impressed that my 2011 27” iMac is still a great machine. My basic computer needs are still being met with this machine. With everyone’s input, the potential new Mini sounds interesting.
I’m inclined to think that a mini will be a great fit for you. Just yesterday I finally pulled the 2013 iMac off of my desk at work. With the help of OCLP and newer versions of MacOS, it was serving as a better external display than what we had available otherwise, and as a secondary computer. Booting off of an external SSD, it is still a perfectly fine machine for most day-to-day tasks.

It will be replaced with a mini in the coming weeks. My LG 32UL500 came in yesterday, and it will allow me to reduce down to a single display on my desk. 4k at 32” provides an experience close enough to the “looks like” 1440x900 on my 13” M1 MBP that I feel comfortable using it at full resolution. I picked up the display first to be sure I’ll be happy with the experience.
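
(For anyone weighing a similar move, the comparison that mattered to me was effective "points per inch" - a rough Python sketch with my own numbers, assuming a 16:9 32" panel and the 16:10 13.3" MBP panel:)

import math

def points_per_inch(looks_like_w, looks_like_h, diagonal_in):
    # UI density: "looks like" points along the diagonal / physical diagonal
    return math.hypot(looks_like_w, looks_like_h) / diagonal_in

print(points_per_inch(3840, 2160, 32))    # 32" 4K run at 1:1 -> ~138 pt per inch
print(points_per_inch(1440, 900, 13.3))   # MBP at "looks like" 1440x900 -> ~128 pt per inch
# close enough that text and UI elements end up a similar physical size on both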

The biggest downside to the mini is giving up the Apple display. By the time I add on the display, keyboard, and trackpad, I’m spending iMac money (which I’m sure helps sell a lot of iMacs).
 
  • Like
Reactions: Bodhitree

i486dx2-66

macrumors 6502
Feb 25, 2013
373
417
Also, for many people, 4k is already into the realm of not being directly aware of 1 pixel artefacts, whereas 1080p/1440p are still clearly pixellated at computer monitor viewing distances.
Generalizations based on resolution are nearly meaningless without PPI + distance in the equation.

I have a 42" 4K TV that looks great... from the couch.
My 28" 4K monitor is only so-so from a typical viewing distance.
A 27" 4K monitor would be noticeably better from the same distance, and a 32" 4K noticeably worse.
A 27" 5K monitor, which is both smaller in size and higher in raw pixel count, would be significantly better.

Total pixels is a good way of exaggerating the difference between two cameras - especially in photography where resolution would normally be measured in lines per inch.

I'm guessing that the justification for this was that there was no single, standard size or aspect ratio for camera sensors - whereas, when "Full HD" launched, TVs had pretty much standardised on 16:9 so you only needed one dimension.

Total pixels is exceptionally relevant in digital photography.

A measurement of line pair acuity gives a good representation of how much detail can be resolved with a particular lens or sensor, but without total pixel context it is absolutely of no benefit to everyday questions like "can I print this at 8x10" if I crop it to Christine's face?" or "will the text be readable?".

Also, TV aspect ratios were absolutely NOT standardized when HDTV was launched, and it was a nightmare for anyone in the broadcast or content creation industries. "1080P" isn't always 1920x1080. "480P" could be any of six resolutions or two aspect ratios at eight different frame rates - and that's not even including interlaced modes. The tricks used for framing, editing, compressing, distributing, decoding, scaling, and displaying such a minefield of combinations on the different shapes and types of display devices, with the computing power available at the time, while appearing to make it "just work" to the end user, were a massive technological undertaking.
 

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
The biggest downside to the mini is giving up the Apple display. By the time I add on the display, keyboard, and trackpad, I’m spending iMac money (which I’m sure helps sell a lot of iMacs).

Plus a decent webcam, microphones and speakers — useful for video conferencing, listening to music or watching tv and films. The keyboard and mouse come with the iMac, but not with the Mini. The thing is, if you consider the whole package, the iMac is still a decent deal, and if you’re going to be spending the money anyway you might as well pick the option that gets you the Apple Retina display.

It all comes down to whether you feel you absolutely need a 32” or 27” screen, or whether the iMac’s 24” screen is going to be ok for you.
 

meson

macrumors 6502a
Apr 29, 2014
516
511
Plus a decent webcam, microphones and speakers — useful for video conferencing, listening to music or watching tv and films. The keyboard and mouse come with the iMac, but not with the Mini. The thing is, if you consider the whole package, the iMac is still a decent deal, and if you’re going to be spending the money anyway you might as well pick the option that gets you the Apple Retina display.

It all comes down to whether you feel you absolutely need a 32” or 27” screen, or whether the iMac’s 24” screen is going to be ok for you.
There are quite a few compromises with the mini for sure. My choice to go with a mini and 4k really boiled down to horizontal screen real estate. When I need to have a bunch of reference material open while I work, neither 24” nor 27” will get it done without a second display. After that realization, it was a matter of going with a 32” 4k or an ultrawide monitor. I opted for the 4k for the higher pixel density. It provides the flexibility to jump to a scaled resolution when I don’t need as much info onscreen.

I’ll give continuity camera a try and I tend to use my AirPods for audio and microphone when in the office. If I’m not satisfied, I’ll use my laptop for the occasional virtual meeting.

The mini has long been a curiosity of mine, and I finally have an excuse to give one a go as a complementary machine. I’ll rely on my laptop for teaching and portable use and a mini for productivity.
 
  • Like
Reactions: Bodhitree

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
A measurement of line pair acuity gives a good representation of how much detail can be resolved with a particular lens or sensor, but without total pixel context it is absolutely of no benefit to everyday questions like "can I print this at 8x10" if I crop it to Christine's face?" or "will the text be readable?".
...which you do by working out what the resulting linear resolution would be c.f. some arbitrary standard like 300ppi. For which you need the height and/or width of the image in pixels - not the total number of pixels (in which case you'd need to look up the aspect ratio and do a bunch of maths to work out the linear resolution). "3504x2336" is far more useful than "8 megapixels and a deep dive into the specs to find the aspect ratio" - as is something like "4k" or "1080p" in a context where you can assume the aspect ratio. Megapixels are only really useful for marketing - they give a single figure to "rank" on while also making small increases in linear resolution sound more impressive.
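
(For the 8x10 question the sum really is just pixels over print inches - a quick sketch, with a made-up crop size for the second example:)

def print_ppi(px_w, px_h, print_w_in, print_h_in):
    # the tighter of the two axes is the limiting print resolution
    return min(px_w / print_w_in, px_h / print_h_in)

print(print_ppi(3504, 2336, 10, 8))   # the full frame on a 10x8 -> 292 ppi, comfortably sharp
print(print_ppi(1200, 1500, 8, 10))   # a hypothetical tight portrait crop -> 150 ppi, marginal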

Also, TV aspect ratios were absolutely NOT standardized when HDTV was launched, and it was a nightmare for anyone in the broadcast or content creation industries.
The modern "Full HD (1920x1080)" and "HD ready (1280x720)" formats literally standardised the aspect ratio for new TV sets - but long before then 'widescreen' SD sets had settled on 16:9. There were some early, analogue HDTV formats with 4:3 or 5:3 ratios but they never got any traction and AFAIK were dead by the time the current standards launched. So when you were buying a "720p" or "1080p" device, you never needed to ask the aspect ratio. Heck, it's still hard to get a computer display that isn't 16:9 (which is an awful format for general computer use).

Content formats, and how broadcasters reconcile the various cinema and camera formats with TVs - while also providing a fallback for old/cheap 4:3 sets - are a whole other can of worms, and I'm sure much time was wasted trying to anticipate if/when the early HDTV standards would take off - but that doesn't really affect how new TVs are sold on the mass market.
 

i486dx2-66

macrumors 6502
Feb 25, 2013
373
417
Looks like Samsung just released a new display the "ViewFinity S9", a 27" 5K display to take on the Apple Studio display. If the pricing is reasonable, this could change the equation of this discussion!
 
  • Like
Reactions: DotCom2

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
...but their sample isn't necessarily representative age-wise (which you'd expect to correlate highly with visual acuity). Only 5 people in the sample were over 50. That's the problem with saying "50 is big enough for a sample" - as soon as you start to analyse by age/gender/ethnicity/etc. you end up with tiny sub-samples and the significance goes out of the window. Whether their main claim is valid, I don't think you can use their data to say anything about "typical" visual acuity.
That would only be an issue if you were trying to find age-dependent screen resolutions, which you're not. You're trying to find a single screen resolution for everyone, which means you need a resolution that works for the age group with the best vision*. Thus if you can get good statistical discrimination even when confounded by subjects whose vision isn't the greatest, you'd get even clearer discrimination, with the same sample size, if you restricted it to subjects with excellent vision (e.g., if you did the test just with 20-somethings).

I.e., the fact that you get good statistical discrimination even when including older subjects means the opposite of what you think -- it doesn't mean the sample size needs to be larger, it means it could actually be smaller.

*By analogy, for audio, you want to design for at least a 20 Hz - 20 kHz hearing range, even though older folks may be limited to 12 kHz.
Anyhow, I think you're getting into spurious precision - the difference between 0.7 arc minutes and 1 arc minute is equivalent to moving your head 6" closer to a desktop display 20" away.
You're being logically inconsistent. If it's not "spurious precision" for you to suggest 1 arc minute as a cutoff of diminishing returns, it's not "spurious precision" for me to suggest it's 0.7 arc minutes. That's because when you say "about 1 arc minute" you don't mean "1 arc minute" as in "1 or 2 arc minutes", you mean about 1.0 arc minutes -- which is no more "precise" than my "about 0.7 arc minutes".
When Apple talk about retina vs non-retina they're usually talking about a doubling of ppi (e.g. iPhone 3GS to iPhone 4) vs. a 25% difference between 4k and 5k. "Retina" really is a ball-park/order of magnitude thing.
Nope, since if the difference between 4k and 5k (160 ppi vs 220 ppi) were immaterial to Apple, they would offer 160 ppi Retina external displays, which they don't and probably never will.
...and would you have noticed the change on a 5k display?
Not sure, but in comparing Snow Leopard (has subpixel AA) on the Retina display on my 2014 MBP with Monterey (no subpixel AA) on the Retina display on my 2019 iMac, I don't see a striking difference, indicating the loss of subpixel AA has much less effect with Retina displays, so there's a good chance the answer would be no.
 

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
That would only be an issue if you were trying to find age-dependent screen resolutions, which you're not. You're trying to find a single screen resolution for everyone, which means you need a resolution that works for the age group with the best vision*.
Which may or may not have been what the study set out to do, but this discussion is in finding a compromise between resolution and cost/technical complexity. Nobody is denying that 5k is "better" than 4k and if cost was no object then, yes, you'd pick "a resolution that works for the age group with the best vision" (which the study is saying would be > 1000ppi for a phone screen). But since higher resolutions are insanely expensive, the practical question is "what's good for 20/20 vision" since that is widely accepted as "normal" vision. Feel free to buy an 8k display if you have excellent eyesight.

...anyway, this whole argument helps show why having a headless computer that lets you choose your preferred display is a better idea than an all-in-one (although it's probably a non-starter for a phone :)).

That's because when you say "about 1 arc minute" you don't mean "1 arc minute" as in "1 or 2 arc minutes", you mean about 1.0 arc minutes
No, because "1.0" implies "to 1 decimal place" and - if you really want to pick that particular nit - "about 1" would imply "1 to the nearest integer" - and 0.7 to the nearest integer is 1.

Point is you've got at least a 30% uncertainty in preferred viewing distance and probably 100% uncertainty in how visual acuity/resolution quantises to pixels per inch (resolution in optics is the ability to distinguish two objects as separate - so to resolve 100 lines-per-inch you'd need 200 pixels-per-inch, but whether that counts as being able to see pixellation on a computer display is a massive hand wave) so, at best, the "retina" rule is going to be order-of-magnitude.

Nope, since if the difference between 4k and 5k (160 ppi vs 220 ppi) were immaterial to Apple, they would offer 160 ppi Retina external displays, which they don't and probably never will.
Now that is flawed logic, since there are other perfectly good reasons for Apple not making 160ppi displays. First, the 5k display is a sweet spot for Mac because it was exactly 2x the linear resolution of the previous 2560x1440 iMac display - and MacOS doesn't have a fully scalable UI so anything else needs non-integer scaling. Second, there are 101 perfectly good competing ~160ppi displays on the market, whereas Apple can sell 220ppi displays at a hefty premium. Third, Apple has a long tradition, going back to the 1980s, of making different-sized displays with fixed PPI (it used to be 72ppi).

Looks like Samsung just released a new display the "ViewFinity S9", a 27" 5K display to take on the Apple Studio display. If the pricing is reasonable, this could change the equation of this discussion!

More competition can only be good news - and the fact that it has multiple inputs and a stand which at least can pivot (probably with height adjustment and VESA support, but I don't think that's confirmed) starts to tick off my beefs with the Studio Display. I wouldn't get my hopes up too high about price, though - the only 5k competition starts at $1200 for the LG Ultrafine 5k and the Samsung looks a lot better. I'd guess that would be about the price point of the Samsung - we'll see.
 

i486dx2-66

macrumors 6502
Feb 25, 2013
373
417
More competition can only be good news
Yes, absolutely! 👍

I'm optimistic about the pricing. If the Apple Studio Display is starting at $1599, and the LG $1167 (current Amazon price), Samsung has to decide if they slot between those two or if they will try to undercut them both.

If they undercut both, that would be amazing, and would instantly capture a good chunk of the market going forward. Let's imagine a $1050 price just for discussion. It would be 2/3 the cost of Apple, and substantially better looking aesthetically than the comparably priced LG, so it would be a no-brainer to pick up the Samsung.

If they slot the pricing in-between, they would have to anchor it closer to the LG than the Apple. If the price wasn't appreciably lower than Apple, people would just buy the first-party ASD. And Samsung wouldn't want the LG to be significantly cheaper, or the majority of the cost-conscious buyers would go LG instead. So something like $1250 might be on the table.


Now of course there is an argument to be made that the ViewFinity S9, having a matte display, is really in competition with the $1899 Nano-Texture ASD, not the $1599 "standard" (glossy) version, and should slot above the glossy LG as well. That's the wildcard. If this was the case, then even a price of $1400-1500 could be considered competitive. But Apple's advertising revolves around that "From $1599" price. With only one offering instead of two, I don't think Samsung can push it, or waves of potential buyers will overlook the S9 before even reading the specs.
 

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
I don't get it. Why is the LG so much more expensive in the US than in Europe?
Prices here in the UK seem to be all over the place - but my usual "reliable" supplier is still listing it at just under £1200 (but not in stock) including VAT so I don't think the list price has changed. However, it's an old product (and not made by Apple) so there may be discounting going on.
 