I am still impressed that my 2011 27” iMac is such a great machine. My basic computer needs are still being met with it. With everyone’s input, the potential new Mini sounds interesting.
Would an M2 iMac 24" change the value proposition? Putting an M2 into the 24" iMac, and possibly offering a 27" display option, might make it interesting.
“...you’re looking at 2-3 years of software support at the most before the lights start going out.”

I'm not so sure it will have 2-3 years of software support "at the most." Apple are still selling brand new Intel machines right now, and Apple has a history (and policy) of providing support for a minimum of 5 years from when a computer was last sold (not first sold) by Apple.
For me the clock started on Intel iMacs when the 27" was discontinued in 2021. The 2020 models, I believe, will have support until 2026, plus another 2 years of security updates, if Apple continue software support until the model range itself is marked as vintage (which takes 5 years).
Sure, one may not get the latest software features, but that applies to any product. Apple has a history of providing at least two major (yearly) MacOS version upgrades for each product, and of supporting each version of MacOS for 3 years.
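Just to put rough numbers on that track record (back-of-the-envelope only, treating the figures above as rules of thumb rather than anything Apple has committed to, and the 2023 purchase year is just an example):

```python
# Rule-of-thumb sketch, not Apple policy: assume a machine bought new gets
# at least two more major macOS versions, and each macOS version then gets
# roughly three years of updates.
def rough_end_of_updates(year_bought, extra_major_versions=2, years_per_version=3):
    last_supported_macos_year = year_bought + extra_major_versions
    return last_supported_macos_year + years_per_version

print(rough_end_of_updates(2023))   # -> 2028 for a hypothetical Intel Mac bought new in 2023
```

Which is rather more than "2-3 years at the most", if the pattern holds.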
Of course, this may change, but someone buying a new Intel Mac mini or a new Intel Mac Pro today would be pretty annoyed to find the "lights start going out" on their brand new machine in 2 years.
However, your point is taken that there is a higher risk of support being dropped earlier on an Intel machine.
We shall see!
“300 mm is 59% of 20", so those correspond to being able to distinguish 200 ppi vs. 300 ppi, and 300 ppi vs. 600 ppi, at 20", respectively (same angular resolutions). And for those like me who often lean in closer, to a 15" viewing distance, it's 270 ppi vs. 400 ppi, and 400 ppi vs. 800 ppi.”

I don't doubt that - although it's ridiculously hard to test it properly and objectively (the article you linked to is paywalled, so I can't see the detail, but unless it's double-blind with a proper control and a clear explanation of what they meant by "simulated" or "discriminate" then forget it - and 50 subjects is a bit small for that).
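For anyone who wants to check the rescaling, it's just a linear scale with viewing distance (the ppi pairs are the ones quoted above; the 267 here versus the 270 above is just rounding):

```python
# A pixel pitch that subtends a fixed visual angle scales linearly with
# viewing distance, so a ppi figure rescales by dist_ref / dist_new.
def equivalent_ppi(ppi_ref, dist_ref, dist_new):
    return ppi_ref * dist_ref / dist_new

# ppi pairs quoted above for a 20" viewing distance, rescaled to 15"
for low, high in [(200, 300), (300, 600)]:
    print(f'{low} vs {high} ppi at 20" is equivalent to '
          f'{equivalent_ppi(low, 20, 15):.0f} vs {equivalent_ppi(high, 20, 15):.0f} ppi at 15"')
# 200 vs 300 ppi at 20" is equivalent to 267 vs 400 ppi at 15"
# 300 vs 600 ppi at 20" is equivalent to 400 vs 800 ppi at 15"
```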
“Aside: on the sub-pixel anti-aliasing thing - that only works if the OS knows the actual layout of the RGB sub-pixels on the display and can reliably control them by tweaking the colour of pixels.”

Like how people were rotating their monitors 180 degrees to fix broken sub-pixel anti-aliasing for BGR monitors? 🙃
Sorry about the paywall. The test was double-blind. Since this is a simple binary test—'do you see the difference or do you not?'—50 subjects are enough to get good statistical significance if the results are sufficiently clear-cut. The real issues lie elsewhere.
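To illustrate why 50 subjects can be plenty when the result is clear-cut (the 40-out-of-50 outcome below is made up for the example, not the study's actual result):

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Exact P(X >= k) for X ~ Binomial(n, p): the chance of k or more
    'correct' answers if every subject were purely guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical clear-cut outcome: 40 of 50 subjects pick the higher-resolution
# sample in a two-alternative test where guessing would be right 50% of the time.
print(p_at_least(40, 50))   # ~1.2e-05, far below any usual significance threshold
```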
“As I said, the "retina" theory itself is a massive hand-wave and certainly isn't a magic threshold beyond which the display becomes "perfect". Since optical resolution is usually measured as the ability to separate two objects separated by distance x, you could easily argue that you therefore need 1/x separate dots or lines per inch, and that corresponds to 2/x pixels per inch - and, of course, in reality the viewing distance is as long as a piece of string.”

I think most understand that the cutoff isn't a magic threshold, and should instead be considered more as a person-dependent cutoff band. I'm thus inclined to criticize the 1 arc min cutoff on a different basis—not that it's hand-waving, but that it's probably wrong.
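To put numbers on the two readings of the rule, here's the arithmetic at an assumed 20" viewing distance (my figures, just to show how far apart the answers land depending on whether you apply the factor of two and whether you pick 1.0 or 0.7 arc minutes):

```python
import math

ARCMIN = math.radians(1 / 60)   # one arc minute, in radians

def ppi_for_angle(arcmin, distance_in, nyquist=False):
    """ppi at which one pixel subtends `arcmin` at `distance_in` inches.
    With nyquist=True, demand two pixels per resolvable line pair instead."""
    pitch = distance_in * math.tan(arcmin * ARCMIN)   # inches per pixel
    ppi = 1 / pitch
    return 2 * ppi if nyquist else ppi

for arcmin in (1.0, 0.7):
    print(f"{arcmin} arc min at 20\": "
          f"{ppi_for_angle(arcmin, 20):.0f} ppi, "
          f"or {ppi_for_angle(arcmin, 20, nyquist=True):.0f} ppi with the factor of two")
# 1.0 arc min at 20": 172 ppi, or 344 ppi with the factor of two
# 0.7 arc min at 20": 246 ppi, or 491 ppi with the factor of two
```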
“I think the best interpretation of "retina"/"the 1 arc minute rule" is as a point beyond which you'll get rapidly diminishing returns from (expensively) increasing the resolution further.”

I don't think you can interpret it that way, since we don't yet know where the "knee" is. It could be that, for most people, 0.7 arc min is the point beyond which you'll get rapidly diminishing returns. And of course each individual needs to decide for themselves.
“I can certainly see that the display on my old 5k iMac is somehow "crisper" than my 4k+ display - but then the brightness, contrast, gamut and screen texture are all completely different, too. Apple's screen coating really whacks up the contrast c.f. the Mateview's matte finish in ways that have nothing to do with PPI or "retina". That doesn't mean I'm conscious of pixellation on the 4k display. The question is whether it's worth paying 3-4 times the price for that extra bit of quality. Unless you're a YouTuber you don't buy displays to do A/B comparisons with competing models.” [emphasis mine]

Calling it merely an "extra bit" is begging the question of whether it really is just an extra bit. For me, it isn't. As soon as I upgraded to Mojave on my 4k 27", I was immediately struck by how much I didn't like how text was displayed, and I couldn't understand why it was worse until I later learned about subpixel AA. So you could call that an anecdotal blind test. I had no anticipation that there would be a difference, yet I noticed it—clearly.
It's that plus at least one other thing: it also doesn't work if you're using transparency. Some say that's part of the reason they got rid of it. But I think there are answers to both the RGB and transparency issues.
Aside: on the sub-pixel anti-aliasing thing - that only works if the OS knows the actual layout of the RGB sub-pixels on the display and can reliably control them by tweaking the colour of pixels. There are 101 things that can break that - I had an old display that older MacOS versions would mis-identify as YPbPr rather than RGB, which only really mattered because it broke SPAA and gave all the text a rainbow halo. OLED and other displays have different sub-pixel (not even RGB) layouts, and I guess any sort of colour correction in the monitor, compression (AirPlay, DisplayLink devices, maybe even DP 1.4 DSC), scaling or local dimming could mess up SPAA. The only systems on which Apple could ever guarantee that SPAA would work would be Apple Displays - but all of Apple's supported Macs, iDevices and displays now have "retina" screens that don't need SPAA, the timing of which seems to correspond with dropping SPAA from MacOS. Sure, Apple could have left the option in with an 'if it breaks, tough' caveat - but, hey, this is Apple we're talking about.
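A toy sketch of why the physical strip order matters so much (deliberately simplified, and not how macOS actually implemented SPAA):

```python
# Toy model of sub-pixel anti-aliasing on one row of pixels, assuming a
# stripe panel whose sub-pixels run R, G, B from left to right.

def render_edge(edge_pos, n_pixels=4):
    """White-to-black vertical edge at `edge_pos` (in pixel units), rendered
    by treating each of the three colour strips as its own spatial sample."""
    pixels = []
    for px in range(n_pixels):
        channels = []
        for sub in range(3):                          # R, G, B strips
            left, right = px + sub / 3, px + (sub + 1) / 3
            white = min(max(edge_pos - left, 0), right - left) / (right - left)
            channels.append(round(255 * white))       # fraction of the strip still on the white side
        pixels.append(tuple(channels))
    return pixels

print(render_edge(1.5))   # [(255, 255, 255), (255, 128, 0), (0, 0, 0), (0, 0, 0)]
# On an RGB-stripe panel, pixel 1 fades smoothly across its three strips.
# Send the same values to a BGR panel (or any layout the OS guessed wrong)
# and the lit strips are mirrored, which is exactly the red/blue fringing
# people were "fixing" by turning the monitor upside down.
```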
“That said - on a 27" screen 5k is better than 4k, but it's a lot of money for a small improvement. I think people who describe 4k as unusably blurry are being a little bit precious - but it's their money.”

I agree that many will do fine with a 4K 27", but I would quibble with describing 5K as only a small improvement. Apple's 5K res is 5120x2880 = 14.75 MP, while the standard "4K" resolution is 3840x2160 = 8.29 MP. 1.78x as many pixels is substantial, IMO.
Yes, I think it odd that people compare camera sensors by the total number of pixels, but compare display panels by the number of pixels along one edge.
For displays, I personally prefer using the horizontal (1D) number of pixels, because (for the same size and aspect ratio) that's what the resolution (by which I mean the ppi) is proportional to.
5K vs 4K display is roughly equivalent to a 21 MP camera vs a 12 MP camera (same ratio of pixels), which many people would consider a significant step up.
C.f. the iMac's move from 2560x1440 to 5120x2880 - or the general industry move from "Full HD" 1920x1080 to "4k UHD" 3840x2160 - both of which were 4x increases in the number of megapixels. Compared to those, 1.78x is a (relatively) small improvement. Also, for many people, 4k is already into the realm of not being directly aware of 1 pixel artefacts, whereas 1080p/1440p are still clearly pixellated at computer monitor viewing distances.
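For reference, the ratios being argued about:

```python
def mp(w, h):
    return w * h / 1e6   # megapixels

print(mp(5120, 2880) / mp(2560, 1440))   # 4.0   -- old 1440p iMac to 5K iMac
print(mp(3840, 2160) / mp(1920, 1080))   # 4.0   -- Full HD to 4K UHD
print(mp(5120, 2880) / mp(3840, 2160))   # ~1.78 -- 4K UHD to 5K
```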
Total pixels is a good way of exaggerating the difference between two cameras - especially in photography where resolution would normally be measured in lines per inch.
“I think the most directly damning piece of evidence against the 1 arc min cutoff, which is based on 20/20 acuity, was this chart they presented of the acuity of their subjects, which they further say is "in line with expected values for such an age distribution".”

...but their sample isn't necessarily representative age-wise (which you'd expect to correlate highly with visual acuity). Only 5 people in the sample were over 50. That's the problem with saying "50 is big enough for a sample" - as soon as you start to analyse by age/gender/ethnicity/etc. you end up with tiny sub-samples and the significance goes out of the window. Whether or not their main claim is valid, I don't think you can use their data to say anything about "typical" visual acuity.
“As soon as I upgraded to Mojave on my 4k 27", I was immediately struck by how much I didn't like how text was displayed.”

...and would you have noticed the change on a 5k display?
I’m inclined to think that a mini will be a great fit for you. Just yesterday I finally pulled the 2013 iMac off of my desk at work. With the help of OCLP and newer versions of MacOS, it was serving as a secondary computer and a better external display than anything else we had available. Booting off of an external SSD, it is still a perfectly fine machine for most day-to-day tasks.
Generalizations based on resolution are nearly meaningless without PPI + distance in the equation.
I'm guessing that the justification for quoting camera sensors in total megapixels was that there was no single, standard size or aspect ratio for camera sensors - whereas, when "Full HD" launched, TVs had pretty much standardised on 16:9, so you only needed one dimension.
The biggest downside to the mini is giving up the Apple display. By the time I add on the display, keyboard, and trackpad, I’m spending iMac money (which I’m sure helps sell a lot of iMacs).
“Plus a decent webcam, microphones and speakers — useful for video conferencing, listening to music or watching tv and films. The keyboard and mouse come with the iMac, but not with the Mini. The thing is, if you consider the whole package, the iMac is still a decent deal, and if you’re going to be spending the money anyway you might as well pick the option that gets you the Apple Retina display.”

There are quite a few compromises with the mini for sure. My choice to go with a mini and 4k really boiled down to horizontal screen real estate. When I need to have a bunch of reference material open while I work, neither 24” nor 27” will get it done without a second display. After that realization, it was a matter of going with a 32” 4k or an ultrawide monitor. I opted for the 4k for the higher pixel density. It provides the flexibility to jump to a scaled resolution when I don’t need as much info onscreen.
It all comes down to whether you feel you absolutely need a 32” or 27” screen, or whether the iMac’s 24” screen is going to be ok for you.
“A measurement for line pair acuity gives a good representation of how much detail can be resolved with a particular lens or sensor, but without total pixel context, is absolutely of no benefit to everyday questions like "can I print this at 8x10" if I crop it to Christine's face?" or "will the text be readable".”

...which you do by working out what the resulting linear resolution would be c.f. some arbitrary standard like 300ppi. For that you need the height and/or width of the image in pixels - not the total number of pixels (in which case you'd need to look up the aspect ratio and do a bunch of maths to work out the linear resolution). "3504x2336" is far more useful than "8 megapixels and a deep dive into the specs to find the aspect ratio" - as is something like "4k" or "1080p" in a context where you can assume the aspect ratio. Megapixels are only really useful for marketing - they give a single figure to "rank" on while also making small increases in linear resolution sound more impressive.
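That "bunch of maths" looks something like this (assuming a 3:2 aspect ratio, which is what 3504x2336 is, and 300ppi as the arbitrary print standard):

```python
import math

def dimensions_from_megapixels(megapixels, aspect_w, aspect_h):
    """Recover approximate pixel dimensions from a megapixel count plus an
    aspect ratio you had to go and look up in the spec sheet."""
    total = megapixels * 1e6
    height = math.sqrt(total * aspect_h / aspect_w)
    return round(total / height), round(height)

w, h = dimensions_from_megapixels(8.2, 3, 2)   # the "8 megapixel" example above
print(w, h)                # ~3507 x 2338, i.e. roughly the 3504x2336 you could just have been told
print(w / 300, h / 300)    # ~11.7 x 7.8 -- the biggest print, in inches, at 300 ppi
```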
The modern "Full HD (1920x1080)" and "HD ready (1280x720)" formats literally standardised the aspect ratio for new TV sets - but long before then 'widescreen' SD sets had settled on 16:9. There were some early, analogue HDTV formats with 4:3 or 5:3 ratios but they never got any traction and AFAIK were dead by the time the current standards launched. So when you were buying a "720p" or "1080p" device, you never needed to ask the aspect ratio. Heck, it's still hard to get a computer display that isn't 16:9 (which is an awful format for general computer use).Also, TV aspect ratios were absolutely NOT standardized when HDTV was launched, and it was a nightmare for anyone in the broadcast or content creation industries.
That would only be an issue if you were trying to find age-dependent screen resolutions, which you're not. You're trying to find a single screen resolution for everyone, which means you need a resolution that works for the age group with the best vision*. Thus if you can get good statistical discrimination even when confounded by subjects whose vision isn't the greatest, you'd get even clearer discrimination, with the same sample size, if you restricted it to subjects with excellent vision (e.g., if you did the test just with 20-somethings).
“Anyhow, I think you're getting into spurious precision - the difference between 0.7 arc minutes and 1 arc minute is equivalent to moving your head 6" closer to a desktop display 20" away.”

You're being logically inconsistent. If it's not "spurious precision" for you to suggest 1 arc minute as a cutoff of diminishing returns, it's not "spurious precision" for me to suggest it's 0.7 arc minutes. That's because when you say "about 1 arc minute" you don't mean "1 arc minute" as in "1 or 2 arc minutes", you mean about 1.0 arc minutes -- which is no more "precise" than my "about 0.7 arc minutes".
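For what it's worth, the "6 inches closer" figure in the quote does check out under the small-angle approximation (20" being the distance quoted):

```python
# The angle a pixel subtends is inversely proportional to viewing distance,
# so a display built for 0.7 arc min at 20" looks like a 1.0 arc min display
# viewed from 0.7 * 20" = 14".
desk_distance = 20                        # inches, as quoted above
equivalent = 0.7 / 1.0 * desk_distance    # 14.0
print(desk_distance - equivalent)         # 6.0 inches closer
```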
“When Apple talk about retina vs non-retina they're usually talking about a doubling of ppi (e.g. iPhone 3GS to iPhone 4) vs. a 25% difference between 4k and 5k. "Retina" really is a ball-park/order of magnitude thing.”

Nope, since if the difference between 4k and 5k (160 ppi vs 220 ppi) were immaterial to Apple, they would offer 160 ppi Retina external displays, which they don't and probably never will.
“...and would you have noticed the change on a 5k display?”

Not sure, but in comparing Snow Leopard (has subpixel AA) on the Retina display on my 2014 MBP with Monterey (no subpixel AA) on the Retina display on my 2019 iMac, I don't see a striking difference, indicating the loss of subpixel AA has much less effect with Retina displays. So there's a good chance the answer would be no.
“That would only be an issue if you were trying to find age-dependent screen resolutions, which you're not. You're trying to find a single screen resolution for everyone, which means you need a resolution that works for the age group with the best vision*.”

Which may or may not have been what the study set out to do, but this discussion is about finding a compromise between resolution and cost/technical complexity. Nobody is denying that 5k is "better" than 4k, and if cost were no object then, yes, you'd pick "a resolution that works for the age group with the best vision" (which the study is saying would be > 1000ppi for a phone screen). But since higher resolutions are insanely expensive, the practical question is "what's good for 20/20 vision", since that is widely accepted as "normal" vision. Feel free to buy an 8k display if you have excellent eyesight.
No, because "1.0" implies "to 1 decimal place" and - if you resally want to pick that particular nit - "about 1" would imply "1 to the nearest integer" - and 0.7 to the nearest integer is 1.That's because when you say about "1 arc minute" you don't mean "1 arc minute" as in "1 or 2 arc minutes", you mean about 1.0 arc minutes
Now that is flawed logic, since there are other perfectly good reasons for Apple not making 160ppi displays. First, the 5k display is a sweet spot for Mac because it was exactly 2x the linear resolution of the previous 2560x1440 iMac display - and MacOS doesn't have a fully scalable UI so anything else needs non-integer scaling. Second, there are 101 perfectly good competing ~160ppi displays on the market, whereas Apple can sell 220ppi displays at a hefty premium. Third, Apple has a long tradition, going back to the 1980s, of making different-sized displays with fixed PPI (it used to be 72ppi).
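To spell out the non-integer scaling point, here's a rough model of how the HiDPI modes behave as I understand them (simplified, and it ignores the low-resolution 1x modes):

```python
# The UI is drawn into a backing store at 2x the "looks like" size, then
# resampled to the panel's native pixels.
def hidpi(panel, looks_like):
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    scale = panel[0] / backing[0]
    return backing, scale

for name, panel in [('5K 27"', (5120, 2880)), ('4K 27"', (3840, 2160))]:
    backing, scale = hidpi(panel, (2560, 1440))       # "looks like 2560x1440"
    kind = "integer, pixel-perfect" if scale == 1 else "non-integer resample"
    print(f"{name}: draw at {backing[0]}x{backing[1]}, scale by {scale:g} ({kind})")
# 5K 27": draw at 5120x2880, scale by 1 (integer, pixel-perfect)
# 4K 27": draw at 5120x2880, scale by 0.75 (non-integer resample)
```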
Looks like Samsung just released a new display, the "ViewFinity S9", a 27" 5K display to take on the Apple Studio Display. If the pricing is reasonable, this could change the equation of this discussion!
“The only 5K competition starts at $1200 for the LG UltraFine 5K and the Samsung looks a lot better.”

I don't get it. Why is the LG so much more expensive in the US than in Europe?
“More competition can only be good news.”

Yes, absolutely! 👍
Prices here in the UK seem to be all over the place - but my usual "reliable" supplier is still listing it at just under £1200 (but not in stock) including VAT, so I don't think the list price has changed. However, it's an old product (and not made by Apple) so there may be discounting going on.
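Also worth remembering when comparing the headline numbers: UK prices include 20% VAT, while US prices are quoted before sales tax. A rough sketch (the exchange rate below is purely illustrative):

```python
uk_inc_vat_gbp = 1200    # GBP, the UK listing mentioned above (includes 20% VAT)
us_price_usd = 1200      # USD, the US figure mentioned above (before sales tax)
usd_per_gbp = 1.25       # assumption, for illustration only

uk_ex_vat_gbp = uk_inc_vat_gbp / 1.20
print(uk_ex_vat_gbp)                                      # 1000.0 GBP before VAT
print(uk_ex_vat_gbp * usd_per_gbp, "vs", us_price_usd)    # 1250.0 vs 1200 -- same ballpark
```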