Aside from the first sentence, this is just a nonsensical statement.


LOL, well, whether you understand it or not, you seem to be drawing the wrong conclusion. I'd suggest you check out the Apple Developer article explaining HiDPI. Also, it's actually 1 point in user space that represents the 4 pixels, which would normally be a pedantic distinction, but in this case it might help you understand the concepts better.
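
Since the point-versus-pixel distinction seems to be tripping people up, here's a trivial sketch of the arithmetic in plain Python (just an illustration, not Apple's API):

```python
# Rough sketch: at a 2x backing scale factor, one point in user space
# maps onto a 2x2 block of physical pixels, i.e. 4 pixels per point.
BACKING_SCALE = 2.0  # the "Retina"/HiDPI case

def point_to_pixels(x_pt, y_pt, scale=BACKING_SCALE):
    """Return the pixel-space position of a user-space point and how many pixels it covers."""
    return x_pt * scale, y_pt * scale, scale * scale

print(point_to_pixels(100, 100))  # (200.0, 200.0, 4.0) -> 4 pixels per point at 2x
```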

You are simply wrong about this. Have you actually used a 4K display and tried the various HiDPI settings? If you had, you'd know that isn't true. Because points in HiDPI can be expressed as floating points, it doesn't have to be "perfect multiples".

Think of it this way...
  1. Take a screenshot of a desktop on a 27" 5K iMac (5120x2880) using HiDPI 1440p.
  2. Make two copies, and using Photoshop, scale one copy down to 3840x2160 and the other down to 2560x1440.
  3. Display the original screenshot on the 5K iMac, the 4K reduction on a 27" 4K display and the 1440 reduction on a 27" native 1440p display respectively.
  4. The results, in order of best looking, are: 5K iMac, 4K display, 1440p display.
If you want to insist that the native 1440p looks better than the non-"perfectly" scaled 4K, you're entitled to see it that way, but that's kind of like sticking your head in the sand.
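
If anyone wants to try steps 2 and 3 without Photoshop, here's a quick Python/Pillow sketch of the reductions (the filenames are just placeholders; Lanczos resampling is roughly comparable to Photoshop's bicubic reduction):

```python
# Sketch of step 2 above: scale a 5120x2880 screenshot down to 4K and to 1440p.
# Requires Pillow (pip install Pillow); "5k_desktop.png" is a placeholder filename.
from PIL import Image

src = Image.open("5k_desktop.png")                                   # 5120x2880 capture from the 5K iMac
src.resize((3840, 2160), Image.LANCZOS).save("4k_reduction.png")     # for the 27" 4K display
src.resize((2560, 1440), Image.LANCZOS).save("1440p_reduction.png")  # for the 27" native 1440p display
```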

I really wouldn't care all that much to carry this on, but there are so many users just finding out about 4K, and they read this stuff, and misinformation just gets spread over and over again.

Starting with the aRGB, you've been offering a lot of suspect advice in this thread and being rather pushy about it. :rolleyes:
okay you clearly have no clue what you are talking.
 
I'm not suggesting you need to purchase one of those. In fact, to have a really robust solution, that's a small part of the overall cost of the setup. There are some advantages to a quality display on its own, and you should consider a few factors when deciding on a display purchase. Here's my list.

Warmup time, uniformity, viewing angles, measured color temperature and its deviation from D65 (don't try to tweak this via profile creation), shadow detail.

What I was saying before about sRGB vs Adobe RGB is that the results of a patch test for a display and accompanying matrix profile may not strongly correlate with display gamut. There are quite a few colors that fall either inside sRGB or outside Adobe RGB, where the thing of importance is the deviation between the color actually measured and the intended color relative to the coordinate basis used. I had to be very specific about that, because Adobe RGB isn't exactly a scaled superset of sRGB. Edit: Rewriting this part: by different topologies, I meant that if you look at a geometric representation of their gamuts in a profile viewer, you will see different shapes, rather than the same shape at different sizes.
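
To be concrete about what I mean by "deviation": the usual way to express it is a delta-E value between the intended and the measured color in Lab space. A minimal sketch using the simple CIE76 formula (the Lab values below are made up, not real measurements):

```python
# Minimal CIE76 delta-E sketch: Euclidean distance between two Lab colors.
# The Lab triples are made-up example values, not actual patch measurements.
import math

def delta_e_76(lab_intended, lab_measured):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab_intended, lab_measured)))

intended = (50.0, 20.0, -30.0)   # hypothetical target patch (L*, a*, b*)
measured = (50.8, 18.9, -29.2)   # hypothetical instrument reading
print(round(delta_e_76(intended, measured), 2))  # ~1.58; below ~2 is generally hard to see
```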

I'm going a little further with this than I intended, but the criteria I mentioned above are things that do differ between displays, and they relate to things where there is a measurable benefit without having a completely fleshed out system in place.

Which monitor are you using/like now?
 
Ya, that's true, but talking about the differences in response time caused by the software or other technical points is really outside the scope of this discussion; I would advise the OP to buy either, depending on which is cheaper. If they were 600-700 dollars and had G-Sync I would buy one as well.

I meant that I would look at reviews on both, not just figure they use the same panel.

Which monitor are you using/like now?

I'm using one that I've had for a long time. It's an Eizo CG243W. Note I still deal with graphics a little bit, but mostly in terms of programming them these days. I bought that display for my old work as a replacement for an older NEC 2190UXi.
 
I meant that I would look at reviews on both, not just figure they use the same panel.

The reviews are good for both. The differences are basically with the stand and the UI; picture quality is for all intents and purposes identical because they use the exact same AU Optronics panel.
 
okay you clearly have no clue what you are talking.

Strong statement and one that is very poorly worded from a grammatical point of view. I would suggest that you try to back it up with something with a bit more substance.
 
I disagree. I am curious to know why you think that he doesn't know what he is talking about.

He said:

it's actually 1 point in user-space that represents the 4 pixels
If the source image is 1080p then 1 point is 1 pixel and it becomes 4 pixels under 4k HiDPI.

Because points in HiDPI can be expressed as floating points, it doesn't have to be "perfect multiples".
That is just regular scaling, which creates a more blurry image. HiDPI is specifically only for 4x multiples of resolution: 5k:1440p, 4k:1080p or even 1440p:720p.

To recap he doesn't understand the difference between regular scaling and what makes Apple's HiDPI special.
 
He said:


If the source image is 1080p then 1 point is 1 pixel and it becomes 4 pixels under 4k HiDPI.


That is just regular scaling, which creates a more blurry image. HiDPI is specifically only for 4x multiples of resolution: 5k:1440p, 4k:1080p or even 1440p:720p.

To recap he doesn't understand the difference between regular scaling and what makes Apple's HiDPI special.
I’ve attempted to be patient and cordial about this, but this isn’t a matter of opinion. You're wrong about the scaling. Period.

It’s difficult to respond specifically to what you’re saying because you’re vague and don't address the way the technology actually works, and your general understanding is based on a misconception that because something is scaled, it inherently creates a more blurry version of that image. In this case, whether it appears blurry or not is dependent on the fidelity of the scaling and the screen’s resolution.

On traditional “low-res” monitors, the OS composites the desktop at whatever resolution you've selected (whether or not it matches the native resolution of the display) and sends it to the GPU, which outputs it at the composited resolution to the display. The display then scales it as best it can to fit its native resolution. The display has a simple scaling processor that interpolates between the desktop resolution and the physically limited low resolution of the display. If the desktop resolution is at an aspect ratio that doesn’t match the display’s, it gets distorted. The issue is well understood and quite apparent.

But that’s not how OS X HiDPI works (neither does Windows DPI scaling, but it takes a different approach). The way OS X HiDPI works is by compositing the desktop at twice its native size to an offscreen buffer (the backing store), and then scales it back down again to fit the native resolution of the display. The display itself is not doing the scaling. When using HiDPI resolutions, OS X uses HiDPI graphic resources (e.g. icons, as supplied by OS X and the apps themselves) to composite the desktop image.

We can actually see this at work by taking a screen grab with the OS X utility “Grab”, which captures the offscreen buffer’s composited image before it’s scaled. The screen captures are at twice the resolution of the HiDPI scaling, e.g. a 2560x1440 HiDPI setting will result in a screen capture image that is 5120x2880 pixels. If we switch to a “low resolution” (i.e. non-scaled) setting of 2560x1440, we get the display scaling that results in sub-optimal output (though this still looks better than display scaling on a low-res display, because the 4K display has more pixels to work with and therefore pixel interpolation is slightly less of a problem). If we take a screen grab of that, it’s at the expected 2560x1440 resolution. We can compare it to the HiDPI version, and the differences are fairly dramatic. Now, we’re comparing the HiDPI composite at twice the desktop resolution (5120x2880) to the low-res (2560x1440). Keep in mind that OS X is using the HiDPI versions of the graphic resources to composite the HiDPI desktop, so not only are there twice the pixels, but the resources are twice the fidelity (assuming they’ve been designed to take advantage of HiDPI).

Now, we don’t actually “see” the 5120x2880 composited image on our 4K display, since the 4K display is only 3840x2160. OS X scales down that 5120x2880 image to 3840x2160 to make it fit. That scaled down 3840x2160 (that is a scaled 2560x1440 HiDPI desktop) will still look better than a 2560x1440 version of that same desktop on a “native” 2560x1440 display.
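
To put rough numbers on that pipeline (this is just back-of-the-envelope arithmetic, not anything Apple publishes):

```python
# Back-of-the-envelope sketch of the HiDPI pipeline described above.
def hidpi_pipeline(hidpi_w, hidpi_h, panel_w, panel_h):
    # 1. OS X composites the desktop at 2x the chosen HiDPI size (the backing store).
    backing = (hidpi_w * 2, hidpi_h * 2)
    # 2. The OS (not the display) then scales that buffer to the panel's native pixels.
    scale = panel_w / backing[0]
    return backing, scale

backing, scale = hidpi_pipeline(2560, 1440, 3840, 2160)
print(backing)          # (5120, 2880) -- the size a "Grab" screenshot comes out at
print(round(scale, 2))  # 0.75 -- downscale factor from backing store to the 4K panel
```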

Finally, about the “low-res” 1080p image. Again, you’re considering it in the wrong way. Scaling it to a larger size will not make it any sharper. But it won’t make it any less sharp either. On a 4K display, whether it’s being scaled by exactly a factor of 2 or by an oddball factor of 1.8, etc., your perception of its sharpness will remain the same in comparison to the same-sized low-res display. As I pointed out in another thread, what may give the impression of “fuzzy” is the context - when surrounded by super sharp text and HiDPI graphics, the original low-res image can have the perception of being fuzzy, whereas on a low-res display, it doesn’t stand out as much.

I’m guessing your reply would have been something along the lines of “yeah, that’s what I meant all along”. Yeah, sure.
 
I meant that I would look at reviews on both, not just figure they use the same panel.



I'm using one that I've had for a long time. It's an Eizo CG243W. Note I still deal with graphics a little bit, but mostly in terms of programming them these days. I bought that display for my old work as a replacement for an older NEC 2190UXi.

Idk, I think I'm leaning towards the ASUS PA249Q; not overly expensive and seems OK quality.

----------

I’ve attempted to be patient and cordial about this, but this isn’t a matter of opinion. You're wrong about the scaling. Period. [...]
I think you might be right. If I remember correctly, the guys from ATP talked about this in a past episode, and it sounds similar to what you're saying here.
 
We can actually see this at work by taking a screen grab with the OS X utility “Grab"...The screen captures are at twice the resolution of the HiDPI scaling
I don't have a 4k display, but tried this on a 15" retina; that is interesting and I appreciate the explanation. So for an effective 1920x1200 setting it is mapping a 3840x2400 image. But it still has to scale that image down to the screen's physical 2880x1800 pixels and interpolate at a 4:3 ratio.

On a 4K display, whether it’s being scaled by exactly a factor of 2 or by an oddball factor of 1.8, etc., your perception of its sharpness will remain the same...
The rMBP looks noticeably sharper at "best for retina" (1440x900 effectively) or full 2880x1800. On something like an iPhone with higher density any "oddball" scaling isn't noticeable, but the Macbook Pro screen is not of sufficient density to pull that off imperceptibly. A 4k display is lower density than the rMBP so I would expect it to be even less perfect.
 
Idk, I think I'm leaning towards the ASUS PA249Q; not overly expensive and seems OK quality.
meh, 24" and "1080p" are decidedly last gen, hell even 27" 1440p is. Plus 16:10 is an oddball resolution and has no native content (besides games).

If you are looking for a screen in that price range, get a QNIX or Crossover 27" 1440p; be warned though, screens that cheap often exhibit backlight bleed, which may be an issue for you since you want to do color work.

BTW my personal monitor is a QNIX QX2710 [DVI-DL only]
I think you might be right. If I remember correctly, the guys from ATP talked about this in a past episode, and it sounds similar to what you're saying here.
No, that guy has read too much and has no clue what he is talking about. No sane person would write such a complicated and convoluted response.

I don't have a 4k display, but tried this on a 15" retina; that is interesting and I appreciate the explanation. So for an effective 1920x1200 setting it is mapping a 3840x2400 image. But it still has to scale that image down to the screen's physical 2880x1800 pixels and interpolate at a 4:3 ratio.

The rMBP looks noticeably sharper at "best for retina" (1440x900 effectively) or full 2880x1800. On something like an iPhone with higher density any "oddball" scaling isn't noticeable, but the Macbook Pro screen is not of sufficient density to pull that off imperceptibly. A 4k display is lower density than the rMBP so I would expect it to be even less perfect.
Bingo.

For best sharpness and no interpolation the monitor has to be run in either native resolution or HiDPI aka "best for retina."
 
I don't have a 4k display, but tried this on a 15" retina; that is interesting and I appreciate the explanation. So for an effective 1920x1200 setting it is mapping a 3840x2400 image. But it still has to scale that image down to the screen's physical 2880x1800 pixels and interpolate at a 4:3 ratio.
Correct. As far as I can tell, even when using 2880x1800 HiDPI, it's compositing at 5760x3600 and then scaling it down to 2880x1800.

The rMBP looks noticeably sharper at "best for retina" (1440x900 effectively) or full 2880x1800. On something like an iPhone with higher density any "oddball" scaling isn't noticeable, but the Macbook Pro screen is not of sufficient density to pull that off imperceptibly. A 4k display is lower density than the rMBP so I would expect it to be even less perfect.
Right, but let's be careful not to mix up what I was addressing in that quote, which was an image, with the general HiDPI scaling issue. My point there was that if you take a 1080p image and display it full screen on your rMBP at 1440x900 HiDPI, and then you switch to an "oddball" scaling, e.g. 1680x1050, and display the image at full screen, and then you display it full screen on a theoretical native 1080p display of the same size as your rMBP, it should look the same across the board. In the instance of the rMBP scaling, it would look the same because regardless of the HiDPI setting, the image is being stretched across 2880 pixels. Against the native 1080p display, there wouldn't be any way to distinguish a difference between an image that's been scaled 2:1 (or 1.8:1) and then displayed in the exact same physical space as the original image. You might think something is lost in all the up/down scaling, but you can give it a shot for yourself. I tried it at various scalings on my 4K and I very honestly can't tell the difference.

Regarding desktop scaling specifically, it's hard to know without directly comparing it side by side with an equal sized native resolution display. I totally agree that 2:1 scale (best for retina) seems the crispest. Every scale above that gets a little bit less crisp. But that should be obvious, right? I mean, is that the "odd ball" scaling at work, or is that because fewer pixels are making up each screen element? On my 4K, going from an effective resolution of 1920x1080 to 2560x1440 is a really significant change in that respect (a third fewer pixels making up each screen element). Forget the screen scaling - it should look less crisp. (Side note: OS X currently only composites at 2:1. So, if you choose a HiDPI scaling less than 2:1 of native, e.g. 1280x800, then it will only composite to 2560x1600 and scale UP to fit the 2880x1800 - of course it should still look pretty great because that's still a ton of pixels and everything will be huge, but scaling up may put a slight fuzz on things.)
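
A quick sketch of that side note with actual numbers (illustrative arithmetic only):

```python
# Illustrative arithmetic for the side note above: the backing store is 2x the
# chosen HiDPI resolution, so it can come out smaller than, equal to, or larger
# than the panel, which determines whether it gets scaled up or down to fit.
def backing_vs_panel(hidpi_w, hidpi_h, panel_w, panel_h):
    backing_w, backing_h = hidpi_w * 2, hidpi_h * 2
    if backing_w == panel_w:
        direction = "1:1, no scaling"
    elif backing_w < panel_w:
        direction = "scaled UP to fit"
    else:
        direction = "scaled DOWN to fit"
    return (backing_w, backing_h), direction

print(backing_vs_panel(1440, 900, 2880, 1800))   # ((2880, 1800), '1:1, no scaling')
print(backing_vs_panel(1280, 800, 2880, 1800))   # ((2560, 1600), 'scaled UP to fit')
print(backing_vs_panel(1920, 1200, 2880, 1800))  # ((3840, 2400), 'scaled DOWN to fit')
```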

I don't have a 27" native 1440p monitor to compare to the "odd ball" 1440 HiDPI scaling of my 4K. I only have a 30", which is a couple inches wider, so not an exact apples to apples comparison, but the 4K looks noticeably sharper at 2560x1440 than the 30" at 2560x1600. Until there's comparative data (and I did google around searching for it), all I can go by is that on its own, at an "oddball" scaling, it still looks sharper than any large screen display I've used.
 
Correct. As far as I can tell, even when using 2880x1800 HiDPI, it's compositing at 5760x3600 and then scaling it down to 2880x1800.

Do you have any clue what you are saying?

HiDPI is "best for retina" it's 2x resolution scaling or pixel (4x) quadruple. (both are equivalent). 2880x1800 isn't HiDPI, it's native resolution. HiDPI is when Apple makes a 2880x1800 screen look like 1440x900. Cuz 2880 is 2x 1440 and 1800 is 2x 900. So for a 3840 x 2160 aka 4k display only 1920x1080 would be "best for retina" aka HiDPI, 1440p setting would be scaling. Likewise 5k can only use 1440p as HiDPI. Using any resolution besides native or HiDPI makes the screen less sharp. Case closed.

I totally agree that 2:1 scale (best for retina) seems the crispest. Every scale above that gets a little bit less crisp. But that should be obvious, right? I mean, is that the "odd ball" scaling at work, or is that because fewer pixels are making up each screen element?

You can never increase the PPI too much (unless Apple didn't have an appropriate HiDPI setting to accommodate it). If you want to set your 4K display to have the equivalent real estate of a 1440p, then you set it to 2560x1440 HiDPI, and you'll see the equivalent of 1440p, except there will actually be 3840x2160 pixels making up that "1440p", hence extra sharp. I don't want to get off track here - there's a bunch of info out there on how this works.

What you are saying directly contradicts what you said that started this entire debate. Well, not really a debate: you spreading misinformation and other users correcting you. The screen always has the same number of pixels. The reason it becomes less sharp is because of scaling; so no, no "extra sharp."
 
If you're going 27" @ 1440, check out the ViewSonic VP2770-LED. I have two of them and love them.

If you need color matching, grab a Spyder for some basic color calibration - though they come pre-calibrated, I was able to get much better results using the Spyder.

Idk, I think I'm leaning towards the ASUS PA249Q; not overly expensive and seems OK quality.

----------


I think you might be right. If I remember correctly, the guys from ATP talked about this in a past episode, and it sounds similar to what you're saying here.
 
Using any resolution besides native or HiDPI makes the screen less sharp. Case closed.

Not quite closed. In a "retina aware app" only the UI elements will appear less sharp when using the scaled resolutions. In other words, in a photo editing app the image that you're editing will not be affected by the scaling and will not appear less sharp.

http://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/6

Read from "Where things get really exciting is when you have an application that not only handles scaling properly, but also takes advantage of the added resolution"

Additionally, "HiDPI mode" is not restricted to just the native resolution of the panel. You can enable HiDPI resolutions of any resolution that your monitor can display. Of course it won't be as sharp if you were using the native resolution in HiDPI mode, but proclaiming that only the native resolution can be HiDPI is false.

HiDPI is NOT "best for retina". I am not sure where you got that from. Most folks on this forum running 27" and bigger 4K monitors use 1440p HiDPI mode and I don't see many complaints about the loss of clarity going from 1080p HiDPI to 1440p HiDPI.
 
HiDPI is NOT "best for retina". I am not sure where you got that from. Most folks on this forum running 27" and bigger 4K monitors use 1440p HiDPI mode and I don't see many complaints about the loss of clarity going from 1080p HiDPI to 1440p HiDPI.

Technically, it's "Best for Display" that appears with most 4K displays. HiDPI refers to any mode where there are multiple pixels per point. It looks like OSX prefers an even multiple (2X) of pixels per point as the "Best for Display" choice up to a certain monitor size. Interestingly, for the Sharp 32" display you see in the Apple Stores, "Best for Display" is unscaled 1 pixel per point resolution. Yosemite looks pretty good regardless of the settings because much of the UI is resolution independent, so everything gets sharper the more pixels per point used. Fonts are always going to look better whether on Yosemite or Mavericks because they're pretty much always resolution independent. It's only the resolution dependent screen elements that might look blurry at the scaled 1440p setting that a lot of people are using with 27" 4K monitors, or apps that assume that the pixels per point is a round integer multiple and don't handle the scaling correctly.
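
To put rough numbers on the "pixels per point" idea for the examples above (back-of-the-envelope arithmetic only, not anything official):

```python
# Rough "pixels per point" arithmetic for the examples discussed above.
def pixels_per_point(panel_w, panel_h, ui_w, ui_h):
    """Physical pixels that make up one user-space point (as an area ratio)."""
    return (panel_w * panel_h) / (ui_w * ui_h)

print(pixels_per_point(3840, 2160, 1920, 1080))  # 4.0  -> the even 2x "Best for Display" case
print(pixels_per_point(3840, 2160, 2560, 1440))  # 2.25 -> the popular scaled 1440p setting
print(pixels_per_point(3840, 2160, 3840, 2160))  # 1.0  -> unscaled, like the Sharp 32"
```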
 
Technically, it's "Best for Display" that appears with most 4K displays. HiDPI refers to any mode where there are multiple pixels per point. It looks like OSX prefers an even multiple (2X) of pixels per point as the "Best for Display" choice up to a certain monitor size. Interestingly, for the Sharp 32" display you see in the Apple Stores, "Best for Display" is unscaled 1 pixel per point resolution. Yosemite looks pretty good regardless of the settings because much of the UI is resolution independent, so everything gets sharper the more pixels per point used. Fonts are always going to look better whether on Yosemite or Mavericks because they're pretty much always resolution independent. It's only the resolution dependent screen elements that might look blurry at the scaled 1440p setting that a lot of people are using with 27" 4K monitors, or apps that assume that the pixels per point is a round integer multiple and don't handle the scaling correctly.

Good points. Thanks for filling in the details.
 