Not all applications support HiDPI, and those that don't will not look better on your "looks like 1440p" iMac than on a screen with native 1440p. Like I said, look to these forums and others for evidence of this.
We're fully aware that some apps do not support HiDPI and have their graphics upscaled, so they look blurry compared to native apps. But the question is: do they look worse on a 5K monitor than on a 1440p monitor? There is no technical reason for that unless Apple messes with the scaling.
 
I think there is some talking at cross purposes here!
The fact is that if you set your game to 1440p, it will look identical on a 5k iMac as it does on a 1440p monitor in terms of clarity and pixel appearance.
If you set it to some other res, then you will see some interpolation artifacts in comparison with a native monitor of the same res. You will have the same issue if you set a game to 1080 on a 1440 monitor.
 
The fact is that if you set your game to 1440p, it will look identical on a 5k iMac as it does on a 1440p monitor in terms of clarity and pixel appearance.
And this should apply to non-retina windowed apps, which should have their pixels exactly quadrupled. Yet I've read that these apps look worse on a retina display than on a non-retina one. Don't ask me why; I've never compared them myself, but I've read several reports about it.
 
You should be able to sustain 60 FPS at 3840x2160 in Overwatch on High settings. That is actually a weaker score than I thought.

? I have a GTX 1070 that doesn't even hit 60fps in Overwatch at 4k; it hovers around 50-55fps. The 580, which is actually slower than a 1070, definitely wouldn't be able to sustain that unless you bring down the filters.
 
? I have a GTX 1070 that doesn't even hit 60fps in Overwatch at 4k; it hovers around 50-55fps. The 580, which is actually slower than a 1070, definitely wouldn't be able to sustain that unless you bring down the filters.
In what preset?

The RX 480 was able to maintain 60 FPS at the High setting. You have 5 presets: Low, Medium, High, Ultra, and Epic.
 
I think there is some talking at cross purposes here!
The fact is that if you set your game to 1440p, it will look identical on a 5k iMac as it does on a 1440p monitor in terms of clarity and pixel appearance.
Well maybe not.
[Image: 2sbvekk.jpg]

On the left is the non-retina resizing handle of the good old QuickTime Player 7. This was captured with my monitor set to its default resolution. The image is obviously zoomed in (with Preview). On the right we see the same interface element, but with the monitor set to HiDPI mode (and the zoom factor in Preview exactly half of that on the left). See how the pixels are not exactly quadrupled? There is a sort of shadow around the darker pixels. These are not JPEG artefacts, and they are not added by Preview. They make the element blurry. So clearly the window server doesn't simply quadruple the pixels. Who knows what it does to low-resolution games?
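
For anyone curious why even a clean 2x factor can blur, here is a minimal numpy sketch of my own (an illustration, not Apple's actual WindowServer code) contrasting true pixel quadrupling with a bilinear 2x upscale under the common half-pixel sampling convention. The bilinear path produces exactly the kind of grey shadows around dark pixels seen in the capture above:

```python
import numpy as np

def nearest_2x(img):
    # True pixel quadrupling: each source pixel becomes a 2x2 block.
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def bilinear_2x(img):
    # Bilinear 2x upscale with the usual half-pixel sample alignment.
    # Even at an exact integer factor, output samples land between
    # source pixels, so neighbouring values get blended together.
    h, w = img.shape
    ys = (np.arange(2 * h) + 0.5) / 2.0 - 0.5   # source-space y coords
    xs = (np.arange(2 * w) + 0.5) / 2.0 - 0.5   # source-space x coords
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - fx) + img[np.ix_(y0, x1)] * fx
    bot = img[np.ix_(y1, x0)] * (1 - fx) + img[np.ix_(y1, x1)] * fx
    return top * (1 - fy) + bot * fy

# A single black pixel on white: nearest-neighbour keeps hard edges,
# bilinear smears grey "shadows" into the neighbours.
src = np.full((3, 3), 255.0)
src[1, 1] = 0.0
print(nearest_2x(src).astype(int))
print(bilinear_2x(src).round().astype(int))
```

Running it, the nearest-neighbour output keeps a hard black 2x2 block, while in the bilinear output the black pixel itself lightens to around 112 and bleeds grey values of roughly 207 into the surrounding pixels, much like the halo in the screenshot.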
 
Okay, that looks really bad. At a 1:2 resolution ratio I expected linear scaling without any interpolation. :( The question is: is this a problem in the LCD bridge or in the GPU driver?
 
Is this convoluted-sounding expression a North American thing? What's wrong with 'that itself'?! Am I missing something?

It's an idiom. I'm unaware of its origin; given how long it's been used, I'd imagine it's translated Latin, given the convoluted nature you mention. My take on it is that it provides emphasis when you want to specify that something should be considered on its own. Not knowing it's an idiom leaves it virtually meaningless and redundant.
 
It pushes non-proxy 4K from an FS7 at full resolution in Premiere. A fully loaded 2013 Mac Pro couldn't do that.
 
Okay, that looks really bad. At a 1:2 resolution ratio I expected linear scaling without any interpolation. :( The question is: is this a problem in the LCD bridge or in the GPU driver?

There's no way to rule out OS-level blending, since he used HiDPI mode to induce the artifacts. I'd like to see someone instead change the resolution the GPU is outputting to 1440p and capture some similar images.
 
It's most likely that Vega will be the GPU in the next iMac update. Looks like it should be 6 cores also.

Without any changes to the cooling system like in the iMac Pro? Hardly. Vega plays in another TDP league, like the GTX 1080/1080 Ti. Besides, such a powerhouse was never part of the iMac philosophy.
 
Well maybe not.
[Image: 2sbvekk.jpg]

On the left is the non-retina resizing handle of the good old QuickTime Player 7. This was captured with my monitor set to its default resolution. The image is obviously zoomed in (with Preview). On the right we see the same interface element, but with the monitor set to HiDPI mode (and the zoom factor in Preview exactly half of that on the left). See how the pixels are not exactly quadrupled? There is a sort of shadow around the darker pixels. These are not JPEG artefacts, and they are not added by Preview. They make the element blurry. So clearly the window server doesn't simply quadruple the pixels. Who knows what it does to low-resolution games?

On Monday I'll run a few games and see if I can get some screenshots on the 2017 vs the 2013, to see if the images are identical or if Apple is playing tricks with the scaling. I'll probably take images of textual menu buttons at 1440p on both versions too, since text should show the differences most obviously.
 
*EDIT*
A cursory search on Google suggests that LCD monitors don't do integer scaling anyway (with some rare exceptions?). They perform some bilinear filtering or other tricks that are necessary at non-integer scaling factors, but these are applied at any factor.
https://community.amd.com/thread/195561
https://hardforum.com/threads/playing-at-1080p-on-a-4k-monitor.1868824/

IOW, if you want the best picture at 1440p, don't use the built-in display.
Unless the 5K iMac display pipeline is more intelligent and does nearest-neighbour scaling when set to 1440p. But I doubt it, considering how the window server behaves. Checking it would require taking high-resolution photos of the display from close up (with a macro lens) to see if the pixels are interpolated like in the screen capture I posted.
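
If those threads are right that scalers run the same bilinear filter at every factor, the blur at exactly 2x has a simple fixed form. Assuming the usual half-pixel sampling convention (my assumption, not something stated in those links), every output sample lands 0.25px from its nearest source pixel, so the four bilinear tap weights are the same constant blur kernel everywhere, rather than pure pixel replication:

```python
# At an exact 2x factor with half-pixel alignment, every output sample is
# offset 0.25px from the nearest source pixel, so bilinear filtering applies
# the same fixed blend everywhere instead of plain pixel replication.
fy = fx = 0.25  # fractional offset of every output sample
weights = {
    "nearest pixel":        (1 - fy) * (1 - fx),  # 0.5625
    "horizontal neighbour": (1 - fy) * fx,        # 0.1875
    "vertical neighbour":   fy * (1 - fx),        # 0.1875
    "diagonal neighbour":   fy * fx,              # 0.0625
}
print(weights)
```

In other words, roughly 44% of every output pixel comes from somewhere other than the pixel it "should" copy, which would match the shadowy halo in the capture above.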
 
I think there is some talking at cross purposes here!
The fact is that if you set your game to 1440p, it will look identical on a 5k iMac as it does on a 1440p monitor in terms of clarity and pixel appearance.
If you set it to some other res, then you will see some interpolation artifacts in comparison with a native monitor of the same res. You will have the same issue if you set a game to 1080 on a 1440 monitor.

Theoretically that's true. In practice it's not true. I've played Diablo 3 at 1440p on my 1440p 27" iMac and it looks considerably sharper than 1440p on my 5K iMac.
 
In what preset?

The RX 480 was able to maintain 60 FPS at the High setting. You have 5 presets: Low, Medium, High, Ultra, and Epic.

There are 5 presets? Wow, I never knew that. Obvious sarcasm aside, I should've made myself clearer, as it seems there was some kind of disconnect here. These numbers are on the Ultra preset. If you prefer 4k at High, that's great. I myself would prefer to drop the resolution, add the highest filters possible, and, if I have wiggle room, use the headroom to increase the SMAA.
 
Is this sort of blurry scaling similar to the effect produced by running a 5k display in a "low resolution" mode?

[Image: Screen Shot 19.png]
 
Supposedly. But low-resolution mode will always be less sharp than retina because of the lower effective pixel density. Is it made even worse by a particular scaling algorithm? That's the important question.
 
I think there is some talking at cross purposes here!
The fact is that if you set your game to 1440p, it will look identical on a 5k iMac as it does on a 1440p monitor in terms of clarity and pixel appearance.
If you set it to some other res, then you will see some interpolation artifacts in comparison with a native monitor of the same res. You will have the same issue if you set a game to 1080 on a 1440 monitor.

Unfortunately, pixel doubling doesn't work with 4K or 5K displays: https://gaming.stackexchange.com/questions/232957/1080p-on-4k-display

The best picture is still at the native resolution...

I experienced the same with my 4K display. It will not just double the pixels (from 1080p), but interpolate.
 
Without any changes to the cooling system like in the iMac Pro - hardly. Vega plays in another TDP league, like the GTX 1080/1080Ti. Beside such powerhorse was never part of the iMac philosophie.
I've read that AMD Vega is coming to notebooks; they didn't say they're making just one or two cards to compete with the 1080.
 
Well, Vega RX will be the gaming version of the Vega GPU, and AMD says it will be competitive with the GTX 1080 (Ti); it's rumored to be around $500. There is an unidentified SiSoft Sandra score for an 8GB card that fits the description of what Vega RX should look like; it was 35% faster than the GTX 1080 and a bit slower than a GTX 1080 Ti.

I would guess that Vega RX Mobile will be the same GPU, just under-clocked to lower the TDP (as AMD is doing with the Radeon Pros in the iMacs and nVidia will be doing with their Max-Q GPUs).
 
Mac and PC games on my 5K display look really good to me at 2560x1440. And if you're going to spend the majority of your time gaming, you should be on a Windows PC to begin with.
Why? No matter whether it's a 'majority' of time or a minority, we should be able to use our Macs for normal stuff as well as to play games, whether on macOS or Windows via Boot Camp.
I play many games @ 2560x1440, usually at either High or Ultra settings, on my iMac 5k, and am very pleased with the results. I was a Windows PC user for many years and have never regretted moving to Mac about 5 years ago. And I play a lot of games as well as doing work stuff.
Not everyone is fps-obsessed like so many people on MacRumors. I don't really care if I get 100fps; 60 or so is fine by me!
 
Why? No matter whether it's a 'majority' of time or a minority, we should be able to use our Macs for normal stuff as well as to play games, whether on macOS or Windows via Boot Camp.
I play many games @ 2560x1440, usually at either High or Ultra settings, on my iMac 5k, and am very pleased with the results. I was a Windows PC user for many years and have never regretted moving to Mac about 5 years ago. And I play a lot of games as well as doing work stuff.
Not everyone is fps-obsessed like so many people on MacRumors. I don't really care if I get 100fps; 60 or so is fine by me!

What spec is your iMac, and what games do you play, at what sort of settings?
 
iMac spec in sig.
Games are too many to list, but recent examples that come to mind are:
DCS World, F1 2016, Deus Ex: MD, rFactor 2, Fallout 4, Rise of the Tomb Raider, Elite Dangerous.
All at least at high @ 2560x1440.
 
I tried Forza Horizon 3 on my iMac with 580 Pro (Win 10):
1440p, Max Settings, 2xAF, 2xAA - 60fps
4k, Max Settings, no AF, no AA - 31fps
5k, Max Settings, no AF, no AA - 28fps

1440p is the way to go; it's the best compromise between performance and visual quality. That said, 5k is on another level, and it looked mind-blowing when I tried it. Wow!

Yes, there are more demanding games out there, but this shows how good this iMac is for gaming. So I would say this is the first iMac you can game on. Sure, a self-built desktop is cheaper and perhaps faster, but it's still great that gaming is no longer a no-go! Very happy with this!
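
For perspective, here's the pixel arithmetic behind those frame rates (a quick sketch of my own; only the resolutions come from the benchmark above):

```python
# Raw pixel counts per frame, relative to 1440p. Real frame rates also
# depend on settings (the 1440p run above used 2xAF/2xAA) and CPU limits,
# so fps won't scale exactly with pixel count.
base = 2560 * 1440
for name, w, h in [("1440p", 2560, 1440), ("4k", 3840, 2160), ("5k", 5120, 2880)]:
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP per frame ({px / base:.2f}x 1440p)")
# 1440p:  3.69 MP (1.00x)
# 4k:     8.29 MP (2.25x)
# 5k:    14.75 MP (4.00x)
```

5k pushes exactly 4x the pixels of 1440p, so it's no surprise that 1440p ends up being the sweet spot.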
 