So 1440p is still the sweet spot. I'm still surprised that even with no AF/AA the framerate drops so fast at 4k... well, it is 4 times the pixels.

How's the fan noise level during gaming, or its rpm?
 
So 1440p is still the sweet spot. I'm still surprised that even with no AF/AA the framerate drops so fast at 4k... well, it is 4 times the pixels.

How's the fan noise level during gaming, or its rpm?

I don't worry about performance loss at 4k, because there's barely any GPU that can handle this resolution well. A GeForce Titan X or maybe a GeForce 1080 surely can. 4k is still very heavy on GPUs! As you can see, the fps difference between 4k and 5k isn't huge, and as you said, 4k is 4 times 1080p!
When I think about my old Win 10 PC with a GeForce 960, the iMac is a lot more powerful.
That PC was fine up to Full HD; anything above was too heavy. And with only 2GB of VRAM, I got micro-stutters with texture settings above Medium.
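For anyone who wants the actual numbers, the raw pixel counts explain most of it. A quick back-of-the-envelope sketch in Swift (assuming "4k" here means 3840x2160 and "5k" means 5120x2880):

[CODE]
// Pixel counts per resolution; shading work scales roughly with pixel count
let p1080 = 1920 * 1080   // ~2.07 million pixels
let p1440 = 2560 * 1440   // ~3.69 million pixels
let p4k   = 3840 * 2160   // ~8.29 million pixels
let p5k   = 5120 * 2880   // ~14.75 million pixels

print(Double(p4k) / Double(p1080))   // 4.0  -> 4k really is 4x the pixels of 1080p
print(Double(p4k) / Double(p1440))   // 2.25 -> why the framerate falls so fast past 1440p
print(Double(p5k) / Double(p4k))     // ~1.78 -> the extra step from 4k to 5k
[/CODE]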

Fan noise varied. On loading screens it was at minimum, while gaming it was around the middle, sometimes at its peak. But I was surprised that it wasn't at peak the whole time. I'm not that sensitive to fan noise anyway, so I didn't notice it until I really tried to hear it, because my speakers were turned on.
I didn't have any fan-monitoring tool, so this is based on my experience under the latest macOS.

PS: I will try Rise of the Tomb Raider later and report back.
 
All these AMD Mac cards are now called Radeon Pro. On Windows, the Radeon Pro brand (formerly "FirePro") is supposed to mean something special vs. the Radeon RX brand. In particular, these cards get "Pro" drivers. Do we get these Pro drivers on Windows with the iMacs?
Actually, I'd rather not have these drivers since I only plan to use Windows for gaming. I want game-oriented drivers.
 
I'm not that sensitive to fan noise anyway, so I didn't notice it until I really tried to hear it, because my speakers were turned on.

I'm using a headset (Sennheiser) all the time. But, for example, the harbor in World of Warships pushes my 775M to maximum temperature and the cooling fan to full speed. :(

Since the 580 has a higher TDP than my 775M, I'm a bit nervous about this. It's strange that the stores here in Austria still have the old iMacs on display, so I can't test the new ones.
 
I'm using a headset (Sennheiser) all the time. But, for example, the harbor in World of Warships pushes my 775M to maximum temperature and the cooling fan to full speed. :(

Since the 580 has a higher TDP than my 775M, I'm a bit nervous about this. It's strange that the stores here in Austria still have the old iMacs on display, so I can't test the new ones.

I would say almost every gamer has sound turned on, and with that the fan really shouldn't bother you. And as I said, it's not always running at its peak. :)

Rise of the Tomb Raider (rounded fps):

1440p, Max Settings, 16xAF, FXAA - 33fps
4k, Max Settings, no AF, no AA - 20fps
5k, Max Settings, no AF, no AA - 15fps
4k, Med Settings, 2xAF, no AA - 28fps

SSAA has the same impact on performance as raising the resolution.
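To illustrate the SSAA point: supersampling shades extra samples per output pixel, so its cost really does track rendering at a higher resolution (a rough sketch; it ignores bandwidth and geometry cost, and assumes 4x SSAA means four samples per pixel):

[CODE]
// Samples shaded per frame at 1440p output
let output1440p = 2560 * 1440

let ssaa2x = output1440p * 2   // ~7.4M samples, close to rendering at 4k (~8.3M)
let ssaa4x = output1440p * 4   // 14,745,600 samples

print(ssaa4x == 5120 * 2880)   // true: 4x SSAA at 1440p shades as many samples as native 5k
[/CODE]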
 
All these AMD Mac cards are now called Radeon Pro. On Windows, the Radeon Pro brand (formerly "FirePro") is supposed to mean something special vs. the Radeon RX brand. In particular, these cards get "Pro" drivers. Do we get these Pro drivers on Windows with the iMacs?
Actually, I'd rather not have these drivers since I only plan to use Windows for gaming. I want game-oriented drivers.
I don't have any AMD GPUs right now, but from what I have witnessed, I think any AMD Radeon Pro GPU has Game Mode in its drivers.
 
I asked because AMD suggests that the Vega Frontier Edition (a "Pro" card) would not be optimised for games. As if the Pro cards weren't good for gaming.
 
I asked because AMD suggests that the Vega Frontier Edition (a "Pro" card) would not be optimised for games. As if the Pro cards weren't good for gaming.
Well, that's because it is a new architecture, and 80% of the driver team's engineers have spent the past 6 months working on drivers for this arch, and they are not ready yet. RX Vega gaming cards are launching at SIGGRAPH, and theoretically they should be ready by that time.

The thing is, NONE of the drivers, professional or gaming, are at this moment extracting all of the capabilities of the Vega GPU.

P.S. This is also the reason why AMD has not sent any GPUs to reviewers, so as not to damage perceptions of it.
 
Why? No matter whether it's a 'majority' of the time or a minority, we should be able to use our Macs for normal stuff as well as play games, whether on macOS or Windows via Boot Camp.
I play many games @ 2560x1440, usually at either High or Ultra settings, on my iMac 5K, and am very pleased with the results. I was a Windows PC user for many years, and have never regretted moving to Mac about 5 years ago. And I play a lot of games as well as work stuff.
Not everyone is fps-obsessed like so many people on MacRumors. I don't really care if I get 100fps; 60 or so is fine by me!

What about 75fps vs 60fps? Deal breaker? :rolleyes:
Well, that's because it is a new architecture, and 80% of the driver team's engineers have spent the past 6 months working on drivers for this arch, and they are not ready yet. RX Vega gaming cards are launching at SIGGRAPH, and theoretically they should be ready by that time.

The thing is, NONE of the drivers, professional or gaming, are at this moment extracting all of the capabilities of the Vega GPU.

P.S. This is also the reason why AMD has not sent any GPUs to reviewers, so as not to damage perceptions of it.

This is why I don't see an iMac Pro in December. :apple:
 
I tried to play Elite Dangerous on Win 8.1 at 1440p on an iMac 5K (Radeon R9 M290X) and it looked very nice, a lot better than on a 1440p screen, because text, the HUD and everything gets scaled up very nicely; it looks nearly like 5K. The rest of the picture does not scale up so well and then looks pretty much like it would on a non-Retina screen.

Of course nothing looks worse, just only partly better. Strangely, I could not select any higher resolution in-game, although in the macOS version you can select everything up to native 5K. I'm not sure, though, whether upscaling from 1440p looks as good there as on Windows.

You can choose the 5K resolution if you are driving the desktop at 5K as well. If you're using Windows at 1440p, games will report 1440p as the highest possible resolution. macOS is aware that you can go up to 5K, regardless of which resolution you actually use.
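You can see this from the system side too: macOS enumerates every mode the panel supports, including 5120x2880, no matter which desktop resolution is currently active. A minimal Swift sketch using Quartz Display Services (games usually query resolutions through higher-level APIs, but the list comes from the same place):

[CODE]
import CoreGraphics

// List every display mode macOS exposes for the main display, including
// low-resolution and HiDPI duplicates. On a 5K iMac this list contains
// 5120x2880 regardless of the desktop resolution currently in use.
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(CGMainDisplayID(), options) as? [CGDisplayMode] {
    for mode in modes {
        print("\(mode.width) x \(mode.height)  (pixels: \(mode.pixelWidth) x \(mode.pixelHeight))")
    }
}
[/CODE]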
 
So how does overclocking work with the Radeon Pro 580? My old late 2012 iMac's GPU can be overclocked quite heavily and still be 100% stable with decent temperatures - i.e. the Nvidia 680MX at +250/+375, making it perform equal to the desktop GTX 680.
 
I'm using a headset (Sennheiser) all the time. But, for example, the harbor in World of Warships pushes my 775M to maximum temperature and the cooling fan to full speed. :(

Since the 580 has a higher TDP than my 775M, I'm a bit nervous about this. It's strange that the stores here in Austria still have the old iMacs on display, so I can't test the new ones.

I had the 775M prior to getting the 580, and in my experience the new 580 runs much cooler than the 775M ever did. I'm not sure if Apple is prioritizing the cooling differently, but in switching from a 2013 i7 with the 775M to a 2017 i7 with the 580, I see about 10° hotter CPU temps on the 2017 but 10° cooler GPU temps.
So how does overclocking work with the Radeon Pro 580? My old late 2012 iMac's GPU can be overclocked quite heavily and still be 100% stable with decent temperatures - i.e. the Nvidia 680MX at +250/+375, making it perform equal to the desktop GTX 680.

Out of curiosity, how are you performing the overclock? I haven't made any attempt at overclocking (ever, on my Macs), but if it's possible, that is definitely interesting. Do you overclock just in Windows or on macOS as well?
 
I overclock my Nvidia 680MX in Windows 7 with MSI Afterburner, and use a static high fan speed with Macs Fan Control. For demanding modern games I disable CPU Turbo Boost to lower the temperature. However, I mostly play STALKER in various forms (like Call of Chernobyl and Lost Alpha), so I don't need a monster GPU.
 
So I had a question about this Mac and using a Dell 4K HDR 27-inch monitor, the UP2718Q. Will this iMac's GPU (the 580) be able to display HDR content on the Dell monitor through macOS, or Windows? There seems to be hardly any information from anyone using this monitor anywhere, let alone with the new 5K iMac.
 
Well maybe not.
[Image: 2sbvekk.jpg]

On the left is the non-Retina resizing handle of the good old QuickTime Player 7. This is captured with my monitor set to its default resolution. The image is obviously zoomed in (with Preview). On the right, we see the same interface element, but with the monitor set to HiDPI mode (and the zoom factor in Preview exactly half of that on the left). See how the pixels are not exactly quadrupled? There is some sort of shadow around the darker pixels. These are not JPEG artefacts and they are not added by Preview. They make the element blurry. So clearly, the window server doesn't simply quadruple the pixels. Who knows what it does to low-resolution games?

Now that I've got my shiny 5K, I could check how games behave with respect to upscaling. :)
The evidence points to exact pixel doubling being applied to a game when its resolution is set to 1440p and the iMac screen is set to its recommended resolution.
Behold:
[Image: 2lt69l3.jpg]


Normally, these characters (from the HL2 Ep1 settings menu) are exactly one pixel thick. Here we see that they are exactly two pixels thick. This is a photo taken with my macro lens.

Now when the monitor is set to 5120*2880 @1X (non-retina mode), you get this:
[Image: 2dtkb7.jpg]


No, the image is not out of focus. It's because some sort of smoothing is applied. It's obvious on a screen capture:

[Image: 211nur.png]


I also tried with the monitor set to a true (non-Retina) 2560*1440. Pixels appear to be doubled like in the first image, but it looks like some sharpening filter is applied, as there are darker pixels around the white text. Weird.

Anyhow, the system appears to apply perfect pixel doubling to games running at exactly half the screen resolution, which makes them look just as sharp as on a 1440p monitor, contrary to what's been suggested here. Well, at least for HL2 Ep1.
That's pretty awesome considering that PC gamers cannot get this, and complain about the lack of integer scaling for games run at exactly half the monitor's resolution. Instead, they get a blurry render like the second image.
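If you want to see the difference in numbers rather than photos, here's a tiny Swift sketch (not how the window server actually works internally, just the two scaling strategies applied to a one-dimensional slice through a 1-pixel-thick white stroke):

[CODE]
// 0 = black, 255 = white; a vertical slice through a 1-pixel-thick stroke
let column = [0, 255, 0, 0]

// Integer scaling: every source pixel becomes exactly two output pixels.
// Only the original values appear, so the edge stays perfectly hard.
let doubled = column.flatMap { [$0, $0] }
print(doubled)        // [0, 0, 255, 255, 0, 0, 0, 0]

// Linear interpolation to 2x: output samples fall between source pixels,
// producing intermediate greys -- the blur/"shadows" seen in the captures.
let interpolated = (0..<(column.count * 2)).map { i -> Int in
    let s = min(Double(column.count - 1), max(0, (Double(i) + 0.5) / 2 - 0.5))
    let lo = Int(s)
    let hi = min(column.count - 1, lo + 1)
    let f = s - Double(lo)
    return Int((Double(column[lo]) * (1 - f) + Double(column[hi]) * f).rounded())
}
print(interpolated)   // [0, 64, 191, 191, 64, 0, 0, 0]
[/CODE]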
 
A few clarifications: simple integer scaling (pixel doubling in both dimensions) also works in windowed mode for some games. I've tested it on Source games and Dear Esther (Unity). In Tomb Raider, some subtle smoothing appears to be applied in windowed mode (pixels are not exactly quadrupled), but fullscreen mode is perfectly sharp.
 
Now that I've got my shiny 5K, I could check how games behave with respect to upscaling. :)
The evidence points to exact pixel doubling being applied to a game when its resolution is set to 1440p and the iMac screen is set to its recommended resolution.
Behold:

Normally, these characters (from the HL2 Ep1 settings menu) are exactly one pixel thick. Here we see that they are exactly two pixels thick. This is a photo taken with my macro lens.

Now when the monitor is set to 5120*2880 @1X (non-retina mode), you get this:

No, the image is not out of focus. It's because some sort of smoothing is applied. It's obvious on a screen capture:


I also tried with the monitor set to a true (non-Retina) 2560*1440. Pixels appear to be doubled like in the first image, but it looks like some sharpening filter is applied, as there are darker pixels around the white text. Weird.

Anyhow, the system appears to apply perfect pixel doubling to games running at exactly half the screen resolution, which makes them look just as sharp as on a 1440p monitor, contrary to what's been suggested here. Well, at least for HL2 Ep1.
That's pretty awesome considering that PC gamers cannot get this, and complain about the lack of integer scaling for games run at exactly half the monitor's resolution. Instead, they get a blurry render like the second image.

Wow, this is great... I really hope that Metal 2 will perform above expectations, and that quality porting will become the standard.
 
Interestingly, games on macOS integrate better with the system than on Windows. I recently tried Doom. It messes with the native display resolution, which complicates switching to other apps while the game is running. It also messes with the size of the pointer.

Back to the iMac: pixels in games run at 1280*720, which is exactly 1/4 of 5K in each dimension, are not simply quadrupled. Some interpolation is applied (maybe to reduce blockiness).
 
I really hope that Metal 2 will perform above expectations, and that quality porting will become the standard.
One thing worries me about Metal 2: the "direct to display" feature. We get the integer scaling here thanks to the window server, which is clever enough to avoid pixel interpolation when it's not needed. But "direct to display" will bypass the window server entirely, so scaling may be done by the monitor hardware, which will most likely do some interpolation, like all monitors do.
BTW, I tried HL2 at 1440p under Win 10 on the iMac. It's not sharp. It's amazing that macOS offers a gaming-oriented feature that PC users have been requesting for years.
 
All these AMD Mac cards are now called Radeon Pro. On Windows, the Radeon Pro brand (formerly "FirePro") is supposed to mean something special vs. the Radeon RX brand. In particular, these cards get "Pro" drivers. Do we get these Pro drivers on Windows with the iMacs?
Actually, I'd rather not have these drivers since I only plan to use Windows for gaming. I want game-oriented drivers.

No.

Software vendors aren't wasting time certifying drivers for Macs running Boot Camp. The user group is too narrow.

If you are a creative professional, then who cares? If your work looks good, it is good.

If you are working for Lockheed Martin designing something where a minute error costs millions of dollars and potentially lives, then you won't be using a Mac anyway.
 
AFAIK, you can install standard Windows drivers under Boot Camp; you don't need specific "Boot Camp drivers". The Radeon Pros in the iMac could use the same drivers as a Radeon Pro WX 7100, which is based on the same GPU.
 
I have not had a gaming computer before (one might argue that my new iMac isn't one either); laptops all the way, and Macs at that. Must say it's really nice to play CoD MW3 on max settings.

Now, as can be deduced from the above, I am not a hardcore gamer, but it seems smooth enough.

Humbly, Ylan
 
I'm in a position where I can return the 2017 iMac and wait for the Pro. I have the funds to afford it, but I don't know if I can justify it. As far as the video card goes, the TFLOPS calculation suggests about a 78% performance boost, and even if it's less, that's still pretty significant. It should be roughly equivalent to the Nvidia 1080 in performance, while the Vega 64 would add about 10% over the baseline. I'd probably stick with the baseline, expecting the upgrade to add even more. I'm running games at half resolution and they're running fine right now. I may get into VR games in the future, but that's up in the air and probably years down the road. I expect the upgrade to last 10 years.
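For reference, here's the arithmetic behind that estimate. The 5.5 TFLOPS figure is the published spec for the iMac's Radeon Pro 580; the Vega figures are the rough numbers the estimate appears to be based on, not confirmed iMac Pro specs:

[CODE]
// Back-of-the-envelope FP32 TFLOPS comparison
let radeonPro580 = 5.5        // published spec for the 2017 iMac's Radeon Pro 580
let vega56Estimate = 9.8      // assumed figure for the baseline iMac Pro GPU
let vega64Estimate = 11.0     // assumed figure for the upgraded GPU

func boost(from old: Double, to new: Double) -> Double {
    return (new / old - 1) * 100
}

print(boost(from: radeonPro580, to: vega56Estimate))     // ~78% over the 580
print(boost(from: vega56Estimate, to: vega64Estimate))   // ~12%, roughly the "10% extra"
[/CODE]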

So the question: 580 vs Vega 56, looking at over $2k more just for the benefit of the Vega at probably ~70% improved performance... worth it? Is it even needed? IDK.

The CPU is also part of the equation. I do encoding, and upgrading to an existing 8-core Xeon would give me about a 16% performance boost.
However, single-core operations would likely hurt pretty badly on a Xeon, as they run slower with more cores; about 70% of the speed of an i7 when benchmarked. What impact would that have on general use? And that's based on an older Xeon, as we don't know exactly what will be in the iMac Pro.

Niceties: grey in color, better cooling.
I'd appreciate any thoughts.
 