Nobody knows for sure, but I think the fact that Polaris has been available for almost a year now (with lower-TDP variants available since fall 2016) and yet we haven't seen a refresh indicates that Apple has bigger and better plans for the next iMac's GPU (and hopefully the CPU as well).

An interesting note: the Fiji-based R9 Nano, manufactured on a 28nm process, offers performance very close to a GTX 980 Ti/GTX 1070/R9 Fury X with a TDP of only 175W (calling into question Apple's excuse that it couldn't fit better GPUs in the nMP).
I can't see any reason why the move to 14nm wouldn't enable AMD to create a comparable Vega card, with AT LEAST R9 Nano-level performance, that stays within a 120W TDP. Of course, whether or not they actually CHOOSE to design such a chip is a completely different matter.

Considering the above, and Apple's recent comments on GPUs, I'm personally hopeful that the next iMac will take a much-needed step up in GPU performance.
If the iMac Pro has a display with 8K resolution, then the only way to power this monster will be the Vega architecture. Nothing currently on the market can drive that resolution within a reasonable thermal envelope. Volta and Vega will. However, I think for both architectures we are looking at a 200W TDP for reasonable performance at 8K.
 
If the dinky chip in the MacBook Pro can drive two 5K displays, then I'm sure any modern midrange chip could drive an 8K display. Although I would love to see Apple bump up the thermal tolerances in the iMac to take something like Vega.
 
It's not about driving the display, but about delivering performance that can handle actual work done at 8K resolution.
 
Does anyone think that the iMac "Pro" will be a redesign? Personally I don't think so; I'm thinking it will just be a 5K iMac with higher-end configuration options, maybe a Touch Bar on the keyboard as well?
 
Since Nvidia released the GTX 1080 Max-Q with a 110W TDP, I wonder if Apple could/would want to make it a BTO option in the next iMac.
 
If the iMac Pro has a display with 8K resolution, then the only way to power this monster will be the Vega architecture. Nothing currently on the market can drive that resolution within a reasonable thermal envelope. Volta and Vega will. However, I think for both architectures we are looking at a 200W TDP for reasonable performance at 8K.

I'm sorry, I'm really confused about how the above relates to my post that you quoted. I was extrapolating from the R9 Nano (based on 28nm Fiji) to point out that it SHOULD be possible to build a Vega card offering excellent performance within 120W. I don't think 200W is necessary to power 8K (outside of gaming). That said...

It's not about driving the display, but about delivering performance that can handle actual work done at 8K resolution.

The biggest hurdles with current cards are interface-related, as 8K requires DisplayPort 1.4 or HDMI 2.1 to display at 60Hz. That said, let's not forget that Apple built its own customized timing controller for the original 5K iMac because DisplayPort 1.2 (which was all that was available at the time) couldn't handle the bandwidth. If Apple decided it really wanted to go for 8K, it certainly has the engineering talent to do so, and I believe DP 1.4 is already on the market, so this actually shouldn't even be an issue. As for performance... please don't forget that in 2015 Apple decided to sell a 5K iMac with the R9 M380...
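To put some rough numbers on the bandwidth point, here's a quick back-of-the-envelope sketch (my own math, ignoring blanking overhead, so real requirements run a few percent higher; the payload rates are the usual published figures):

[CODE]
# Uncompressed bandwidth = width * height * refresh rate * bits per pixel.
# Blanking intervals are ignored, so real requirements are a bit higher still.

def gbps(width, height, hz, bpp):
    """Uncompressed video bandwidth in Gbit/s."""
    return width * height * hz * bpp / 1e9

DP12_PAYLOAD = 17.28  # DP 1.2 HBR2 effective payload in Gbit/s (after 8b/10b)
DP14_PAYLOAD = 25.92  # DP 1.4 HBR3 effective payload in Gbit/s (after 8b/10b)

print(f"5K60  8bpc: {gbps(5120, 2880, 60, 24):.1f} Gbit/s vs DP 1.2's {DP12_PAYLOAD}")
print(f"8K60  8bpc: {gbps(7680, 4320, 60, 24):.1f} Gbit/s vs DP 1.4's {DP14_PAYLOAD}")
print(f"8K60 10bpc: {gbps(7680, 4320, 60, 30):.1f} Gbit/s vs DP 1.4's {DP14_PAYLOAD}")

# 5K60 (21.2 Gbit/s) already overflows DP 1.2's payload, which is exactly why
# Apple needed its custom timing controller. 8K60 (47.8-59.7 Gbit/s) overflows
# even DP 1.4, hence compression (roughly 2.3:1 here) or multiple cables.
[/CODE]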

Personally, I really hope the next iMac isn't 8K (and I don't think it will be). Not only is it too soon (seriously, who needs 8K of resolution, especially the way macOS uses it?), but even at 32" the increase in pixel density over the 5K iMac really doesn't make sense: 5K is a noticeable step up from 4K at 27", but at 32" 8K just seems excessive, and it would hurt performance for very little gain.

GTX 1080 Max-Q
This is... really cool! This is what I want to see more of from the jump to 14/16nm!
I know I'm beating a dead horse, but if AMD could deliver the Nano with a 175W TDP on 28nm in 2015, the future should be bright indeed for laptops and AIOs.
 
Since Nvidia released the GTX 1080 Max-Q with a 110W TDP, I wonder if Apple could/would want to make it a BTO option in the next iMac.

Yeah, this is great to see! This would be great in an iMac! That's got to be the best performance per watt of the 14/16nm GPU generation.

The biggest hurdles with current cards are interface-related, as 8K requires DisplayPort 1.4 or HDMI 2.1 to display at 60Hz. That said, let's not forget that Apple built its own customized timing controller for the original 5K iMac because DisplayPort 1.2 (which was all that was available at the time) couldn't handle the bandwidth. If Apple decided it really wanted to go for 8K, it certainly has the engineering talent to do so, and I believe DP 1.4 is already on the market, so this actually shouldn't even be an issue. As for performance... please don't forget that in 2015 Apple decided to sell a 5K iMac with the R9 M380...

Personally, I really hope the next iMac isn't 8K (and I don't think it will be). Not only is it too soon (seriously, who needs 8K of resolution, especially the way macOS uses it?), but even at 32" the increase in pixel density over the 5K iMac really doesn't make sense: 5K is a noticeable step up from 4K at 27", but at 32" 8K just seems excessive, and it would hurt performance for very little gain.

Another thing to remember here is that Apple doesn't usually use these TV standards for its monitors, so there is no reason to assume it will use 8K. On the desktop it has stuck to a target of 220 PPI for a Retina screen, so if it decides it wants a 32" iMac, it could custom-order a display with the right PPI.

DisplayPort 1.4 has the problem that it uses compression to achieve its full bandwidth. Supposedly it's lossless, but that still may not fly with professionals. Obviously the display standard doesn't matter for the iMac, but they did promise a professional display for the upcoming Mac Pro.
 
Another thing to remember here is that Apple doesn't usually use these TV standards for its monitors, so there is no reason to assume it will use 8K. On the desktop it has stuck to a target of 220 PPI for a Retina screen, so if it decides it wants a 32" iMac, it could custom-order a display with the right PPI.

Yeah, this sums up my feelings on this issue pretty well. I could see Apple doing a 6K 32" screen (~214 PPI), although honestly 5K on a 32" screen would be like 4K on a 27", AKA not bad at all. Or I could see them doing a high-res ultra-wide screen at 29-30" while keeping the resolution at around 5K.
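If anyone wants to check the PPI math, here's a quick sketch. The 6K pixel count (6144x3456) is just my assumption for a hypothetical 16:9 panel, so the exact figures shift a bit depending on what "6K" actually turns out to mean:

[CODE]
import math

# PPI for a flat panel: diagonal resolution in pixels / diagonal size in inches.
def ppi(h_px, v_px, diagonal_in):
    return math.hypot(h_px, v_px) / diagonal_in

print(f'27" 5K: {ppi(5120, 2880, 27):.0f} PPI')  # current Retina iMac, ~218
print(f'32" 6K: {ppi(6144, 3456, 32):.0f} PPI')  # hypothetical panel, ~220
print(f'32" 5K: {ppi(5120, 2880, 32):.0f} PPI')  # ~184...
print(f'27" 4K: {ppi(3840, 2160, 27):.0f} PPI')  # ...vs ~163, so "not bad at all"
print(f'32" 8K: {ppi(7680, 4320, 32):.0f} PPI')  # ~275, way past the 220 target
[/CODE]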

That said, there is something to be said for mass production, and Apple can save a lot of money by using the same panel other manufacturers are using (even if the quantities aren't that high, as in the case of 5K displays). So I'm still not really feeling a higher-res iMac until we start hearing more about higher-res displays. As I've already said, what I want is OLED (and ultra-wide) :)

DisplayPort 1.4 has the problem that it uses compression to achieve its full bandwidth. Supposedly it's lossless, but that still may not fly with professionals. Obviously the display standard doesn't matter for the iMac, but they did promise a professional display for the upcoming Mac Pro.
I don't necessarily see this as being a big hindrance. At the end of the day, the compression either hurts your workflow or it doesn't. If it's truly lossless, then it's lossless, and while some may doubt the claim at first, people will quickly jump on the bandwagon once it's been proven. If it's "lossless to the human eye," then you'll get more resistance, but I'd imagine most pros will still get over it once they realize it doesn't actually impact their workflow (and by gen 2 the bandwidth issues should be sorted out by DisplayPort 1.4b/1.5/whatever comes next). If it's faux-lossless (AKA noticeably lossy but advertised as lossless), then depending on what's being sacrificed, that could certainly be a non-starter for photo/video work.
That said (yes, I use those two words too much), just a few short years ago a lot of people used to complain that no pro could possibly edit photos or do color-sensitive work on a glossy screen, and yet here we are with no matte screens across the lineup and very little complaining.
 
I suppose they could up the video-card ante (for gaming??), but apart from that my dad's iMac 5K is a monster.

Love it, really don't see how it could get much faster.
 
The video signal would have to be lossless, guys, or you'd see visual artifacts present themselves. You can't have a lossy video signal/stream; you can have a compressed video feed that you're displaying on the screen, but you absolutely cannot be missing any of the info needed to drive the display.

The discussion of pros being wary of "faux lossless" is just naive. The compression will help drive massive resolutions without having to re-engineer display connectors every 2 years to add more bandwidth using more physical lanes.
 
I suppose they could up the video-card ante (for gaming??), but apart from that my dad's iMac 5K is a monster.

Love it, really don't see how it could get much faster.

What's the screen like with normal content? Because it's a 5K screen, I just wondered (I'm using a 2012 27" iMac, by the way).
 
The video signal would have to be lossless, guys, or you'd see visual artifacts present themselves. You can't have a lossy video signal/stream; you can have a compressed video feed that you're displaying on the screen, but you absolutely cannot be missing any of the info needed to drive the display.

The discussion of pros being wary of "faux lossless" is just naive. The compression will help drive massive resolutions without having to re-engineer display connectors every 2 years to add more bandwidth using more physical lanes.
I don't think anyone's being naive here; obtuse, maybe, but not naive. We probably all need to define our terms better. When I was talking about lossless vs. "faux-lossless," I wasn't talking about the kind of lossy compression you'd see in a poorly encoded or bitrate-starved H.264 video, which can result in obvious artifacts like macroblocking or ghosting.

I'm talking about the kind of lossy compression that can be passed off as "lossless" to general users because 99% of the general populace couldn't tell the difference in a blind test: things like compression of the color channels, which reduces the tonality of the image displayed, saving bandwidth while appearing the same to most people.
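To make that concrete, here's a toy sketch of the idea: plain 4:2:0-style chroma averaging, NOT the actual compression scheme DisplayPort 1.4 uses, just an illustration of how per-pixel color detail can quietly disappear while the image still "looks" the same.

[CODE]
# Toy illustration of "visually lossless" color compression, in the spirit of
# 4:2:0 chroma subsampling. Four neighboring pixels keep their own luma (Y)
# but end up sharing one averaged pair of chroma values (Cb, Cr).

def subsample_2x2(pixels):
    """pixels: four (Y, Cb, Cr) tuples forming a 2x2 block."""
    avg_cb = sum(p[1] for p in pixels) / 4  # one shared blue-difference value
    avg_cr = sum(p[2] for p in pixels) / 4  # one shared red-difference value
    return [(y, avg_cb, avg_cr) for y, _, _ in pixels]

block = [(180, 90, 240), (181, 95, 235), (179, 88, 242), (182, 93, 238)]
print(subsample_2x2(block))
# Every pixel comes back with Cb=91.5, Cr=238.75. Close enough that most eyes
# never notice, but the subtle per-pixel tonality is gone for good.
[/CODE]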

While such techniques have the benefit of allowing us to achieve higher resolutions with less bandwidth, they could, depending on the design, be a serious issue for people who work in color/tonal-sensitive environments (photography/videography/design work), where what's displayed on screen needs to match what comes out in print or on other screens.

This is going to become especially important as more devices, computer displays, and TVs begin to adopt wider color gamuts (DCI-P3) and HDR. If a color channel appears to be clipping on my screen (or exhibits a color shift) but really isn't, that'd be a serious problem.
 
So what's the consensus/guess: are we going to see something in May?
It's been a very quiet spring, and WWDC is a long way off to go without unveiling any new hardware.
Plus, that's really a software event anyway.
 
No, I would say October for Macs, and the iPhone and iPad Pro along with a new Apple Watch in September.
 
I agree with this, unfortunately, as I would love to see a new iMac.

June - MacBook and possible MBP update [30% chance I reckon]
September - iPhone & iPad Pro
October - iMac & MBP [if not updated in June]

I am in the market for a decent desktop right now, and all this waiting has me stumped on what to do.
I'm currently deciding between a PC workstation, a 15" MacBook Pro, and a Surface Studio, all very different machines. I would have just bought the iMac instantly, but I need a better GPU, and figured that with the MBP I can at least go eGPU.
 
I've given up trying to predict Apple releases. I never thought the Mac Pro would be neglected for so long. The iMac and MacBook Pro were previously updated at least once a year, yet the last MacBook Pro update took 1.5 years, and the iMac hasn't been updated in the same amount of time.

A new iMac could be announced tomorrow or it could be next year. Who knows...
 
Totally agree, and I am wrong every time (or at least have been for the last few years), so don't pay much attention to me!
Just my guesses...
 
A new iMac could be announced tomorrow or it could be next year. Who knows...

Apple did confirm during that Mac Pro press meeting that new "Pro"-centered iMacs will come this year. As for the other Macs, such as the 12" MacBook, it's not known when they will be out, but my guess is an October event, OR they could be waiting on the new Apple campus's Steve Jobs Theater to be up and running and then announce a whole load of new products.
I am in the market for a decent desktop right now, and all this waiting has me stumped on what to do.
I'm currently deciding between a PC workstation, a 15" MacBook Pro, and a Surface Studio, all very different machines. I would have just bought the iMac instantly, but I need a better GPU, and figured that with the MBP I can at least go eGPU.

In that case, personally I would go with a MacBook Pro if the 5K iMac isn't enough. It also depends on your feelings toward Windows; personally I much prefer macOS and Apple's hardware design.
 
I wonder if "Pro"-centered means we'll get Target Display Mode back. Not a huge deal, but nice to have.

I think the "Pro" is going to be higher specs in configurations, i think there will be the normal 5K iMac for consumers and then a higher spec available to build to order configuration.
 