Wow - you should let Apple know about that. They've sold loads of them already, but maybe they could do a recall. ;)
They don't care. "Let it boil at 100C," they say... so it does. :apple:

The i5 3.3GHz consumes 26W less than the i7 4.0GHz at 100% utilization.

I wouldn't be surprised if Apple has calculated that these chips can do 100C for 37 months... and then die. Just enough to outlast AppleCare. Then they can sell these suckers their next computer, and there will be fewer machines on the second-hand market.
 
Thanks for the excellent link. That's exactly what we were looking for, a decent benchmark of the 395X vs the 295X.
It shows barely a 5% increase in graphics benchmarks, and as a result I for one will assume that increase is purely down to the 395X being in the newer iMac with the 6th-generation i7.
Nope, the core clock for the M395X is 50 MHz higher than the M295X's: 850 MHz vs 900 MHz.
 
Thanks for the excellent link. That's exactly what we were looking for, a decent benchmark of the 395X vs the 295X.
It shows barely a 5% increase in graphics benchmarks, and as a result I for one will assume that increase is purely down to the 395X being in the newer iMac with the 6th-generation i7.

You're welcome.
 
It would be really nice to have 3DMark 11 scores under Windows 8 or Windows 10.
That's the best way to compare with other GPUs. :)
 
Nope, the core clock for the M395X is 50 MHz higher than the M295X's: 850 MHz vs 900 MHz.
Well, I thought it was impossible for something to both suck and blow at the same time, but the 395 obviously does.
If they had just kept the old 295 graphics chip, I would expect a bigger increase than 5% from the two generations of processor upgrade alone, going from Haswell, skipping Broadwell, straight to Skylake. Intel claims about a 20% increase in chip performance, which should translate to an overall 10% system increase on an iMac, a tad less on a regular PC. So a meagre 5% increase suggests to me there was no GPU improvement whatsoever, in fact maybe a very slight degradation.
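To make that 20%-chip-to-10%-system step concrete, here's a rough Amdahl's-law sketch; the assumption that roughly half of a typical workload is CPU-bound is mine, purely for illustration:

```python
# Rough Amdahl's-law estimate: how a 20% faster CPU translates to
# overall system speedup when only part of the workload is CPU-bound.
# The 50% CPU-bound fraction is an assumption for illustration.
cpu_speedup = 1.20      # Intel's claimed Haswell -> Skylake gain
cpu_fraction = 0.50     # assumed share of runtime limited by the CPU

system_speedup = 1 / ((1 - cpu_fraction) + cpu_fraction / cpu_speedup)
print(f"Overall speedup: {(system_speedup - 1) * 100:.1f}%")  # ~9.1%
```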
 
They don't care. "Let it boil at 100C," they say... so it does. :apple:

The i5 3.3GHz consumes 26W less than the i7 4.0GHz at 100% utilization.

I wouldn't be surprised if Apple has calculated that these chips can do 100C for 37 months... and then die. Just enough to outlast AppleCare. Then they can sell these suckers their next computer, and there will be fewer machines on the second-hand market.
So, going with the i5, Apple could've cut power usage by more than 50 watts, run a lot cooler, and at the same time blown away the performance of the 3-year-old M395X, if they had gone with the much more modern GTX 980M that uses 25 watts less power than the AMD chip?
 
So, going with the i5, Apple could've cut power usage by more than 50 watts, run a lot cooler, and at the same time blown away the performance of the 3-year-old M395X, if they had gone with the much more modern GTX 980M that uses 25 watts less power than the AMD chip?


Not sure where you got the '25 watts less' figure from... from everything I've read, the 980M draws more power, load for load, than the AMD chips. Though it does produce more performance for the power it draws.

Problem is, when it comes to pushing a 5K screen, the 980M is worse than the AMD cards, which I would assume is why Apple chose AMD. They don't particularly market their iMacs to gamers.
 
They don't care. "Let it boil at 100C," they say... so it does. :apple:

The i5 3.3GHz consumes 26W less than the i7 4.0GHz at 100% utilization.

I wouldn't be surprised if Apple has calculated that these chips can do 100C for 37 months... and then die. Just enough to outlast AppleCare. Then they can sell these suckers their next computer, and there will be fewer machines on the second-hand market.
I suspect this is closer to the truth than many might imagine; building for longevity and reliability, I fear, no longer appears to be a priority for many of the world's manufacturers, particularly over the past few years. With quality control lacking and value for money fiercely eroded, it will be little surprise as the integrity of reputations slowly declines in the pursuit of greed. How sad if Apple are to join them.
 
Not sure where you got the '25 watts less' figure from... from everything I've read, the 980M draws more power, load for load, than the AMD chips. Though it does produce more performance for the power it draws.

Problem is, when it comes to pushing a 5K screen, the 980M is worse than the AMD cards, which I would assume is why Apple chose AMD. They don't particularly market their iMacs to gamers.
This is right.

I just dug around a little. See this link, for example; it clearly says that the GTX 980 for laptops needs over 160 watts, up to 200, so it is way beyond what the iMac is built for. And here, this article says that the max resolution is 3840x2160. It seems we have these reasons why Apple went with AMD this time.
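For what it's worth, the 5K-versus-4K gap is easy to quantify; a quick sketch of the pixel math, nothing here beyond simple arithmetic:

```python
# Pixel counts: the 5K iMac panel vs the 3840x2160 maximum the
# article quotes for the GTX 980 (Laptop).
five_k = 5120 * 2880    # 14,745,600 pixels
four_k = 3840 * 2160    #  8,294,400 pixels
print(f"5K has {five_k / four_k:.2f}x the pixels of 4K")  # ~1.78x
```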
 
The M395X should definitely be Tonga (GCN 1.2) with 2048 cores (source: AMD website).
Is the M395 a mobile version of the R9 380? I guess it is a Tonga (GCN 1.2) with 1792 cores. Can anyone confirm this?
The M390 is likely a rebranded M290X with Pitcairn/Curacao (GCN 1.0) and 1280 cores.
The M380, according to the AMD website, is a Cape Verde (GCN 1.0) with 640 cores. It is both old and weak.
But I've also seen some sources claim that it is a Bonaire (GCN 1.1) with 768 cores. That would be much better than a five-year-old Cape Verde...
 
Nope, the core clock for the M395X is 50 MHz higher than the M295X's: 850 MHz vs 900 MHz.

Nope, it's 59 MHz higher, and Desmond is right about the M395 having 1792 cores ;)


[Attached GPU-Z screenshots: m395x.png, gpu-z-imac.png, imac395.jpg]
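Since core counts and clocks are now on the table, here's a quick sketch of the theoretical peak FP32 throughput they imply, assuming GCN's 2 FLOPs per core per cycle; the M395's 1792-core figure is still unconfirmed above, so treat these as back-of-the-envelope numbers:

```python
# Back-of-the-envelope peak FP32 throughput for a GCN GPU:
# GFLOPS = cores * clock (GHz) * 2 (one fused multiply-add per cycle).
# Core counts and clocks are the figures discussed in this thread;
# the M395's 1792 cores (and its clock) are unconfirmed assumptions.
gpus = {
    "M395X": (2048, 0.909),  # Tonga, 850 MHz + 59 MHz
    "M395":  (1792, 0.909),  # assumed Tonga cut-down
    "M295X": (2048, 0.850),  # Tonga
}
for name, (cores, ghz) in gpus.items():
    print(f"{name}: {cores * ghz * 2:.0f} GFLOPS")
```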
 
This is right.

I just dug around a little. See this link, for example; it clearly says that the GTX 980 for laptops needs over 160 watts, up to 200, so it is way beyond what the iMac is built for. And here, this article says that the max resolution is 3840x2160. It seems we have these reasons why Apple went with AMD this time.
From your very same link on NotebookCheck, the Tonga M295X is rated for 125 watts and the GTX 980M is rated for 100 watts. I think the text is off in those articles; more likely it's counting total system draw, CPU and GPU together, in the laptop. These 100/125-watt power ratings are also found in other sources.
 
From your very same link on NotebookCheck, the Tonga M295X is rated for 125 watts and the GTX 980M is rated for 100 watts. I think the text is off in those articles; more likely it's counting total system draw, CPU and GPU together, in the laptop. These 100/125-watt power ratings are also found in other sources.
I agree with you when it comes to the GTX 980M, but for the actual GTX 980 for Notebooks (or 990M or whatever), it seems like wccftech is on to something.

They say:
The GeForce GTX 980 has a TDP of 165W, the GTX 980M has a TDP of 125W while the GeForce GTX 980 (Laptop SKU) will have a TDP around 125W – 150W (configurable by OEM).

My point still stands: if there's some barrier around 100-125 watts of TDP (Thermal Design Power) that Apple won't break because of the thermals, we won't see a hungrier card anytime soon. The TDP doesn't define how many watts the card actually draws, but how much heat the cooling around the card has to handle. Throttling down a GTX 980 for Notebooks until it's usable won't give a clear advantage over AMD's current offerings, which are better suited for 5K, too.
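To put that selection logic in concrete terms, here's a toy sketch using the TDP figures quoted above; the 125 W budget is the assumed barrier under discussion, not an Apple-confirmed number:

```python
# Toy filter: which cards fit an assumed iMac thermal budget?
# TDPs from the wccftech figures quoted above plus the NotebookCheck
# rating mentioned earlier; the budget itself is an assumption.
IMAC_TDP_BUDGET_W = 125

candidates = {
    "GTX 980 (desktop)":    165,
    "GTX 980 (Laptop SKU)": 150,   # wccftech: 125-150 W, OEM-configurable
    "GTX 980M":             125,
    "M295X":                125,   # Tonga, per the NotebookCheck rating
}
fits = [name for name, tdp in candidates.items() if tdp <= IMAC_TDP_BUDGET_W]
print("Fits the budget:", fits)
```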
 
I agree with you when it comes to the GTX 980M, but for the actual GTX 980 for Notebooks (or 990M or whatever), it seems like wccftech is on to something.

They say:
The GeForce GTX 980 has a TDP of 165W, the GTX 980M has a TDP of 125W while the GeForce GTX 980 (Laptop SKU) will have a TDP around 125W – 150W (configurable by OEM).

My point still stands: if there's some barrier around 100-125 watts of TDP (Thermal Design Power) that Apple won't break because of the thermals, we won't see a hungrier card anytime soon. The TDP doesn't define how many watts the card actually draws, but how much heat the cooling around the card has to handle. Throttling down a GTX 980 for Notebooks until it's usable won't give a clear advantage over AMD's current offerings, which are better suited for 5K, too.
I know we're going back and forth here, but no one is planning on gaming at 5K, short of Blizzard games based on decade-old graphics engines. When gaming at 1440p (which is the old iMac 27 resolution and what most iMac 5K gamers usually play at), the 980M kills the M295X/M395X. The difference is smaller at higher resolutions, but the 980M keeps its advantage, albeit a smaller one. I know the GTX 990M is too power hungry for the iMac...
 

[Attachments: two benchmark screenshots]
I agree with you when it comes to the GTX 980M, but for the actual GTX 980 for Notebooks (or 990M or whatever), it seems like wccftech is on to something.

They say:
The GeForce GTX 980 has a TDP of 165W, the GTX 980M has a TDP of 125W while the GeForce GTX 980 (Laptop SKU) will have a TDP around 125W – 150W (configurable by OEM).

My point still stands: if there's some barrier around 100-125 watts of TDP (Thermal Design Power) that Apple won't break because of the thermals, we won't see a hungrier card anytime soon. The TDP doesn't define how many watts the card actually draws, but how much heat the cooling around the card has to handle. Throttling down a GTX 980 for Notebooks until it's usable won't give a clear advantage over AMD's current offerings, which are better suited for 5K, too.
https://forums.macrumors.com/thread...s-computer-line.1928218/page-14#post-22135918

Here you have the real answer why Apple picked AMD this time around: GCN cards simply have much higher compute performance in OpenCL.
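For anyone curious what a bare-bones OpenCL compute comparison looks like, here's a minimal PyOpenCL timing sketch. It's my own illustration, not from the linked post; it assumes the pyopencl package and a working OpenCL driver, and a real benchmark like LuxMark is far more representative:

```python
import time
import numpy as np
import pyopencl as cl

# Minimal OpenCL throughput probe: time a simple element-wise kernel
# on whatever OpenCL device is available.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

n = 10_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void saxpy(__global const float *a,
                    __global const float *b,
                    __global float *out) {
    int i = get_global_id(0);
    out[i] = 2.0f * a[i] + b[i];
}
""").build()

start = time.perf_counter()
prg.saxpy(queue, (n,), None, a_buf, b_buf, out_buf)
queue.finish()  # wait for the kernel so the timing is meaningful
print(f"saxpy on {n} floats: {time.perf_counter() - start:.4f} s")
```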
 