2) Ethernet to TC for media storage...

If you only use the TC as a media server, it should not be a big problem, but read/write speeds will be slower than normal HDD operation due to the Ethernet speed limit (and the TC is not designed for high-performance network storage). The 1000 Mb/s figure is a theoretical limit; you will never achieve that speed. A speed test on my current-generation 3TB TC shows the following figures. (This TC is not full yet, with 128GB still free, so these numbers are more or less what you can expect.)

[Screenshot: 3TB TC speed test results]

It's good enough for streaming, but it may be relatively slow when you're managing the video (e.g. copying a large video file to the TC).

However, if you use the TC for backups as well, it's another story. Lots of fragmentation will build up on the TC's HDD, and it will make your TC slower and slower for media operations. The following figure is from my 2TB TC, which has been fully loaded for years, so there is a lot of fragmentation. This is the performance you get from the same test while a backup is also in progress.

[Screenshot: 2TB TC speed test results]

As you can see, it's not much better than a USB stick, and imagine how frustrating it would be to copy a 10GB video to that TC at this speed.
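For a rough sense of what these speeds mean in practice, here's a back-of-envelope sketch in Python. The throughput figures are illustrative assumptions for the scenarios described above, not measurements:

```python
# Back-of-envelope copy-time estimates for a large file over the network.
# The throughput figures below are illustrative assumptions, not measurements.
def copy_time_minutes(size_gb: float, throughput_mb_s: float) -> float:
    """Minutes to copy size_gb gigabytes at throughput_mb_s megabytes/second."""
    return size_gb * 1024 / throughput_mb_s / 60

scenarios = {
    "Gigabit Ethernet, theoretical max (125 MB/s)": 125.0,
    "Healthy Time Capsule over Ethernet (~60 MB/s)": 60.0,
    "Fragmented TC during a backup (~10 MB/s)": 10.0,
}

for label, speed in scenarios.items():
    print(f"{label}: 10 GB in ~{copy_time_minutes(10, speed):.1f} min")
```

Even at the gigabit theoretical maximum, a 10GB copy takes over a minute; at fragmented-TC speeds it stretches to roughly a quarter of an hour.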
 

That's really, really helpful. Thank you for taking the trouble. I am using the TC as a TM backup drive, so your points are well made. I will get a Thunderbolt spinning drive.

JD
 


For the love of god - that's what I told you to do several posts back; I just didn't justify it with pictures... but you chose to attack my questioning of your knowledge rather than actually listen :D
Anyway, if you are getting a Thunderbolt spinning drive, make sure it's some sort of RAID system: you could attach a spinning drive to a computer with a trillion simultaneous Thunderbolt wires and it wouldn't be any faster, because the disk speed is the bottleneck. Personally, I think the cheapest option for what you've described would probably be a USB 3 G-RAID. You'd get plenty of storage space, the speed would be more than enough for your footage, and USB technology is cheaper than Thunderbolt. It's easy to be drawn into thinking you need a Thunderbolt connection with SSDs and so on, but you could save yourself a lot of money with something that, for your use, will work at the same speed.
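The "slowest link wins" point can be sketched numerically. The bandwidth figures below are rough nominal assumptions, not benchmarks:

```python
# The effective transfer speed is capped by the slowest link in the chain.
# All figures are rough nominal assumptions (MB/s), not benchmarks.
def effective_speed_mb_s(*links: float) -> float:
    """Throughput of a chain of links is limited by its slowest member."""
    return min(links)

HDD_SPINNER = 150.0    # typical single 7200 rpm drive (assumption)
USB3_BUS = 500.0       # practical USB 3.0 ceiling (assumption)
THUNDERBOLT1 = 1000.0  # Thunderbolt 1, 10 Gb/s nominal

print(effective_speed_mb_s(HDD_SPINNER, USB3_BUS))      # disk-bound
print(effective_speed_mb_s(HDD_SPINNER, THUNDERBOLT1))  # still disk-bound
```

With a single spinner behind either interface, the drive is the bottleneck, which is why the pricier Thunderbolt connection buys nothing unless the storage itself (e.g. a RAID or SSD) is faster.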
 
Thanks again. I think I will go with the 512GB SSD, but at least start by using the TC as my media server drive. If that isn't working, I will look at a Thunderbolt/USB 3.0 drive.

Still not sure what vid card to get though! :confused:

I would get the 295 again. If you read the other threads, there are a lot of people who are satisfied with the 295.
 
M295X vs GTX 775M

It depends. For gaming, the user experience is usually defined by the minimum FPS, not the average.

In this respect, the M295X is 35% better than the 775M.

Not in my case. I just ran it myself for comparison, and the 775M is slightly faster. I left it running for an hour after the test; the GPU finally maxed out at 92°C with the fan at 1200 rpm:

[Screenshot: benchmark results]
 
Not in my case. I just ran it myself for comparison, and the 775M is slightly faster. I left it running for an hour after the test; the GPU finally maxed out at 92°C with the fan at 1200 rpm:

[Screenshot: benchmark results]

A Valley bench is not real-life use. With a 775M you cannot even think of playing at 5K; at high resolutions (>4K) the M295X is faster than the 780M by a good 20%.
 
...the soldering needs at least about 200°C to melt, according to an Apple technician.

...

So enjoy your riMacs, folks, and trust Apple... they wouldn't risk that '11 MacBook Pro fiasco again...
The technician is correct that the *solder that attaches the GPU to the board* melts at 200°C, **BUT**...

What matters is the SILICON JUNCTION TEMPERATURE (Tj).

If you think the GPU is getting hot at 105°C on *the GPU case*, the poor junction temperature is nearer to 150°C at the silicon level.

This is what ultimately destroys integrated circuits. On top of that, lead-free solder means that in the fabrication plant they use temperatures of around 220°C (and sometimes as much as 240°C) to solder all the components onto the board during manufacture. This process alone damages components right off the bat. Don't get me started on lead-free solder; suffice to say the arguments against lead solder are totally bogus, and lead-free solder on its own leads to premature failure of boards. Medical and military applications have special exemptions because of the problems! The perceived environmental problems of lead solder, as well as being exaggerated, pale in comparison to the environmental damage from prematurely failed equipment, but I digress...
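To make the case-vs-junction distinction concrete, here is a simplified steady-state estimate using the standard relation Tj = Tcase + P × θjc. The power and thermal-resistance values are illustrative assumptions, not M295X specifications:

```python
# Simplified steady-state junction-temperature estimate:
#   Tj = Tcase + P * theta_jc
# theta_jc = junction-to-case thermal resistance in degC per watt.
# The numbers below are illustrative assumptions, not M295X specs.
def junction_temp_c(case_temp_c: float, power_w: float, theta_jc: float) -> float:
    """Junction temperature from case temperature, dissipated power, and theta_jc."""
    return case_temp_c + power_w * theta_jc

# A hypothetical 100 W GPU with theta_jc = 0.4 degC/W and a 105 degC case reading:
print(junction_temp_c(105.0, 100.0, 0.4))  # 145.0
```

Even modest assumed values put the silicon well above the case reading, which is the point: the sensor on the package understates what the die itself is enduring.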

It seems clear from reading this thread that the cooling solution employed by Apple is inadequate.

People who are blaming AMD for the noise and heat issues are blaming the wrong people (unless they know AMD misled Apple about the thermal output of its GPUs, but I find this improbable).

EVERYONE with a riMac 5K and M295X needs to contact Apple to ask what they can do to resolve the thermal problem under load. It is the only way this will get addressed. No component should ever reach maximum thermal output and trigger thermal protection mechanisms as a matter of course. Should this situation occur, it is a total failure of the cooling solution, and IMHO a design defect that warrants rectification.

It is very likely damaging sales, too. I would love to buy one, but with everything I read on this issue, I don't want to risk it because I know what it means for components to regularly trigger thermal protections - premature failure.

I suggest the reason sites are (STILL) not reviewing this model in any great depth is the obvious problems; they probably don't want to report on them.

Of course, those with issues are always vocal, and that must be considered, as must the fact that early batches of anything are subject to manufacturing issues that get ironed out as production continues. But unless the cooling method changes, it appears this problem won't be going away any time soon, which is a shame. :(
 

Two weeks ago I had occasion to run a test on a 5-year-old Dell Studio XPS with the AMD 4850; the temperature in games was constantly 108°C. We are talking about a 5-year-old mainstream and gaming unit, bought and used for years by many thousands of users, and nothing happened in all those years.

You can talk about fan noise, but when there are no performance issues or fried units, it's all paranoia.


The Dell's card is a reference-design 4850 desktop board:
[Photo: reference 4850 card]

And at 14K rpm it's loud, very loud, not even comparable to the 5K.
 
That may be true for that card, but why then are people clearly demonstrating the GPU going into thermal protection, a.k.a. throttling?

Is the thermal protection of the AMD GPU paranoid, throttling unnecessarily?

EDIT TO ADD: The card you mention is also a desktop card, whereas the 5K iMac uses a mobile chip.
 
That may be true for that card, but why then are people clearly demonstrating the GPU going into thermal protection, a.k.a. throttling?

Is the thermal protection of the AMD GPU paranoid, throttling unnecessarily?

Read all my previous posts. I did dozens of tests, and the M295X does not go into thermal protection even with a broken fan; AMD PowerTune is designed to avoid that situation:
https://www.amd.com/Documents/PowerTune_whitepaper_WEB.pdf

In the worst-case scenario of a malfunctioning fan you'll lose a very small amount of performance (2-5%). Some units are "cooler" than others (like mine, 100°C max) and others can top out at 108°C, but we both get the same performance; only the fan speeds differ (300 rpm lower under the same load for the cooler units).

----------

EDIT TO ADD: The card you mention is also a desktop card, whereas the 5K iMac uses a mobile chip.

This doesn't make any difference; if anything it's worse, since a desktop unit has better cooling capability. If it runs up to 108°C, it means it is meant to do that.

P.S.: Just to recap what "throttling" means: it's jumping from a P1 state to a P2 state, killing 80% of the performance, which is clearly not the case with the 5K.

This thread is long, I understand that, but take some time to read the whole discussion if you care, and only then post your questions if you still have any, or we are likely to end up rewriting things that have already been discussed.
 
AMD Radeon R9 M295X Core Clock Throttling, Heat, and Performance

Watch from 2:30. The same issue applies to the GPU as we have seen in this thread.

http://youtu.be/tgTMxB-ffjM

Astelith is the only person I've seen on the Internet who claims to have a "non-faulty" 5K iMac, i.e. an i7/M295X iMac that doesn't thermally throttle.
 
Watch from 2:30. The same issue applies to the GPU as we have seen in this thread.

http://youtu.be/tgTMxB-ffjM

Astelith is the only person I've seen on the Internet who claims to have a "non-faulty" 5K iMac, i.e. an i7/M295X iMac that doesn't thermally throttle.

Claimed and proved, I must say ;)

But why don't you all wait for rev 2 for peace of mind? It could be 7 months from now, with an M395X and the same temps lol

P.S.: That review is pretty annoying, inaccurate on some points, and wrong on others. Plus, why the hell run 2x AA in Crysis at 4K? C'mon!
 
Two weeks ago I had occasion to run a test on a 5-year-old Dell Studio XPS with the AMD 4850; the temperature in games was constantly 108°C. We are talking about a 5-year-old mainstream and gaming unit, bought and used for years by many thousands of users, and nothing happened in all those years.

You can talk about fan noise, but when there are no performance issues or fried units, it's all paranoia.
Are you honestly suggesting that a (comparatively) huge mainstream tower PC with lots of internal space can be any kind of reference for the impact of heat on components in an anemic monitor housing with a fraction of the volume and a fragile, super-high-resolution display mounted literally only millimeters away from heat-generating and heat-transporting components? :rolleyes:
 
Are you honestly suggesting that a (comparatively) huge mainstream tower PC with lots of internal space can be any kind of reference for the impact of heat on components in an anemic monitor housing with a fraction of the volume and a fragile, super-high-resolution display mounted literally only millimeters away from heat-generating and heat-transporting components? :rolleyes:
According to iStat Menus, everything but the GPU and CPU inside the 5K is running under 50-60°C even after hours of stress. Have you ever read the average temps on a MacBook Air or a MacBook Pro? I'm currently using a 2012 MBA, which is way hotter than my 5K; if I run WoW here the CPU can reach 105°C, with all the other components very hot too.
Honestly, I'm more concerned about my Air than the 5K! :)


MBA 2012: [iStat screenshot]

5K: [iStat screenshot]
 
Two weeks ago I had occasion to run a test on a 5-year-old Dell Studio XPS with the AMD 4850; the temperature in games was constantly 108°C. We are talking about a 5-year-old mainstream and gaming unit, bought and used for years by many thousands of users, and nothing happened in all those years.

You can talk about fan noise, but when there are no performance issues or fried units, it's all paranoia.

The Dell's card is a reference-design 4850 desktop board, and at 14K rpm it's loud, very loud, not even comparable to the 5K.


Wow, more FUD.

Did you stick a pencil in the fan to get it that hot? Or maybe you are looking at the Fahrenheit scale?

Interestingly, Tom's Hardware did find the 4850 to be the hottest card in this test:

http://www.tomshardware.com/reviews/radeon-hd-4870,1964-17.html

But somehow their record heat reading on this card was 87°C. Note that the chart's scale tops out at 90°C and no card posted numbers past 87.

So, just 21°C cooler than yours. Hmmm....

Meanwhile, 2 inches away on the one-piece logic board it is only 50-60°C. This is referred to as "thermal stress". Similarly, the radiated heat hits the back of that pricey 5K display less than 1 cm away. That is called "cooking".

Buy the machine with the 295 if you want. Buy AppleCare if you are smart. But trying to claim that 105°C inside that case on a regular basis is a GOOD thing is like gargling with Drano to keep your teeth clean.
 
According to iStat Menus, everything but the GPU and CPU inside the 5K is running under 50-60°C even after hours of stress. Have you ever read the average temps on a MacBook Air or a MacBook Pro? I'm currently using a 2012 MBA, which is way hotter than my 5K; if I run WoW here the CPU can reach 105°C, with all the other components very hot too.
Honestly, I'm more concerned about my Air than the 5K! :)

MBA 2012: [iStat screenshot]

5K: [iStat screenshot]

Haven't you noticed there are no post-warranty CPU/iGPU replacement programs for MacBook Pros without dGPUs, while the 2010/2011/2012 MacBook Pros with Nvidia/AMD GPUs are now known to be prone to premature dGPU failure, necessitating a recall? I don't know about AMD vs. Nvidia, but Intel has them both squarely beat in the reliability department!
 
blah blah blah

Yes, generally speaking you'd rather have your video card, or any component for that matter, always run cooler. Staying within a cooler thermal envelope will certainly give that chip a longer life.

I've seen coworkers rip out low-quality caps from known ****** suppliers and solder in their own caps, just for fun and out of boredom. I've seen people bake video cards in a toaster oven.

Gaming and pushing your video card to its maximum designed thermal envelope won't harm it. Running benchmarks to produce an effect that may or may not arise in a real-world scenario likely won't either. But looping benchmarks for hours on end to illustrate something no one will ever see in the real world is stupidity, and it will hurt your system in the long run. Playing BF4 and causing the GPU to throttle a couple of times isn't an issue. Looping benchmarks is.

If your M295X ends up failing before your power supply or hard drive, it is ridiculous to sit here and say it was because of the extended thermal envelope that Apple forced onto the card for the sake of fan noise.
 
As I said, wait for another version if you don't like this one. I'm super happy with my 5K, as are many other users. I really don't care about temps; the machine does its job perfectly, and that's all that matters to me... and if it fries, there's AppleCare. But if world-class engineers made the 5K this way, maybe it's just fine for it to run at these temps.

What are you trying to prove? That Apple needs to fire all its product designers? How many 5Ks have had cooked parts since last October?
 
Haven't you noticed there are no post-warranty CPU/iGPU replacement programs for MacBook Pros without dGPUs, while the 2010/2011/2012 MacBook Pros with Nvidia/AMD GPUs are now known to be prone to premature dGPU failure, necessitating a recall? I don't know about AMD vs. Nvidia, but Intel has them both squarely beat in the reliability department!

That's not a fair comparison, though. iGPUs are not really powerful and are designed to run at lower power, which means lower heat overall. It's a trade-off between performance and low energy/low heat: one won't play Crysis, and the other won't get 10+ hours of battery life.
 
That's not a fair comparison, though. iGPUs are not really powerful and are designed to run at lower power, which means lower heat overall. It's a trade-off between performance and low energy/low heat: one won't play Crysis, and the other won't get 10+ hours of battery life.

Astelith was showing how his MBA's CPU/GPU can run at 105°C all day while gaming without issue. I was responding to that by saying that Intel's chips are much more heat resistant than AMD's or Nvidia's.
 
Astelith was showing how his MBA's CPU/GPU can run at 105°C all day while gaming without issue. I was responding to that by saying that Intel's chips are much more heat resistant than AMD's or Nvidia's.

It can be, but my point is: there are plenty of devices out there running above 100°C for years without breaking... and without attracting all the attention this iMac 5K has.
To me it sounds more like a witch-hunt than an objective discussion, considering that nobody has experienced faulty boards or GPU-related performance issues.
 
Apple has to improve its application of thermal paste, which is most likely also a reason for the very high temperatures. Even if the temps themselves weren't damaging in the long run (which they most likely are), when the GPU/CPU is at a constant 105°C there will be throttling. That means you're losing performance when decoding, playing games, etc., which partly defeats the purpose of buying a powerful CPU or GPU, wouldn't you say?
Well, I'm keeping my late 2012 iMac with the i7/680MX for now...
 
Apple has to improve its application of thermal paste, which is most likely also a reason for the very high temperatures. Even if the temps themselves weren't damaging in the long run (which they most likely are), when the GPU/CPU is at a constant 105°C there will be throttling. That means you're losing performance when decoding, playing games, etc., which partly defeats the purpose of buying a powerful CPU or GPU, wouldn't you say?
Well, I'm keeping my late 2012 iMac with the i7/680MX for now...

The problem here is mainly that people don't understand how the M295X is built, I mean, the architecture.
With AMD PowerTune the card can't throttle; it has no P-states but a dynamic frequency. Eventually the core clock will drop a bit to stabilize the temperature, but the performance drop is 2-3% and not user-noticeable.

Why is this so hard for most of the users here to understand? Read the AMD white paper a few posts above.

I have read and done a lot to understand this machine: forums, white papers, tests... and the only thing confirmed at the moment is that some units are cooler than others, leading to a big difference in fan speed (and so in noise level). Apart from this shameful "quality discontinuity" from Apple, we cannot complain about anything else, such as performance; no application or game is actually impacted by a noticeable performance decrease (like a game dropping from 60 fps to 30/20 due to high temps).

And again, if somebody wants to believe that this 5K is the worst Mac ever (because you've read some bull$ in a forum), that's their problem, but it's not reality :rolleyes:, and writing false statements in a public forum doesn't help the community; it only feeds this madness :)

In your specific case, if you don't want/need the 5K you should stick with the 2012 model and wait, but before we see an Nvidia GPU in a 5K again we will likely have to wait until Oct 2016.
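To illustrate the difference between PowerTune-style dynamic clocking and classic P-state throttling, here is a deliberately simplified toy model. The clock values and scaling rules are made up for illustration; this is NOT AMD's actual PowerTune algorithm:

```python
# Toy model contrasting two thermal-management strategies.
# Purely conceptual: clocks and scaling rules are invented for
# illustration and do not reflect AMD's actual PowerTune algorithm.
def pstate_throttle_mhz(temp_c: float, limit_c: float, base_mhz: float) -> float:
    """Classic throttling: drop to a low-power state once the limit is hit."""
    return base_mhz * 0.2 if temp_c >= limit_c else base_mhz

def dynamic_scale_mhz(temp_c: float, limit_c: float, base_mhz: float) -> float:
    """Gradual scaling: shave ~1% of clock per degree over the limit, floored at -10%."""
    over = max(0.0, temp_c - limit_c)
    return base_mhz * max(0.9, 1.0 - 0.01 * over)

BASE = 850.0  # MHz, an assumed base clock
for t in (100.0, 106.0, 108.0):
    print(f"{t}°C: P-state {pstate_throttle_mhz(t, 105.0, BASE):.0f} MHz, "
          f"dynamic {dynamic_scale_mhz(t, 105.0, BASE):.0f} MHz")
```

In this toy model, a P-state drop loses 80% of the clock the moment the limit is crossed, while the gradual scheme gives up only a few percent at 108°C, which is the distinction the post above is drawing.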
 
It can be, but my point is: there are plenty of devices out there running above 100°C for years without breaking... and without attracting all the attention this iMac 5K has.
To me it sounds more like a witch-hunt than an objective discussion, considering that nobody has experienced faulty boards or GPU-related performance issues.

There are 'plenty of other devices' and then there are Apple's devices... Ever since the 2008 MacBook Nvidia debacle, there have been numerous other Apple dGPU devices with premature failure: iMacs, and the recent 2010-2012 MBP AMD and Nvidia GPU issues. (Even the previously-thought-reliable Kepler-equipped 2012 rMBP, which I own, has now had its reliability called into question.) We'll see whether the later Apple dGPU products show the same reliability issues as the previous models. Given their track record, I'm not convinced...
 
There are 'plenty of other devices' and then there are Apple's devices... Ever since the 2008 MacBook Nvidia debacle, there have been numerous other Apple dGPU devices with premature failure: iMacs, and the recent 2010-2012 MBP AMD and Nvidia GPU issues. (Even the previously-thought-reliable Kepler-equipped 2012 rMBP, which I own, has now had its reliability called into question.) We'll see whether the later Apple dGPU products show the same reliability issues as the previous models. Given their track record, I'm not convinced...

OK, but are you sure it's caused by the temperature? It may be a production-process failure, like a bad wafer lot, or something else entirely.
 