I will ask again. The W9000 with 6 GB of VRAM has a 250W TDP. How come it is not able to work in a thermal envelope that is half that? How can that be too hard for it? Don't you think that if that were the case, the problem should be MUCH bigger in a 250W thermal envelope? And this limit is set in the GPU BIOS.

I don't know how many different ways I can say this: the problem, as I've heard it described, relates to the implementation of the extra 3GB of VRAM, which is a custom design for Apple's cards - not the same design as that used in the FirePro - so what the FirePro, or even the retail Radeon, is rated for, or capable of, isn't really relevant.

For whatever reason, the custom design of the Apple card is causing some component on the GPU boards to overheat. Overall power draw and TDP are not the problem here; whatever is going wrong, the system isn't protecting itself against it. Perhaps there's some part with higher-than-anticipated electrical resistance and temperature under load, in a specific place that doesn't have any thermal monitoring, so the system has no way to know that the problem is occurring.
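To make that last point concrete, here is a minimal Python sketch of a protection loop that can only react to monitored sensors. The sensor names, temperatures, and thresholds below are invented for illustration; this is not Apple's actual SMC or fan-control logic.

```python
# Toy model of a fan/throttle controller that can only react to monitored
# sensors. Sensor names, temperatures, and thresholds are invented for
# illustration; this is not Apple's actual SMC or fan-control logic.

MONITORED = {"gpu_die": 78.0, "vrm": 71.0, "inlet": 33.0}   # deg C, has sensors
UNMONITORED_HOTSPOT = 104.0                                 # deg C, no sensor here

THROTTLE_AT = 95.0  # hypothetical protection threshold

def controller_action(monitored_temps):
    """Throttle only if a *monitored* reading crosses the threshold."""
    return "throttle" if max(monitored_temps.values()) >= THROTTLE_AT else "steady"

print(controller_action(MONITORED))   # -> "steady": every reading it sees looks fine
print(UNMONITORED_HOTSPOT)            # -> 104.0: already past the limit, but invisible
```

An unmonitored hotspot can sit well past any safe limit while every reading the controller actually sees still looks healthy, so no throttling or extra fan speed is ever triggered.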
 
While it's an interesting theory, koyoot is thinking what I was thinking. Since the W9000/D700/7970 are based on the Tahiti XT series, shouldn't we see similar problems with the W9000?
From post #1522:

As I've seen it described by people I have good reason to believe know very specifically what the problem is, the addition of the extra 3GB of VRAM to the Radeon design has given rise to a localised thermal problem that the system cannot cool, or protect against.

The way I am reading this is:
  • Apple took a consumer grade card and doubled the amount of RAM
  • This resulted in a heat concentration localized to the RAM
  • The lack of adequate cooling in the nMP prevents the system from sufficiently cooling the area where the heat has concentrated
  • As a result, the heat leads to the cards' eventual failure.
Why might this not affect the W9000?
  • It was designed for 6GB of RAM instead of having it added in later
  • The different form factor, and its installation in a case that provides proper cooling capacity, result in a properly cooled card.
 
I don't know how many different ways I can say this: the problem, as I've heard it described, relates to the implementation of the extra 3GB of VRAM, which is a custom design for Apple's cards - not the same design as that used in the FirePro - so what the FirePro, or even the retail Radeon, is rated for, or capable of, isn't really relevant.
[attached image]

It may be because the VRAM itself consumes 60W (12 × 5W) out of the whole 129W of maximum power (yes, VRAM is the highest-consuming part of almost every GPU), and because the memory chips are packed too close to each other (which may make them a bit problematic to cool).

However, it is a highly unlikely cause of the problems.
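Taking those figures at face value (the 5W-per-package and 129W numbers are the poster's, not official AMD or Apple specifications), the arithmetic works out as in this quick Python sketch:

```python
# Back-of-the-envelope VRAM power share using the figures quoted above.
# 5 W per GDDR5 package and a 129 W board limit are the poster's numbers,
# not official AMD or Apple specifications.

packages = 12            # 384-bit bus / 32 bits per GDDR5 package
watts_per_package = 5.0
board_limit_w = 129.0

vram_w = packages * watts_per_package
print(f"VRAM draw: {vram_w:.0f} W, "
      f"{vram_w / board_limit_w:.0%} of the {board_limit_w:.0f} W budget")
# -> VRAM draw: 60 W, 47% of the 129 W budget
```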

From post #1522:

As I've seen it described by people I have good reason to believe know very specifically what the problem is, the addition of the extra 3GB of VRAM to the Radeon design has given rise to a localised thermal problem that the system cannot cool, or protect against.

The way I am reading this is:
  • Apple took a consumer grade card and doubled the amount of RAM
  • This resulted in a heat concentration localized to the RAM
  • The lack of adequate cooling in the nMP prevents the system from sufficiently cooling the area where the heat has concentrated
  • As a result, the heat leads to the cards' eventual failure.
Why might this not affect the W9000?
  • It was designed for 6GB of RAM instead of having it added in later
  • The different form factor, and its installation in a case that provides proper cooling capacity, result in a properly cooled card.
I don't want to be rude, but everything you have assumed from the start is wrong.

[Image: AMD FirePro W9000 6GB GDDR5 PCB]

Compare the memory layouts.
 
It may be because the VRAM itself consumes 60W (12 × 5W) out of the whole 129W of maximum power (yes, VRAM is the highest-consuming part of almost every GPU), and because the memory chips are packed too close to each other (which may make them a bit problematic to cool).

However, it is a highly unlikely cause of the problems.
OK, what do you think is the cause of the problem and why?

I don't want to be rude, but everything you have assumed from the start is wrong.
Then prove me wrong, don't just say it.
 
OK, what do you think is the cause of the problem and why?


Then prove me wrong, don't just say it.
Because:
[attached image]


FirePro D500 with 3 GB of VRAM. It will draw exactly the same amount of power. Every single memory chip draws 5W of power. The only thing that comes to mind as a possible cause for the D700 is that it has slightly higher memory clocks. Overall, the difference in power consumption should be around 5W less for the D500's memory compared to the D700's. Again, power consumption of memory is related to the clocks and the memory bus, not the amount of memory itself.

3 GB of 384-bit memory will consume exactly the same power as 6 GB of 384-bit memory at the same clock speeds. They differ only in the internal bit configuration of each memory chip.

It could be the cause of the problem. But like I have said, it is highly unlikely.
 
Because:
[attached image]


FirePro D500 with 3 GB of VRAM. It will draw exactly the same amount of power. Every single memory chip draws 5W of power. The only thing that comes to mind as a possible cause for the D700 is that it has slightly higher memory clocks. Overall, the difference in power consumption should be around 5W less for the D500's memory compared to the D700's. Again, power consumption of memory is related to the clocks and the memory bus, not the amount of memory itself.

3 GB of 384-bit memory will consume exactly the same power as 6 GB of 384-bit memory at the same clock speeds. They differ only in the internal bit configuration of each memory chip.

It could be the cause of the problem. But like I have said, it is highly unlikely.
I think we've figured out who is wrong in this discussion you and I are having, and I know it's not who you said it is.
 
I think we've figured out who is wrong in this discussion you and I are having, and I know it's not who you said it is.
You think? Do you base this on technical analysis/knowledge of the topic, or on your own logic? I presume you base it on not knowing how 3 GB made from 12 memory chips differs from 6 GB made from 12 memory chips. Each memory chip has a 32-bit bus. If you look through SK Hynix's GDDR5 product portfolio, you will find out how they differ.

And again: 3 GB of GDDR5 on a 384-bit memory bus will consume exactly the same amount of power as 6 GB of GDDR5 on a 384-bit memory bus at the same clocks.

Both GPUs have exactly the same number of memory chips, and each chip has a 32-bit memory bus. So how do they differ?

I will answer: there are two variants, 2 Gb and 4 Gb memory. You can guess which one goes where. Both use exactly the same amount of power at the same clock speed.
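A minimal Python sketch of the configuration math being argued here. The 12-package, 384-bit layout is standard GDDR5 arithmetic, but the "same power either way" conclusion is the poster's claim rather than a measured fact:

```python
# Simplified model of the argument: a 384-bit bus is built from twelve
# 32-bit GDDR5 packages either way; only the per-package density changes
# (2 Gb vs 4 Gb), so capacity doubles while the package count stays fixed.
# The "same power either way" conclusion is the poster's claim; real GDDR5
# power also depends on voltage, clocks, and the specific part.

BUS_WIDTH_BITS = 384
BITS_PER_PACKAGE = 32

def vram_config(density_gbit):
    packages = BUS_WIDTH_BITS // BITS_PER_PACKAGE      # 12 in both cases
    capacity_gbyte = packages * density_gbit / 8       # Gbit -> GB
    return packages, capacity_gbyte

for label, density in [("2 Gb parts (D500-style)", 2), ("4 Gb parts (D700-style)", 4)]:
    n, cap = vram_config(density)
    print(f"{label}: {n} packages, {cap:.0f} GB total")
# -> 2 Gb parts (D500-style): 12 packages, 3 GB total
# -> 4 Gb parts (D700-style): 12 packages, 6 GB total
```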
 
You think? Do you base this on technical analysis/knowledge of the topic, or on your own logic? I presume you base it on not knowing how 3 GB made from 12 memory chips differs from 6 GB made from 12 memory chips. Each memory chip has a 32-bit bus. If you look through SK Hynix's GDDR5 product portfolio, you will find out how they differ.

And again: 3 GB of GDDR5 on a 384-bit memory bus will consume exactly the same amount of power as 6 GB of GDDR5 on a 384-bit memory bus at the same clocks.

Both GPUs have exactly the same number of memory chips, and each chip has a 32-bit memory bus. So how do they differ?
One only needs common sense. Double the amount of memory and power consumption is going to increase (all else being equal). Your statement that it does not fails even the common-sense test.
 
Being an engineer myself (gas turbines), I see it as a somewhat similar design to a gas turbine, which I'm familiar with. I consider it an ingenious design that is very efficient at cooling. The axial-flow design is more efficient than a conventional box PC with multiple fans blowing in different directions.

The concept may be correct, but given the thermal-failure-induced problems people are encountering, it seems like it is not working as anticipated in practice. Valid concept, poor execution.

And coming out of industrial design myself, and having worked with the nMP for the past 2 years, I don't feel that the vents are big enough to guarantee sufficient cooling.
 
The concept may be correct, but given the thermal-failure-induced problems people are encountering, it seems like it is not working as anticipated in practice. Valid concept, poor execution.

And coming out of industrial design myself, and having worked with the nMP for the past 2 years, I don't feel that the vents are big enough to guarantee sufficient cooling.
The thermal failure on the tcMP is more related to the thermal compound and the BGA solder than to its thermodynamics. I'm an engineer too, and I used to work with thermodynamics. The tcMP thermal issues (very similar issues plagued the iMac and the MBP 11,x) tend to be related to thermal compound "decay" or solder "fatigue" after years in warm environments. OK, it is a failure on Apple's part to underestimate that factor (and, actually, not to choose the right compound), but they slowly learned the lesson, and I'm sure this kind of failure will be less common in the future.

Similar failures would be present in other designs, and they are not as related to thermodynamic efficiency as to mechanical factors (the core is very efficient and flexible).

PS: guys (in general), please don't get too passionate; remember, we are running short of "Waiting for the next Mac Pro" thread names... Let's let this thread live long enough to see the 2016/2017 tcMP.
 
Let's assume that Apple does not change the overall design of the computer, but updates it to current hardware.

Will you, readers, people interested in computers, buy it?
 
The thermal failure on the tcMP is more related to the thermal compound and the BGA solder than to its thermodynamics. I'm an engineer too, and I used to work with thermodynamics. The tcMP thermal issues (very similar issues plagued the iMac and the MBP 11,x) tend to be related to thermal compound "decay" or solder "fatigue" after years in warm environments. OK, it is a failure on Apple's part to underestimate that factor (and, actually, not to choose the right compound), but they slowly learned the lesson, and I'm sure this kind of failure will be less common in the future.

Similar failures would be present in other designs, and they are not as related to thermodynamic efficiency as to mechanical factors (the core is very efficient and flexible).

PS: guys (in general), please don't get too passionate; remember, we are running short of "Waiting for the next Mac Pro" thread names... Let's let this thread live long enough to see the 2016/2017 tcMP.

I can't speak to the solder, but if we add up all the issues, from solder to inadequately sized air ducts, etc., I think it's fair to say that the 6,1 is not a successful design. Conceptually it may have been a sound idea and should have worked on paper, but in practice it's not working out as planned. Kind of like the Edsel. Or Pop Rocks.

Hopefully Apple has learned from this mistake and from the similar issues they are experiencing with the iMac etc. But given the amount of power that industrial design has within Apple, I fear that the trend of aesthetics over function and engineering soundness will continue.

Regardless it's time to put this one on the shelf and move on.
 
From post #1522:

As I've seen it described by people I have good reason to believe know very specifically what the problem is, the addition of the extra 3GB of VRAM to the Radeon design has given rise to a localised thermal problem that the system cannot cool, or protect against.

Who are these people? Please, no insiders who must remain silent due to NDAs and the like.

The way I am reading this is:
  • Apple took a consumer grade card and doubled the amount of RAM
As I've shown before, most workstation graphics cards are based on consumer versions, such as the D700/Radeon 7970/W9000. Somehow the W9000 seems not to have an issue with the added VRAM. The 3GB of VRAM on the 7970, I think, has more to do with price than with heat issues, and with making the W9000 brand more enticing with its 6GB.

Why might this not affect the W9000? It was designed for 6GB of RAM instead of having it added in later

The W9000 is based on the Tahiti XT series chip, just like the Radeon 7970 and D700. That's why workstation graphics cards are released AFTER the consumer versions come out. As far as I know, no workstation graphics card has been released based on a brand-new GPU chip that's never been used before. I'm talking about the Nvidia/AMD brand cards.
 
I'd be more interested in buying it with the current design and updated internals. Right now it's not justifiable in my mind without a price cut to go along with the 3-year-old tech.

Not to derail in any way, but with TB3 being adopted more and more by other vendors/manufacturers, I would imagine TB1/2 devices will quickly reach legacy status before too long, really limiting potential expansion on the current offerings without adapters and dongles.
 
So how about this; let's take the obvious stuff:
Xeon E5 v4 CPUs, 2400 MHz ECC RAM, 2.5 GB/s SSDs, dual AMD GPUs based on Polaris Ellesmere and Vega 10 configs, and Thunderbolt 3 with 10 USB-C ports that can be used either way: as USB-C or as Thunderbolt 3.

Current design.
 
Let's assume that Apple does not change the overall design of the computer, but updates it to current hardware.

Will you, readers, people interested in computers, buy it?

Depends on a few things. I dislike the trashcan design, and Apple's neglect has evaporated more than 20 years of goodwill they've built up with me, but I am very closely tied to the Mac OS for my work.

First, how much RAM is on the GPU card? My work requires a large amount.

If they keep the machine on a 3-year cycle, then a purchase only makes sense within the first 6 months of release, and I would have to buy the top-of-the-line GPU. The major internals of the machine can't be upgraded, so you need to get in early in the cycle so it has time to depreciate. It makes no sense to buy a machine that can't be upgraded 6 months before its EOL.

If they stick with the same case, I would still be very worried about thermal issues, so AppleCare would be a must. I would probably try to purchase a spare case and modify it for better cooling, aesthetics be damned.

It's a tough call. I have a cheap PC on the way so I can test Windows 10. If I can make it work I may just dump Apple after more than 20 years and order an HP. I'm done with getting jerked around by them, because now it's cutting into my work.
 
It's a tough call. I have a cheap PC on the way so I can test Windows 10. If I can make it work I may just dump Apple after more than 20 years and order an HP. I'm done with getting jerked around by them, because now it's cutting into my work.
If it's "throwaway cheap", I hope that you don't evaluate the system as well as the OS. The build quality and components on some of the super cheap ones show that you get what you pay for, and don't come close to even the entry workstations from HP and Dell.

For us, an entry workstation is a Dell Precision 3620 - I just got a quote on one for $1280.
  • Xeon E3-1245 v5 (quad with HT, 3.5 GHz (3.9 GHz Turbo), HD Graphics P530)
  • 365 watt 90% power supply
  • 16 GiB 2133 MHz DDR4 with ECC
  • 2 TB 7200RPM drive (we'll stick an 850 EVO in it for a boot drive)
  • DVD RW
The workstation that it's replacing has a fairly new GTX 960, so we'll move that over.

Also, if you let Win10 upgrade to the latest build, you'll have an Ubuntu bash shell to use.
 
I've noticed some people post on the various MP threads that their company leases MPs and they get a new one every three years like clockwork (no offense, but I call these people "Chocolate eaters"). Someone in accounting just clicks on Apple's website and a guy from I.T. just shows up with the new MP and performs a migration while the person is off at a meeting or something (one poor guy lamented losing his 6,1 tower). So given that Apple would like to keep this "autopilot revenue" stream flowing, it makes sense that we will see a new MP before these 3-year leases are up. Unless these people will just get a brand new 6,1 just like the one they currently have - this makes no sense, but I've worked for some companies that are this stupid...

So my prediction is the 7.1 will hit before December 2016, but I wouldn't bet on it.....
 
If it's "throwaway cheap", I hope that you don't evaluate the system as well as the OS.

It's just to poke around with Windows 10. I'm familiar with the HP boxes from work as they are replacing Mac Pro units at a rapid clip.

There are a lot of cheap HP Z640s on eBay...

I've noticed some people post on the various MP threads that their company leases MPs and they get a new one every three years like clockwork (no offense, but I call these people "Chocolate eaters"). Someone in accounting just clicks on Apple's website and a guy from I.T. just shows up with the new MP and performs a migration while the person is off at a meeting or something (one poor guy lamented losing his 6,1 tower). So given that Apple would like to keep this "autopilot revenue" stream flowing, it makes sense that we will see a new MP before these 3-year leases are up. Unless these people will just get a brand new 6,1 just like the one they currently have - this makes no sense, but I've worked for some companies that are this stupid...

So my prediction is the 7.1 will hit before December 2016, but I wouldn't bet on it.....


Well, that's not entirely true. Accounting is not ordering anything on autopilot. At least not in my business.

The IT departments that I work with don't like the nMP and management feels that they are way overpriced. Hence the mass switch to HP and similar workstations. The problem for Apple is that once a company switches to Windows or Linux it's difficult to get them to come back. Nobody wants to turn their infrastructure upside down on a regular basis and waste money in the process. If the switch works and is cheaper and more reliable there's no point for them to come back.

This probably doesn't matter to Apple, since they appear to regard the Mac as some legacy appendage they got stuck with and are indifferent to. Let's face it: within Apple, Steve Jobs was the last proponent of the Mac, and with him gone the focus is on phones, iPads, top 40 music, and watchbands.
 
Who are these people? Please, no insiders who must remain silent due to NDAs and the like.
This question should not be addressed to me, as I was not the one who wrote the quoted material. Yes, that's right... I did not write post #1522, which is why I referred people back to it.
 
Let's assume that Apple does not change the overall design of the computer, but updates it to current hardware.

Will you, readers, people interested in computers, buy it?

No - it is a TCO fail. The nMP design requires too many compromises and is missing too much functionality. The design may be great for Final Cut Pro X users, but I don't use that product. I do 3D art, and there are too many limitations.

Back at the turn of the century, P.T. Barnum was pushing "Make your Mac the hub of your digital lifestyle."

Current Mac offerings no longer support that concept. I am not reworking my entire entertainment workflow to compensate for Timmy & Sir Idiot Boy's lack of vision.

With the nMP design, I'd have to replace all of the missing functionality - i.e. multiple T-Bolt external drive boxes (with associated power bricks), another one to hold my Blu-ray player (with associated power brick), and a separate dock (with associated power brick) because there aren't enough USB connectors (scanner, phone, iPad, USB sticks, my Griffin Tech dial thingie, etc.). I also don't see the advantage of paying for a 2nd GPU that will never be used.

Then there are the thermal issues - with the cMP, I can crank both the CPU & the GPU at the same time. Can't do that with the nMP.

As a cMP user, I don't give a **** what my box looks like, nor is the noise level an issue - cars passing by the house are a lot noisier than my cMP. In addition, my art workflow will take as many CPU/GPU cores and as much RAM as I can throw at it; the nMP isn't designed for that - it was designed for hipsters, just like everything else that Apple has released since P.T. Barnum died back in 2011.

I am not Apple's target audience anymore - so I am going to migrate to companies that want my money. Software migration will run me about $600 (ZBrush & Windows 10), and hardware costs would be less than an nMP (mainly due to replacing everything ripped out of the nMP).

Windows 10 is also about as reliable as OS X, so that advantage has also gone out the window.
 