
Z28McCrory
macrumors regular, Original poster
I'm on the fence about ordering a 7,1 with the 580X and adding 2x VIIs (for a significant savings and a performance benefit over a Vega II), or biting the bullet and simply ordering the Pro Vega II.

I basically had my mind made up to go the VII route, but some late-night reading turned up an article discussing how the VII's ECC is disabled (supposedly this was AMD's way of separating it from the workstation-class cards).

My purpose for the 7,1 is photo and video editing; I do photography and video production for a living. After reading some articles about ECC vs. non-ECC for GPUs, people cited examples such as "ECC may prevent a random pixel from being an incorrect color." Thinking about my use case, this seems like a potentially big deal if that's truly the case (rendering out a video, only to discover a few incorrect hot pixels, artifacts, etc.).

I'm probably to the point where I'm entirely overthinking this and going too deep down a rabbit hole. There is some comfort in simply ordering it with a Vega II and being done with it for a while. But there is also comfort in saving a significant chunk of money (and, in the case of 2x VIIs, having a machine with performance I wouldn't have been able to afford had I gone with 2x Vega IIs). I also foresee a time a few years down the road when either option will be ready to replace (a big reason the MP 7,1 appeals to me is having that option), and upgrading from the VIIs would be much less painful since their initial cost was so much lower (whereas upgrading from the $2,400 Vega II option, in its nice Apple MPX black aluminum form, may be a tough pill to swallow).
 
Yes! I'm having the same indecisive thoughts. (I ordered with the 580X first, then got a Vega II ordered, but now I'm thinking of cancelling and going back to the 580X.)

But do we know the 580X will work alongside two Radeon VII cards? It looks like it will physically, but I don't know whether that would be too much power draw.

In terms of your ECC memory question, I don't think it matters.

Just looked it up, and the 2013 Mac Pro didn't have ECC enabled on its GPUs. So my thought would be: if it was fine for that Mac Pro, it's fine for this one.
 
That's a good question, but I think issues with bit-flipping are much more of a concern for scientific/financial workloads. You didn't mention your current setup, but I'm guessing it doesn't have ECC memory on the GPU either.
This is more of an academic question because, while corruption is possible, there are very few real-world examples from photo and video editing studios. The other thing to consider is the speed penalty you'd incur with ECC memory, as it's generally slower.
 
Yes! I'm having the same indecisive thoughts. (I ordered with the 580X first, then got a Vega II ordered, but now I'm thinking of cancelling and going back to the 580X.)

But do we know the 580X will work alongside two Radeon VII cards? It looks like it will physically, but I don't know whether that would be too much power draw.

In terms of your ECC memory question, I don't think it matters.

Just looked it up, and the 2013 Mac Pro didn't have ECC enabled on its GPUs. So my thought would be: if it was fine for that Mac Pro, it's fine for this one.

Glad to hear I'm not the only one!

I would LOVE to find someone who bought a 7,1 with the Vega II and happens to have a VII lying around that they're willing to swap in and run a few benchmarks (and, well... just tell me whether everything works as expected).

As for 2x VII + keeping the 580X... I don't know. In my particular case, it wouldn't be the end of the world if I had to remove the 580X. I don't plan on buying the XDR display, so the VIIs are all I need in terms of connectivity (they're actually preferable to the Vega II in terms of ports, because my current pair of 4K monitors have DP and miniDP, so I could go natively straight into a VII).
That's a good question, but I think issues with bit-flipping are much more of a concern for scientific/financial workloads. You didn't mention your current setup, but I'm guessing it doesn't have ECC memory on the GPU either.
This is more of an academic question because, while corruption is possible, there are very few real-world examples from photo and video editing studios. The other thing to consider is the speed penalty you'd incur with ECC memory, as it's generally slower.

My current setup is a 6,1 with D500s. According to a quick search, it does have ECC memory on the GPU.
 
ECC is essential for scientific workloads. Remember the Big Mac cluster of 1,100 PowerPC G5s that didn't work reliably, with nodes crashing frequently because it lacked ECC RAM, and had to be sold off piece by piece?

It seems the original article about the RAM errors has vanished from the web, but this Macworld article about the sale of the Macs is still online: https://www.macworld.com/article/1029443/macmall.html

Edit:

Found the IEEE Spectrum article about the Big Mac cluster not working because of RAM errors: https://spectrum.ieee.org/computing...mputer-dirty-power-cosmic-rays-and-bad-solder
 
I have run some large photogrammetry projects (5,600 images) on my 2013 Mac Pro and have never had a failure due to data corruption. Other than wanting to be a purist and have the latest GPUs in the new machine, I don't believe you will have any problems with the Radeon VII for photo and video editing. Another thing to note: all HBM2 DRAM is ECC; it's just the internal caches of the Radeon VII that aren't. So either way you will be getting ECC DRAM.

Source: https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/3
 
In terms of your ECC memory question, I don't think it matters.

Just looked it up, and the 2013 Mac Pro didn't have ECC enabled on its GPUs. So my thought would be: if it was fine for that Mac Pro, it's fine for this one.
 
I remember discussions about this for the 6,1. The AMD Pro graphics cards of that era, including the D300/D500/D700 in the Mac Pro, don't have hardware ECC VRAM. There is a "virtual" ECC function that can be enabled or disabled in the drivers.

ECC memory on the FirePro D300, D500 and D700 is disabled.

Source: https://www.anandtech.com/show/7603/mac-pro-review-late-2013/9

That is an old article. About a year later, newer Windows drivers included a toggle to turn it on or off.
ECC is essential for scientific workloads. Remember the Big Mac cluster of 1,100 PowerPC G5s that didn't work reliably, with nodes crashing frequently because it lacked ECC RAM, and had to be sold off piece by piece?

It seems the original article about the RAM errors has vanished from the web, but this Macworld article about the sale of the Macs is still online: https://www.macworld.com/article/1029443/macmall.html

According to Wikipedia, they upgraded to ECC RAM and it solved the problem.

 
I believe that option in the Windows drivers only enables "virtual ECC," which reduces your overall memory pool to store the error-correction bits.
 
Neither the Vega II nor the Radeon VII has ECC memory, only plain HBM2 (32 GB and 16 GB respectively). Technically, the Vega II is a Radeon VII with twice the memory, Thunderbolt 3 and Infinity Fabric Link added, a slightly lower clock, and no FP64 under Metal.
 
Neither the Vega II nor the Radeon VII has ECC memory, only plain HBM2 (32 GB and 16 GB respectively). Technically, the Vega II is a Radeon VII with twice the memory, Thunderbolt 3 and Infinity Fabric Link added, a slightly lower clock, and no FP64 under Metal.

Thank you for pointing that out. It makes the decision easy, then.
 
I'm on the fence about ordering a 7,1 with the 580X and adding 2x VIIs (for a significant savings and a performance benefit over a Vega II), or biting the bullet and simply ordering the Pro Vega II.

I basically had my mind made up to go the VII route, but some late-night reading turned up an article discussing how the VII's ECC is disabled (supposedly this was AMD's way of separating it from the workstation-class cards).

My purpose for the 7,1 is photo and video editing; I do photography and video production for a living. After reading some articles about ECC vs. non-ECC for GPUs, people cited examples such as "ECC may prevent a random pixel from being an incorrect color." Thinking about my use case, this seems like a potentially big deal if that's truly the case (rendering out a video, only to discover a few incorrect hot pixels, artifacts, etc.).

I'm probably to the point where I'm entirely overthinking this and going too deep down a rabbit hole. There is some comfort in simply ordering it with a Vega II and being done with it for a while. But there is also comfort in saving a significant chunk of money (and, in the case of 2x VIIs, having a machine with performance I wouldn't have been able to afford had I gone with 2x Vega IIs). I also foresee a time a few years down the road when either option will be ready to replace (a big reason the MP 7,1 appeals to me is having that option), and upgrading from the VIIs would be much less painful since their initial cost was so much lower (whereas upgrading from the $2,400 Vega II option, in its nice Apple MPX black aluminum form, may be a tough pill to swallow).

Believe it or not, memory is actually affected by cosmic rays. This is rare, but not as rare as you might suspect: a cosmic ray hitting your memory can flip the state of a bit. ECC can detect that event and correct for it. If you are doing large, complicated jobs that run for a long time, where knowing whether the result is correct can be difficult, or repeating the operation would be costly should it have gone awry, then knowing that a cosmic-ray event won't destroy your work is a pretty good thing. Someone playing video games and watching YouTube? Who cares. Someone performing a massive render or a huge simulation? They want ECC.
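To make the bit-flip risk concrete, here is a toy Swift sketch with made-up values (not tied to any real renderer), showing how a single flipped high bit in an 8-bit color channel produces exactly the kind of "hot pixel" the OP is worried about:

[CODE]
// Toy illustration (hypothetical values): one flipped bit in an
// 8-bit color channel can turn a near-black pixel visibly gray.
let original: UInt8 = 0x04             // channel value 4 (near black)
let flipped = original ^ 0b1000_0000   // a cosmic-ray-style flip of bit 7
print(original, flipped)               // prints "4 132": a visible jump
[/CODE]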
 
Memory with parity checking can detect (but not correct) a single bit flip. ECC memory can correct single-bit flips and detect double-bit flips. The denser memory becomes, the more likely a double bit flip becomes. They are still quite rare, but you want ECC RAM in two situations:
1. Absolute correctness is required. I'm on record stating that release versions of all software should be built on machines with ECC RAM. You want your bank's software running on ECC RAM.
2. You have a long chain of computations that feed the results of one computation into the next. I'm not an expert at ray tracing, but from what I do know, the results at any given pixel don't depend on the others, nor are the pixels from one frame of an animation fed back into the next frame. So ray tracing, or any other rendering where pixel values aren't fed back into further computations, is fine without ECC. In the worst case with a render, the two high bits of a byte flip, say from 0x00 to 0xC0, but further post-processing (scaling down, compression) could easily wash that out.
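As a rough sketch of that detect-versus-correct distinction, here is a simplified Swift example using a single parity bit; real ECC uses a Hamming-style SECDED code, but the principle is the same:

[CODE]
// Simplified sketch: a single parity bit detects one flipped bit but
// cannot say which bit flipped, so it cannot correct it. A second,
// simultaneous flip restores the parity and goes undetected entirely.
func parity(_ word: UInt8) -> Int { word.nonzeroBitCount % 2 }

var stored: UInt8 = 0b0110_1010
let storedParity = parity(stored)       // kept alongside the data

stored ^= 0b0000_1000                   // one bit flips in storage
if parity(stored) != storedParity {
    print("single-bit error detected, but not correctable")
}

stored ^= 0b0100_0000                    // a second flip: parity matches again
assert(parity(stored) == storedParity)   // double flip slips through parity
[/CODE]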
 
Believe it or not, memory is actually affected by cosmic rays. This is rare, but not as rare as you might suspect: a cosmic ray hitting your memory can flip the state of a bit. ECC can detect that event and correct for it. If you are doing large, complicated jobs that run for a long time, where knowing whether the result is correct can be difficult, or repeating the operation would be costly should it have gone awry, then knowing that a cosmic-ray event won't destroy your work is a pretty good thing. Someone playing video games and watching YouTube? Who cares. Someone performing a massive render or a huge simulation? They want ECC.
More or less this is true, but the actual reasons are more complicated.

FWIW, you need ECC if you run an algorithm as a very long process with progressively correlated dependent variables (breaking encrypted codes), or if you run clusters so dense that a single cosmic ray may harm a series of servers (a weather forecast, say). But the most important reason is to assure data integrity, which may be paramount in legal-liability scenarios such as health records. For an audio/video/3D studio, a cosmic-ray glitch is not something to worry about, even when creating original content; it can even be useful as an original signature, and it's almost impossible for a human audience to detect.
 
ECC memory on the FirePro D300, D500 and D700 is disabled.

Source: https://www.anandtech.com/show/7603/mac-pro-review-late-2013/9

It wasn't disabled; they were gaming cards with a custom, expanded capacity of non-ECC memory.

The Pro Vega II is likewise a non-ECC card.

Interesting note from Barefeats on Twitter: most of the time the Pro Vega II Duo leaves one of its sub-cards idle, because it looks to the system like two separate cards, and unless an app is specifically coded to use multiple GPUs, it's the same situation as the 2013. Wasn't the point of Metal to pool all GPU resources so the system always sees all GPUs as a single resource?
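For reference, a minimal Swift sketch (macOS, hypothetical app code) of what Metal actually exposes: each GPU shows up as its own MTLDevice, and driving more than one is an explicit opt-in by the app rather than automatic pooling:

[CODE]
import Metal

// macOS: MTLCopyAllDevices() returns every GPU the system can see.
// Nothing is pooled automatically; an app must pick devices itself
// and create command queues on each one it intends to use.
let devices = MTLCopyAllDevices()
for device in devices {
    print(device.name, device.isRemovable ? "(eGPU)" : "")
}

// Driving a second GPU is an explicit opt-in:
if devices.count > 1, let queue = devices[1].makeCommandQueue() {
    // encode and commit command buffers on `queue` here
    _ = queue
}
[/CODE]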
 