Cool. It may be good to order a second mini 6-pin to 6-pin cable, as I'm not sure how safe it is to pull that much power from those lines (I have no idea if it's good or bad).

Can you do a pass on the 'Extreme' or 'Extreme HD' preset, so I can compare to my GTX 770? :D
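For what it's worth, here's a rough power-budget sketch of why pulling that much from those lines makes me nervous. The 75 W per-connector figures are the commonly quoted ratings for the cMP's PCIe slot and mini 6-pin boosters, and the 250 W TDP is Nvidia's spec for the Titan Black; treat them all as assumptions.

```python
# Back-of-the-envelope power budget for a flashed card in a Mac Pro 5,1.
# Assumed ratings: PCIe slot ~75 W and each mini 6-pin booster ~75 W
# (the commonly quoted figures); 250 W is the Titan Black's published TDP.

slot_budget = 75.0        # watts available from the PCIe slot
boosters = [75.0, 75.0]   # watts per mini 6-pin booster cable (two fitted)
card_tdp = 250.0          # watts, Titan Black TDP

budget = slot_budget + sum(boosters)
print(f"Available: {budget:.0f} W, card TDP: {card_tdp:.0f} W, "
      f"headroom: {budget - card_tdp:+.0f} W")
# -> Available: 225 W, card TDP: 250 W, headroom: -25 W
```

So even with both boosters connected, a 250 W card can nominally exceed the combined rating at full load, which is why spreading the draw across two cables seems prudent.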
 

Attachments

  • Screen Shot 2018-05-22 at 5.59.03 pm.png
Thanks. Looks like a solid 10+ FPS, or about 1/4 faster than my card. In compute tasks your card will really pull away from my GTX 770.

I think the Titan Black was the last Titan to be a "compute card"; the later ones got kneecapped on floating-point speed, at least until Vega came out and the current Titan got an update that gave it a massive compute speed boost.

OctaneBench is a nice compute benchmark. On the score page your Titan Black has 84 points and my GTX 770 has 47 points :eek: That's where you see the real speed difference.
https://render.otoy.com/octanebench/results.php?v=3.06.2&sort_by=avg&filter=&singleGPU=1
(make sure to tick the single-GPU option to see only single-GPU scores)
https://render.otoy.com/octanebench/
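To put those two scores in perspective, here's a trivial sketch; it assumes OctaneBench points scale linearly with render throughput, which is how the benchmark is meant to be read.

```python
# Comparing the two OctaneBench scores quoted above, assuming the score
# scales linearly with render throughput.

titan_black = 84.0  # points, from the OctaneBench results page
gtx_770 = 47.0      # points, from the OctaneBench results page

ratio = titan_black / gtx_770
print(f"Titan Black vs GTX 770: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% faster)")
# -> Titan Black vs GTX 770: 1.79x (~79% faster)
```

So in compute terms the Titan Black is nearly twice as fast, a much bigger gap than the game benchmarks suggest.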
 
I hope it's ok to resurrect an old thread. I'm trying to find out if my experience is normal with a new (to me) flashed Radeon 7970.

It's an MSI OC edition, which I believe is a reference card that's factory overclocked about 9% ... a little slower than the GHz Edition. The card works fine and benchmarks fine in my Mac Pro 5,1. But it's always kicking out a lot of heat. One of the reasons I chose this card was that in all the reviews its power consumption was low at idle. It's a power hog when working hard, but my workflow won't have it working hard for hours on end.

Anyway, I checked HardwareMonitor, and it shows power consumption at idle to be around 60 watts! Around 30 watts from the PCIe slot, and around 15 each from the booster cables. Review sites say it should only consume 10–14 watts at idle, so I'm drawing 4 to 6 times that.

Meanwhile, running full tilt in a stress test, I can't get it to pull more than 185 watts. So its maximum power is lower than expected, while its idle power is much higher.

Anything to be done about this? Not the end of the world, but it would be nice not to have a space heater under my desk all summer long.

FWIW, AMD advertises power-saving features on this card to reduce energy consumption at idle. Is it possible these only work under Windows?

Thanks for any thoughts.
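To spell out the arithmetic behind that idle number (all figures are the ones just quoted, nothing measured beyond what HardwareMonitor reported):

```python
# Sanity check of the HardwareMonitor idle readings quoted above.
# The per-rail numbers and the 10-14 W review figure come from the post.

pci_slot = 30.0                 # watts reported for the PCIe slot
booster_cables = [15.0, 15.0]   # watts reported per booster cable

measured_idle = pci_slot + sum(booster_cables)
expected_low, expected_high = 10.0, 14.0   # review-site idle range, watts

print(f"Measured idle: {measured_idle:.0f} W")
print(f"Vs. reviews:   {measured_idle / expected_high:.1f}x to "
      f"{measured_idle / expected_low:.1f}x the expected idle draw")
# -> Measured idle: 60 W
# -> Vs. reviews:   4.3x to 6.0x the expected idle draw
```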
 
I hope it's ok to resurrect an old thread. I'm trying to find out if my experience is normal with a new (to me) flashed Radeon 7990. ...
If you have 2 or more monitors, power-save mode is disabled. That's why I changed to an EVGA GTX 680 for Mac; the card only starts to be power hungry when 3D is used.
 
I hope it's ok to resurrect an old thread. I'm trying to find out if my experience is normal with a new (to me) flashed Radeon 7990. ...

Are you sure your card is the 7990?
 
Are you sure your card is the 7990?

It is not! That was a terrible typo. It's a 7970. Fixed in the original. Thanks for catching it.

If you have 2 or more monitors, power-save mode is disabled. That's why I changed to an EVGA GTX 680 for Mac; the card only starts to be power hungry when 3D is used.

That's interesting, thanks. And annoying. I replaced a 680 with this one because the 680 was causing all kinds of problems, including refusing to wake from sleep.

If there's only one monitor, how does power save mode work? I read that one feature is powering the card down when the monitor is turned off, but that doesn't seem useful. Does it allow the card to draw less power when it's not doing GPU-intensive work?
 
If there's only one monitor, how does power save mode work? ... Does it allow the card to draw less power when it's not doing GPU-intensive work?
78xx and 79xx cards with just one monitor, doing 2D-only work, turn off units and enter a power-saving mode; power usage drops to around 15 watts or less. With 2 monitors it jumps to 50 W or more.

Check here https://forums.macrumors.com/thread...tion-chart-older-cards.2071191/#post-25105831
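For a rough sense of what that dual-monitor idle penalty adds up to, here's a sketch. The 15 W and 50 W idle figures are the ones above; the hours per day and electricity price are made-up assumptions for illustration.

```python
# Rough estimate of the extra idle energy on a 79xx card when power-save
# mode is disabled by a second monitor. The 15 W / 50 W figures come from
# the post above; hours/day and price are hypothetical assumptions.

single_monitor_idle = 15.0   # watts, power-save mode active
dual_monitor_idle = 50.0     # watts, power-save mode disabled
hours_per_day = 8            # assumption: machine idling at the desk
price_per_kwh = 0.15         # assumption: USD per kWh

extra_watts = dual_monitor_idle - single_monitor_idle
extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
print(f"Extra idle draw: {extra_watts:.0f} W, "
      f"~{extra_kwh_per_year:.0f} kWh/year, "
      f"~${extra_kwh_per_year * price_per_kwh:.0f}/year at $0.15/kWh")
# -> Extra idle draw: 35 W, ~102 kWh/year, ~$15/year at $0.15/kWh
```

Not a fortune, but it is a constant 35 W of extra heat under the desk, which matches the space-heater complaint.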
 
78xx and 79xx cards with just one monitor, doing 2D-only work, turn off units and enter a power-saving mode; power usage drops to around 15 watts or less. With 2 monitors it jumps to 50 W or more.

Awesome, thanks. That explains it.

After Mojave ships, I'll wait and see if there are some workarounds. My hope is that a non-Metal GPU like the GT 120 will still work as a 2nd card for the 2nd monitor, without interfering with anything. I won't bet on it yet, but it seems like it would solve the problem.
 
After Mojave ships, I'll wait and see if there are some workarounds. My hope is that a non-Metal GPU like the GT 120 will still work as a 2nd card for the 2nd monitor, without interfering with anything. ...
Tests done with Windows show that the GT 120 without drivers goes to 75ºC at idle. It's the driver that sets the power management.

If I were you, I'd start with plan B.

Edit: 75ºC, not 75W.
 
Tests done with Windows show that the GT 120 without drivers uses 75W at idle. It's the driver that sets the power management.

If I were you, I'd start with plan B.

Where did you see that? I haven't found any tests that show power usage at idle, but every spec I've read, including Nvidia's, says the maximum power draw is 50 W.
 
Where did you see that? I haven't found any tests that show power usage at idle, but every spec I've read, including Nvidia's, says the maximum power draw is 50 W.
Sorry, not 75 W without a driver, 75ºC. @h9826790 tested this exhaustively. Check his posts about the GT 120.
 
Thanks. I need to stop worrying about it for now and see what happens when Mojave ships. If a GT 120 turns out to be compatible, it wouldn't be too big a deal to rent one from eBay and see if it makes a real difference.
 