
imrazor

macrumors 6502
Original poster
Sep 8, 2010
Dol Amroth
I recently picked up a MacPro5,1 with the stock ATI 5770. One of the first things I did was replace it with an RX 580. However, since installing it, the card has been running hot, at least under Boot Camp. I found that after about an hour of intensive use, the screen would randomly blank out for 2-3 seconds. I used MSI Afterburner (a popular PC utility for reading diagnostics off GPUs, as well as overclocking) and found temps of around 45°C at idle, hitting 75°C after 2-3 minutes of a benchmark.

So I used Afterburner to set a custom fan curve, which is loud but keeps the card cool and stable. But it seems like this shouldn't be necessary. Do GPUs tend to run hot in Mac Pros? I've got the case shoved into an area with mediocre air circulation, but I'm dealing with a small space and don't have much choice.
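For what it's worth, a custom fan curve like the one Afterburner applies is just a temperature-to-duty mapping with linear interpolation between breakpoints. A minimal sketch; the breakpoints here are made up for illustration, not the actual settings used:

```python
# Sketch of a custom GPU fan curve as linear interpolation between
# (temperature in C, fan duty %) breakpoints. The breakpoints are
# illustrative, not the values actually set in Afterburner.
CURVE = [(30, 25), (50, 40), (65, 70), (75, 100)]

def fan_duty(temp_c):
    """Return fan duty % for a temperature, interpolating the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(45))  # 36.25 -> between the 30C and 50C breakpoints
```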
 
I think the RX 580 gets too much praise around here. The only reason it's worth talking about is that it's the card that can get you into Mojave, at least currently. It's a GTX 1060 at best, which is only good for "solid" 1080p gaming. The problem? It generally runs hotter, has a higher TDP, and is comparatively inefficient. It behaves like an overclocked RX 480, which it basically is.

Back to the topic. Some RX 580s need undervolting out of the box; others seem to be dialed in pretty well from the factory. It just depends.

However, no vanilla RX 580 should be blacking out the screen like that. Overheating usually causes other symptoms rather than a blanked screen.

The Mac Pro case is fine for GPUs.

I would download AMD Wattman, and then proceed to undervolt the graphics card (running stress tests) and leave the fan curve alone (reset it). I was surprised (look up my VR thread) by how many mV I was able to undervolt, and how much free performance and reduced heat I got by doing so. Wattman loads every time Windows starts, so your profile is applied automatically.

Undervolting has zero downsides and many upsides.

Delete MSI Afterburner, you won't need it with Wattman.

By the way, your temps are "fine" for an RX 580... those are "normal" temps, but it's always worth trying for better. Wattman will also tell you whether your card is even hitting its boost speeds consistently in gaming. That's one of the reasons to undervolt poorly optimized cards: so you can hold your boost speeds.
 
This card is a bit unusual in that it is a 'pull' from a Dell OEM system. It looks exactly like a reference RX 480. I surmise that Dell had a backlog of 480s when the 580 was released, and simply overclocked the BIOS and re-flashed it with a 580 identifier. So it's using the stock 480 blower instead of the better twin/triple fan design of most partner RX 580s. It runs at 1266 MHz core clock speed, but at least it has 8GB of VRAM. But I also know it doesn't run this hot in a regular PC, so I'm at a loss to figure out why it's running so hot in the Mac Pro.

I installed the latest Radeon drivers in Windows, which comes with Wattman. But I'm much more familiar with MSI Afterburner (which can also start with Windows), so I'm a bit hesitant to dive into unknown software.
 
My MSI RX580 is a blower design....no problems with it.

You're looking into a problem that isn't there. Those temps are normal for the RX580. The RX580 is not an efficient card and runs hot.

I provided instructions on how to undervolt it. AMD Wattman is part of the driver package, so it isn't third-party software; you just have to enable it (Google it). MSI Afterburner is third-party, and Wattman is much more complete and allows undervolting. Wattman will also graph your clock rates over time, so it's a great way to see what's going on and how much/how often the card boosts during gaming. In about 30 minutes I'd have that card optimized with Wattman.
 
This card is a bit unusual in that it is a 'pull' from a Dell OEM system. It looks exactly like a reference RX 480. I surmise that Dell had a backlog of 480s when the 580 was released, and simply overclocked the BIOS and re-flashed it with a 580 identifier. So it's using the stock 480 blower instead of the better twin/triple fan design of most partner RX 580s. It runs at 1266 MHz core clock speed, but at least it has 8GB of VRAM. But I also know it doesn't run this hot in a regular PC, so I'm at a loss to figure out why it's running so hot in the Mac Pro.

I installed the latest Radeon drivers in Windows, which comes with Wattman. But I'm much more familiar with MSI Afterburner (which can also start with Windows), so I'm a bit hesitant to dive into unknown software.
This Dell card is nothing but trouble when installed in a MP5,1. It's an RX 480 factory-modded with RX 580 firmware, and it draws more power from the PCIe slot than the spec allows. There are lots of reports of problems like yours, and shutdowns too. It will probably get worse.

It's a bad fit; try to exchange/replace it with a more compatible card like the Sapphire Pulse RX 580.
 
Good info!

If you are broke, in the meantime, I bet undervolting will make it safe...
The problem is not the cooling solution but the way the card's power circuit was designed. Undervolting won't change that. AMD's initial reference design had serious flaws that were corrected after Dell produced these cards, and no one else used the reference design once the problems were discovered.

Keeping the card cool helps, but it's just a stopgap.
 
It will draw less power with undervolting... which has nothing to do with the "cooling solution" or fan curve. If it were my system, I'd do it right away. I mentioned nothing of the sort about cooling.

Undervolting will certainly change the power draw which directly influences the "power circuit." Absolutely 100% fact in all cases.

If you can drop it 50 mV on most power stages, I would bet it would be a serviceable card. I dropped a reference MSI 580 blower card down from stage 2 and beyond by -90 mV and stress tested the **** out of it... zero hiccups and 100% boost all the time.

If it barfs at -10 mV, then I would immediately toss it.
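The try-and-back-off procedure being suggested (drop the offset in 10 mV steps until the card barfs, then keep the last stable offset) can be sketched as a loop; `is_stable` stands in for a real stress test and is purely hypothetical:

```python
def find_max_undervolt(is_stable, step_mv=10, limit_mv=150):
    """Step the voltage offset down 10 mV at a time until the stress
    test fails, then return the last offset that passed."""
    offset = 0
    while offset + step_mv <= limit_mv:
        if not is_stable(-(offset + step_mv)):
            break                      # card barfed: back off one step
        offset += step_mv
    return -offset                     # e.g. -90 means -90 mV was stable

# Pretend this card is stable down to -90 mV (like the MSI card above):
print(find_max_undervolt(lambda mv: mv >= -90))  # -90
```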

Keep in mind his temps are completely normal for an RX 580. Do a Google search; plenty, and I mean plenty, of people report the same temps with 2-3-fan third-party RX 580s. That's why I implied it's not temperature or cooling related.

I'm not saying the Dell card isn't flawed, but I haven't seen evidence that it's a "throwaway" without trying a simple undervolt first. Undervolt it. Test. Learn. Report back. Save money, or perhaps it is flawed and doesn't work. But you should TRY first.

Throwing it away and buying another one without trying first means we don't get to learn anything here if undervolting it to where it should be was all it took.
 
Here is a thread where I took my reference MSI RX 580 down by -90 mV and obtained my highest benches... no doubt the card runs cooler and better.


-90 mV is pretty impressive. It's worth trying in 10 mV increments to see where yours lands. Your problem may just go away without spending a stupid amount of money on a crappy, old, outdated RX 580, which are all going to be pretty overpriced at this point.

In fact, I maybe could have gone further, but I thought -90 mV was a serious enough achievement that I just stopped. Why not suggest the same to the OP rather than making him give up? I'm definitely all ears if this has been tried before, but if it hasn't, there's no room to talk, and I will still recommend what I'm recommending.
 
Here is a thread where I took my reference MSI RX 580 down by -90 mV and obtained my highest benches... no doubt the card runs cooler and better.


-90 mV is pretty impressive. It's worth trying in 10 mV increments to see where yours lands. Your problem may just go away without spending a stupid amount of money on a crappy, old, outdated RX 580, which are all going to be pretty overpriced at this point.
It's not the same; you're comparing apples and oranges, since your GPU has an 8-pin PCIe power connector and will never pull more than the spec from the PCIe slot.

The Dell faked RX 580 draws more than 75 W from the PCIe slot, while all other cards, even when running FurMark, never get near the limit; most stay around 30-35 W.

It's a flawed design. Dell even rates it at 160 W, with just one 6-pin PCIe power connector. This card should never be used in a MP5,1.

[Attached image: 490-BEET_mvi1.jpg]
 
Now that I can see it has only one 6-pin power plug, that makes sense. I'd still want to undervolt it to see what happens. You never know with the silicon lottery...

Either way, if it's pulling more than 75 W from the PCIe slot, I'd veer away. Good info, thanks. That's not a Mac Pro 5,1 issue; that's an any-PC issue.
 
found temps at idle to be around 45C and hitting 75C after 2 - 3 minutes of a benchmark.

That temperature is normal. The card is designed to run at those temperatures (both idle and under stress).

I don't know the stock voltage of this particular Dell RX 580, but you may try 1243 MHz at 980 mV and slowly reduce it to find the minimum stable voltage. I believe most RX 580s should be able to run at a lower voltage at this clock speed. Also, reduce the VRAM speed to 1750 MHz @ 975 mV.

This setting is NOT a random clock speed, but a direct copy from the WX 7100, which uses the same chip but with a 130 W TDP.

AFAIK, the RX 580 balances its power draw quite evenly. In your case, if it wants to draw 160 W, that's 80 W from the slot and 80 W from the 6-pin. Of course, this is bad news for users. However, if we can reduce the card's actual power draw to below 150 W (even under FurMark), the draw through the slot should never go above 75 W. Therefore, undervolting may be able to help in this case.
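Under the even split described above, it's easy to check when the slot stays in spec; a sketch of that arithmetic:

```python
SLOT_LIMIT_W = 75  # PCIe slot power spec

def slot_draw(total_w, slot_share=0.5):
    """Power pulled through the slot, assuming the card splits its
    draw evenly between the slot and the 6-pin (as described above)."""
    return total_w * slot_share

for total in (160, 150, 143):
    w = slot_draw(total)
    status = "OVER spec" if w > SLOT_LIMIT_W else "within spec"
    print(f"{total} W total -> {w:.1f} W from the slot ({status})")
```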

In general, there is no need to use Afterburner. Using Wattman alone is better and safer (it has built-in fail-safe protection), and less confusing.

However, no matter which software you use, make sure the PowerTune setting is NO more than +30% in your case.

The default TDP for this Dell RX 580 is 110 W, but PowerTune is +50%, which means that if you allow it to stay at +50%, the maximum allowed power draw becomes 165 W.

TBH, if I were you, I would dump the ROM and mod that PowerTune limit to 30%, so that under default settings (including macOS) the card will be limited to 143 W.

Then, if the card draws half its power from the slot as expected, the power delivered via the slot will stay just below 75 W.

This should help maintain stability, lower the GPU temperature, and avoid frying the slot / logic board / etc.
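The PowerTune arithmetic above is worth making explicit; a quick sketch:

```python
def powertune_cap(tdp_w, powertune_pct):
    """Maximum allowed board power: TDP scaled by the PowerTune limit."""
    return tdp_w * (1 + powertune_pct / 100)

# Dell RX 580: 110 W TDP
print(f"{powertune_cap(110, 50):.0f} W")  # 165 W ceiling at the stock +50%
print(f"{powertune_cap(110, 30):.0f} W")  # 143 W ceiling with a +30% mod
```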

So I used Afterburner to set a custom fan curve, which is loud, but keeps the card cool and stable.

I suspect this helps NOT because the GPU is overheating. 75°C is really nothing. The temperature target of this particular card is 80°C, which means the default fan profile stays at low RPM until the GPU approaches 80°C, then slowly spins up to hold the GPU at 80°C. So 75°C is very, very normal. And this card sets the max temperature at 90°C; the shutdown and hotspot temperatures are even higher.

Then why does spinning up the fan make it run more stable?

Maybe because the card draws less power when it runs cooler. So, in your case, spinning the fan up more keeps the card's power draw within the "stable range".

Of course, this is just my personal guess, but it can happen.
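The temperature-target behaviour described above can be pictured as a simple controller: the fan idles below the target, then ramps proportionally above it. A toy sketch with made-up gains, not AMD's actual fan algorithm:

```python
def target_fan_duty(temp_c, target_c=80, idle_duty=20, gain=8):
    """Fan duty %: stay at idle RPM below the target, then ramp up
    proportionally to how far the GPU is above it. Illustrative only."""
    if temp_c < target_c:
        return idle_duty
    return min(100, idle_duty + gain * (temp_c - target_c))

print(target_fan_duty(75))  # 20 -> 75C is below target, fan stays low
print(target_fan_duty(83))  # 44 -> fan ramps to pull temp back to 80C
```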

Anyway, I created a post about my RX 580 ROM study here. Even though it focuses on the PULSE RX 580, you may still want to have a look.

 
Thanks for all the great info guys.

I've used this RX 580 in 3 systems (including the original Dell it came in) and never had problems with it. Did Apple, unlike those PC makers, just not 'overprovision' the PCIe power rail?

I'm not sold on modding the BIOS. I don't want to take a chance on bricking a card that functions. Two other possibilities occur to me. One, I could limit the power draw with Afterburner/Wattman. There's a slider in Afterburner to adjust the power level from -50% to +50%, and I assume a similar function exists in Wattman. Hmm, that probably wouldn't work for Mac OS, would it?

For option B - why not undo Dell's hackery, and just flash the thing with a stock reference RX 480 ROM? Would that get the PCIe power draw back within spec?
 
Thanks for all the great info guys.

I've used this RX 580 in 3 systems (including the original Dell it came in) and never had problems with it. Did Apple, unlike those PC makers, just not 'overprovision' the PCIe power rail?
These initial reference cards fried a lot of PCs' x16 slots before AMD patched the drivers. It's stupid to draw more current than the spec; you never know whether the hardware designers overprovisioned or not.

The MP4,1/MP5,1 is very sensitive to overconsumption of PCIe slot power but very tolerant on the PCIe booster connectors.

I'm not sold on modding the BIOS. I don't want to take a chance on bricking a card that functions. Two other possibilities occur to me. One, I could limit the power draw with Afterburner/Wattman. There's a slider in Afterburner to adjust the power level from -50% to +50%, and I assume a similar function exists in Wattman. Hmm, that probably wouldn't work for Mac OS, would it?

For option B - why not undo Dell's hackery, and just flash the thing with a stock reference RX 480 ROM? Would that get the PCIe power draw back within spec?
The problem is not the firmware fake-mod to RX 580, but that the card has just one PCIe power connector and its power circuit draws more current from the PCIe slot than the spec allows. You can't change the hardware design, but you can downvolt it to get below the spec.

There are reports here that even the Dell RX 480 reference card, the one without the firmware mod, makes the MP5,1 shut down under high load.
 
Delete afterburner. Get rid of it.

Open up Wattman, downvolt the stages, and see how it plays. It's not flashing or breaking anything. To make it really simple, only downvolt the last stage, in increments of -10 mV. Rinse and repeat until it fails during load testing. Let's say there are 6 stages: you want to go to the 6th stage and start there. During testing you may find that your 6th stage ends up lower than the 5th stage. When that happens, make the 5th stage the same as the 6th. You don't need to fine-tune individual stages; just carry your tuned top stage across the rest of the stages wherever it's lower than they are. Make sense?

Example:

Before
Stage 1: 680mv
Stage 2: 800mv
Stage 3: 990mv
Stage 4: 1050mv
Stage 5: 1083mv
Stage 6: 1115mv

After
Stage 1: 680mv
Stage 2: 800mv
Stage 3: 990mv
Stage 4: 1048mv
Stage 5: 1048mv
Stage 6: 1048mv
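The rule in the Before/After tables above, carrying the tuned top stage across any lower stages, amounts to clamping each stage at the new top-stage voltage; a sketch:

```python
def carry_top_stage(stages_mv, tuned_top_mv):
    """Clamp each P-state voltage to the undervolted top stage, as in
    the Before/After example above."""
    return [min(v, tuned_top_mv) for v in stages_mv]

before = [680, 800, 990, 1050, 1083, 1115]
print(carry_top_stage(before, 1048))
# [680, 800, 990, 1048, 1048, 1048] -> matches the After table
```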

There are some utilities, I forget which off the top of my head, but maybe Wattman does this, that show peak watt consumption. You can compare before vs. after downvolting. Like h98 said, you want to limit this card's consumption to 160 W max at peak. You should be able to calculate whether you met that goal with downvolting alone.

If not, like h98 said, it's time to start reducing the clock speed; he gave you some numbers to use (which I would guess are the RX 480's speeds). Then go back to downvolting with those new numbers plugged in.

You are correct that the moment you get into macOS, it's game over unless you reflash the card with the lower voltage settings you've uncovered in Wattman. But at that point, if you do what we suggest, you've already put in the work, and flashing is easy since you KNOW what the numbers are.

The card is flawed, but it sounds like something that would be fun to toy with. I would not run it in the system without "fixing" it like we're suggesting.
 
There are two reasons I don’t like Wattman:

1) It’s too effing complicated.
2) I seem to always get this message after making changes in Wattman and then rebooting:
Default Radeon WattMan settings have been restored due to unexpected system failure

That occurs not only with this RX 580, but also my MSI Vega 56.

That error in particular drives me nuts, so I just use Afterburner, which seems to work just fine. Any particular reason folks here dislike it?
 
Afterburner is actually more complicated for undervolting, if you can get it to work at all. I never got it to work for me. Having to plot voltage graphs yourself, where Wattman does it for you, makes your 1) completely wrong. In fact, if I stretch my memory back far enough, MSI Afterburner doesn't even ALLOW you to play with voltages unless you follow some special instructions... and even then it probably won't work.

Give Wattman 10 minutes and a fresh mindset before bashing it. Believe me, I was an Afterburner fan for years, back when OCing GPUs was a cool thing (and it's not anymore), but that is NOT the program you want to be using right now for a card that needs UNDERVOLTING so that it is safe to run in your system. Tsialex is providing some very serious and correct warnings...

For 2), you need to provide information about what you tried doing before it barfed. The failure means you did something, or the card is doing something, that causes a hard stop. It should report an error... that's how it protects your system. What did you do to make it throw an error? Try doing nothing and running a game or something. Record the power draw, boost speed, etc.

If it throws an error when you haven't touched anything at all, well, that's exactly why we're telling you that you NEED to start reducing the power draw. The way to do that FAST is to concurrently reduce clock speeds (h98's post) and reduce the voltage the card is ALLOWED to pull from the system. Wattman as-is may be correctly reporting that your system is NOT stable. You should actually congratulate it if so... which is proof that MSI Afterburner just ain't smart, and is rather on the stupid side.

You have the information in this thread to make it happen. You either play around with it and make it work, or you pull it out and get something that does work. Your choice.

The MSI Vega 56 is under the TDP limits of the Mac Pro, so I can't tell you why Wattman doesn't like it. Methinks you have something else going on at this point... driver issues?
 
So I tried some experiments with Wattman and Afterburner. In Wattman, I tried applying a -10mV and -15mV undervolt. And it worked, and did not reset. I tried using Afterburner for the overlay and to set the fan curve, and Wattman still did not have any problems retaining settings. I think the problem is the 'special instructions' to enable voltage control with Afterburner. The resets I was describing were on other systems with Afterburner voltage control turned on, not the Mac Pro.

I also monitored the card's power draw with the AMD performance overlay (CTRL-SHIFT-O) that AMD has baked into the current driver. It showed that power draw did not exceed 110W even when running TimeSpy Extreme. It could be that it is only showing 6-pin power draw, but if that's true it's really scary to be pulling that much power through a connector rated for 75w. That would also mean total power draw is 185w, which seems unlikely for this card even with Dell's BIOS hack.

As far as Vega 56 power consumption goes, the card is rated at 165 W (it's the Air Boost model), which slightly exceeds the Dell's 160 W. It also has two 8-pin connectors (!!). I have tried booting it in the Mac Pro but just got a black screen. That may be because the card refused to boot with two 6-pin connectors in the 8-pin sockets.

So I'm going to go back to Windows and Wattman and try some more undervolting. I may have missed this in the conversation, but why not just impose a 90% power limit (144w) instead of undervolting? And as far as playing with the BIOS power tables, I'm afraid of doing something horribly wrong and flashing the card with a broken BIOS that completely bricks the card.
 
I may have missed this in the conversation, but why not just impose a 90% power limit (144w) instead of undervolting?

Because undervolting can improve performance, while limiting the max power draw will only reduce performance.

If all you want is to limit the power draw to improve stability, then of course you can just reduce the power limit.
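This difference falls out of the usual first-order model for dynamic power, P ≈ C·f·V²: undervolting lowers V at the same clock, so performance is untouched, while a power limit caps P by pulling the clock down. A toy illustration; the constant and voltages are made up, not real RX 580 values:

```python
def gpu_power(freq_mhz, volts, c=1e-7):
    """First-order dynamic power model: P ~ C * f * V^2 (toy constant)."""
    return c * (freq_mhz * 1e6) * volts ** 2

stock = gpu_power(1266, 1.15)        # stock clock at a made-up voltage
undervolted = gpu_power(1266, 1.05)  # same clock, 100 mV lower

print(f"stock:       {stock:.0f} W")
print(f"undervolted: {undervolted:.0f} W  (same clock, ~17% less power)")
# A power limit instead holds P down by reducing freq_mhz: less performance.
```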
 
So I tried some experiments with Wattman and Afterburner. In Wattman, I tried applying a -10mV and -15mV undervolt. And it worked, and did not reset. I tried using Afterburner for the overlay and to set the fan curve, and Wattman still did not have any problems retaining settings. I think the problem is the 'special instructions' to enable voltage control with Afterburner. The resets I was describing were on other systems with Afterburner voltage control turned on, not the Mac Pro.

I also monitored the card's power draw with the AMD performance overlay (CTRL-SHIFT-O) that AMD has baked into the current driver. It showed that power draw did not exceed 110W even when running TimeSpy Extreme. It could be that it is only showing 6-pin power draw, but if that's true it's really scary to be pulling that much power through a connector rated for 75w. That would also mean total power draw is 185w, which seems unlikely for this card even with Dell's BIOS hack.

As far as Vega 56 power consumption goes, the card is rated at 165 W (it's the Air Boost model), which slightly exceeds the Dell's 160 W. It also has two 8-pin connectors (!!). I have tried booting it in the Mac Pro but just got a black screen. That may be because the card refused to boot with two 6-pin connectors in the 8-pin sockets.

So I'm going to go back to Windows and Wattman and try some more undervolting. I may have missed this in the conversation, but why not just impose a 90% power limit (144w) instead of undervolting? And as far as playing with the BIOS power tables, I'm afraid of doing something horribly wrong and flashing the card with a broken BIOS that completely bricks the card.

1) You didn't listen. Don't mess with fan curve. Leave it alone. Uninstall Afterburner.

2) So you made it -15mV undervolt. That's minor, but noteworthy, kinda. I bet you could go more...

3) Stop playing with Afterburner. Uninstall it.

4) I doubt a Vega 56 with 2x 8-pin connectors has a 165 W TDP. Each 8-pin is rated at 150 watts: 150 x 2 = 300. Guess what? The PCIe slot is 75 W, for 375 W total. I call BS. In general, you should NEVER stick a card that needs more than an 8-pin connector in a 4,1/5,1 Mac Pro.

5) You cannot use a 6-pin connector to power an 8-pin connector. 6-pin = 75 watts; 8-pin = 150 watts. They aren't even close. The fact that it didn't even boot already tells you this.

6) If you impose a "power limit" you are not improving the card's efficiency, and I've already mentioned that RX 580s range from very inefficient to very efficient depending on brand, model, and the silicon lottery. Undervolting is tuning the efficiency of the card: giving it exactly what it needs, and nothing more, to run at the desired speed. Certain bins are really crappy; certain bins are great. The RX 580 is not very good compared to Nvidia's Pascal, which doesn't need undervolting at all. Reducing the power limit without undervolting is just a blanket for "keep running the way you are running, however crappy that is, but I'm only giving you 80% of the power instead." Make sense?

7) Uninstall afterburner. I'm pretty damn sure AMD Wattman gives you the same controls.
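The connector arithmetic in points 4 and 5 can be tallied mechanically; a small sketch using the standard PCIe power ratings (slot 75 W, 6-pin 75 W, 8-pin 150 W):

```python
# Standard PCIe power budget figures: slot and aux connector ratings.
RATING_W = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(*aux_connectors):
    """Total power a card may draw per spec: the slot plus each
    auxiliary power connector it carries."""
    return RATING_W["slot"] + sum(RATING_W[c] for c in aux_connectors)

print(max_board_power("8pin", "8pin"))  # 375 -> the 2x 8-pin Vega case
print(max_board_power("6pin"))          # 150 -> the Dell RX 580's budget
```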

I think I'm done here. You don't give me any reason to keep helping you. You stopped at -15 mV but never explained why. Then you proceeded to stick a Vega 56 into the system and power it incorrectly, further endangering your system. Then you kept Afterburner installed. I guess all I can say is that I wish you well, but more importantly, I hope the Mac Pro survives the noob onslaught and abuse it's getting.

This is why you do not buy computers on eBay, folks.
 
I'm using an EVGA PowerLink with my MSI Armor RX 580 8GB.

I set up the PowerLink like this.
( Note : I'm using the PowerLink as a bridge - it sits on top of my PCI area fan. )

==============================================
POWER to the PowerLink

2 X Mini to 1 X 8pin to PowerLink
+
2 X SATA to 1 X 6 pin to PowerLink 8 pin input ( right side of PL 8 pin input socket )
+
PCIe slot power

= = = = = = = = = = = = = = = = =

Power FROM the Powerlink TO the RX 580

1 X female 8 pin to Male 8pin cable to the RX 580.

===============================================

Theoretically, the above should supply more than enough balanced power to the RX 580 8GB.

Any positive/negative comments on this setup?
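Very roughly, the available power can be tallied, with heavy caveats: 75 W per Mac Pro mini 6-pin booster is the commonly cited figure, and the ~54 W for the dual-SATA feed is my own assumption (two 12 V SATA lines), not a spec:

```python
# Rough tally of what the PowerLink arrangement above could supply.
# 75 W per Mac Pro mini 6-pin booster is the commonly cited figure;
# the ~54 W for the dual-SATA feed is an assumption, not a spec.
SOURCES_W = {
    "mini 6-pin booster A": 75,
    "mini 6-pin booster B": 75,
    "2x SATA -> 6-pin": 54,
    "PCIe slot": 75,
}

total = sum(SOURCES_W.values())
print(f"~{total} W available to the card")  # ~279 W
```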
 
@fendersrule You seem to have a basic misapprehension about my purpose with this Mac Pro. It's a 'project', an experiment to see what is possible with a Mac Pro. If I manage to fry it, then I'll be sad, but I'll eat the cost and move on. I'm also trying to use spare parts on hand as opposed to buying new parts. I have no intention of creating a rock solid video editing workstation. At best, it will end up as a backup gaming rig doing some occasional light video editing. If it ends up a little unstable, or bites the dust, that's OK.

As far as Afterburner is concerned I was continuing to use it in order to see if I could duplicate the errors I was seeing on other systems. That's why I made such a minor change as -15mV; I was trying to see if I could get Wattman to produce the same error I had been seeing on my gaming rig. It didn't, so I'm surmising that enabling Afterburner's voltage control fubars Wattman. So you were right about Wattman working properly.

As far as the Vega 56 power usage goes, here are the specs:

MSI shows 210w, which is far less than the 375w which 2x8-pin + PCIe slot power provides. Why? So overclockers can choose the +50% power option and push 330w through the card.

Furthermore, users in this thread report that 2x 6-pin connectors do work with some cards, especially when using the secondary 150 W BIOS on board the Vega 56. See here:
My Vega 56 does not appear to be among them. From that thread, it seems my best option for getting the Vega working is a 3x SATA-to-8-pin adapter, which should theoretically provide enough power for overclocking.
 
I used Afterburner for years before my RX 580, but hit lots of problems with the AMD drivers; any OC, or even just using the overlay, caused trouble.

unexpected system failure

If you're seeing that, something is wrong; I saw it with bad OC settings.

I have only had good luck with Wattman.
(I did kill one Windows install when playing with OC and seeing that pop-up. I assume bad OC settings did it; I had to do a clean install to fix it.)

The Dell GPU will be designed to work fine in their systems running at stock only. Dell makes custom, non-standard systems, so do not treat their parts as 'normal' off-the-shelf parts.

If you're interested in playing with OC, try Buildzoid's YT channel: https://www.youtube.com/channel/UCrwObTfqv8u1KO7Fgk-FXHQ
He has covered both the Vega 56 and the RX 480/580 fairly well.
Most GPUs will target about 80°C, some 90°C, and tend to be OK (but not good) hitting around 100°C before thermal cutoff.
It is better to keep the GPU cooler: as it gets hot, power use goes up, and when the power stages run hot, their lifespan shortens faster.

For power use of the card in macOS, I use Hardware Monitor.
-running LuxMark stress test-
[Screenshot: Hardware Monitor readings during LuxMark]

That's with a modded BIOS set to run slower than stock; I don't like fan noise and don't mind it being slower.
-FurMark stress test-
[Screenshot: Hardware Monitor readings during FurMark]

If I add up Booster 1 + 2 (that's the two 6-pins on the mobo), it's about the same as what Wattman reports in Windows.
So with my card, I can see the driver is not reporting all power use, just what the 8-pin is pulling (I have two 6-pins into the single 8-pin on the GPU), and it's ignoring the PCIe slot power.
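That reconciliation, where the driver reading covers only the aux connectors and misses the slot, is simple addition; the numbers below are placeholders, not the readings from the screenshots:

```python
def total_card_power(booster1_w, booster2_w, slot_w):
    """True card draw: both booster cables (feeding the 8-pin) plus
    the PCIe slot power that the driver reading ignores."""
    return booster1_w + booster2_w + slot_w

# Placeholder values, not the actual Hardware Monitor readings:
print(total_card_power(70, 45, slot_w=40))  # 155 -> the driver would
                                            # report only 70 + 45 = 115
```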

Buildzoid measures a lot of his hardware by hand (he highlights how far off some HW/SW readings are compared to reality).

One thing to also point out is that all software power readings tend to give an average; hardware tends to spike power use hard for short periods, which is not always measured by software, or is averaged out (that's why Buildzoid hand-measures most things).
Vega cards are well known for high spikes.

Also want to point out that not all GPUs are of the same quality. Buildzoid does PCB breakdowns of lots of GPUs/mobos, which really show the variation in parts used.

Oh, and this video shows some of the fun of the Vega 56.

edit
Just want to add that a Mac Pro is not a good OC computer and all of this is done at your own risk. In the end, I undervolted and underclocked my RX 580 to keep fan noise down.
 
@orph Yes, I had fun with that Vega 56 in my gaming rig. I had good success undervolting it and then applying a +50% power limit. In the end, the simplest way to mod it was to flash it with a Vega 64 BIOS, which automatically OC'd the memory (I got lucky and got a model with Samsung HBM2 instead of Hynix) and applied a much higher power limit (~220 W on the PCIe connectors, IIRC).

I've got an EVGA PowerLink that will hopefully arrive today. I've already re-flashed the Vega 56 with the 150 W 'efficiency' BIOS and will attempt to get it working with the 2x 6-pin connectors routed through the PowerLink. Later I may try finding a 2x or 3x SATA power adapter and bumping up the power limit plus undervolting.
 