
ZZ Bottom

macrumors 6502a
Original poster
Apr 14, 2010
I know this question has been asked many times, but I cannot seem to locate a conclusive answer for my specific spin on it:

I received an EVGA GTX 1080 FTW which has two 8-pin power inputs. It came with two dual 6-pin to single 8-pin adapters. The Mac Pro only has two 6-pin cables.

Can I use one adapter for each, leaving the 2nd 6-pin unconnected? In other words, will it function in this arrangement:
Mac Pro single 6-pin --> dual 6-pin (only one connected) to 8-pin --> GTX 1080
Mac Pro single 6-pin --> dual 6-pin (only one connected) to 8-pin --> GTX 1080

In case it matters, I am not using the Mac Pro for gaming. The Nvidia card is only intended to speed up some Adobe app functions, like 3D object painting in Photoshop and 3D extruding in AE.
 
I’d send it back and get a Founders Edition reference card. It has a single 8-pin and works well.

It is company purchased and not exchangeable. I bought a 1080 Ti SC2 with a single 8-pin for my personal machine at home, so I was a bit taken aback when I unboxed this 1080 FTW and saw dual 8-pins...

I have the GTX 1080 Founders Edition and it works great.

Picked up a Dual Mini PCI-E 6-Pin to Standard PCI-E 8-Pin cable (https://www.moddiy.com/products/Apple-Dual-Mini-PCI%2dE-6%2dPin-to-Standard-PCI%2dE-8%2dPin-Video-Card-Cable.html) to use with my 5,1 when replacing the GTX 680 (official Mac version) that was in there previously.

The card came with two of those adapters. What I'm asking is if I can use one adapter for each 8-pin input. Thus converting a single 6-pin into a single 8-pin twice.
 
The link above is for a TWO-6-pin-to-ONE-8-pin cable. The 6-pin on the 5,1's board is not the standard 6-pin size, so the adapters the card shipped with may not fit.

You need to find out what the power requirements of the card are. It may be possible if it draws under 225 watts; if it draws more than 225 watts, you may need an external PSU. Technically, each of the 6-pins is rated at 75 watts, plus you get 75 watts from the PCIe slot = 225 watts total.

8-pin power is technically 150 watts. With two 8-pins, your card MAY draw 300 watts or more.
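That arithmetic can be sketched as a quick back-of-envelope check, using the nominal spec figures quoted in this thread (not measured values):

```python
# Nominal power budget for a classic Mac Pro (4,1/5,1), per the
# PCIe spec figures discussed above. These are ratings, not limits
# the hardware actually enforces.

SLOT_W = 75        # PCIe x16 slot, spec limit
MINI_6PIN_W = 75   # each mini 6-pin, official rating
PCIE_8PIN_W = 150  # standard PCIe 8-pin connector rating

# What the Mac Pro nominally provides: slot + both mini 6-pins
total_available = SLOT_W + 2 * MINI_6PIN_W
print(total_available)  # 225

# What a dual 8-pin card could nominally request from a standard PSU
dual_8pin_card_max = SLOT_W + 2 * PCIE_8PIN_W
print(dual_8pin_card_max)  # 375
```

So a dual 8-pin card is specced to request up to 375W, while the cMP only nominally supplies 225W; the rest of the thread is about whether the card actually draws that much.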
 

One of the easier ways would be:

[dual mini 6-pin -> single 8-pin] + [single 8-pin (female) -> dual 6+2-pin]

I am using this bridge method to power my 1080 Ti. If this cable combination can power a 250W graphics card, it should be very safe for your 215W 1080 FTW.

The idea of the bridge in between is to balance the demand. No matter how the card is designed or programmed, the power draw is still distributed virtually evenly across the two mini 6-pins, which avoids the self-shutdown protection. This is how my 1080 Ti draws power under stress:

[Attachment: Screen Shot 2018-02-06 at 05.15.18.jpg — power-draw chart under stress]

As you can see, although my card has a 6+8-pin config, both mini 6-pins always work together to share the load, so the card won't over-stress one mini 6-pin while under-utilising the other.

One common wrong assumption is that your card will draw 75W from the slot, and therefore, since your card has a 215W TDP, each 8-pin will only draw 70W (70 + 70 + 75 = 215), so you can connect each of them to its own mini 6-pin and still stay within the mini 6-pin's official limit.

From the chart above, you can see there is no guarantee that your card will pull 75W from the slot (slot 2, in my case), not even when running FurMark. Especially since your card has 2x 8-pin, there is no reason for it to stress the slot to 100% while only utilising 47% of each 8-pin input.

I expect your card's power-draw pattern will be similar to my card's: it will pull only 40-50W from the slot under stress, with the rest coming from the 8-pins. So there is a very good chance that even though your card's TDP is below 225W, it will still pull more than 75W from the mini 6-pins.

[Update: I found this review, which shows that your card only pulls about 30W from the PCIe slot. In the extreme case (FurMark), it can pull up to 232W in total, which means each 8-pin can pull up to around 100W.]

Another common wrong assumption is that because your card has dual 8-pin, it can draw up to 375W from the PSU. I am sure it won't, and it can't. No matter how many 8-pin inputs there are, that is just the power available; it has nothing to do with the real power consumption. In the end, it's just an air-cooled 1080. The manufacturer says it's a 215W card, so its power draw should not go above that number without a user overclock or an intentionally-run power virus (e.g. FurMark).

IMO, the dual 8-pin is mostly a marketing strategy. It makes consumers believe it's a powerful card because it has dual 8-pin (compared to a normal 1080, which only has a single 8-pin). In fact, my 1080 Ti only has 6+8, but it's a 250W card, and it's definitely more powerful than the 1080 FTW. The dual 8-pin design is more likely for extreme user overclocking (which you cannot do in macOS) and to relieve the slot (no need to pull all 75W from it). Therefore, you shouldn't need an extra PSU or the Pixlas mod. Nothing stops you from doing that, but it's just not required.

[Again, from the same review, your card only draws ~200W in heavy gaming. For your usage, I think it will hardly touch 200W.]

The third common misconception is that the card will automatically balance the load between the two 8-pins, so you can connect each 8-pin to its own mini 6-pin. Yes, that may happen, but from what I've seen, it won't be that even most of the time. The card may still be programmed to rely heavily on one 8-pin while badly under-utilising the other. You are better off doing the balancing yourself to avoid over-stressing one of the mini 6-pins (which can trigger the auto-shutdown).
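The balancing argument can be sketched with hypothetical figures (the 140W/50W split below is an invented example of an unbalanced card, not a measurement; the ~120W threshold is the one reported later in this thread):

```python
# Why the bridge helps: an unbalanced card can overload one mini 6-pin
# when each 8-pin is wired to its own mini 6-pin, while the bridge
# forces the same total draw to be shared evenly.

MINI_6PIN_SAFE_W = 120  # observed cMP shutdown threshold, per this thread

# Hypothetical unbalanced card: heavy on one 8-pin input, light on the other
draw_8pin_a, draw_8pin_b = 140, 50

# Direct 1:1 wiring: each mini 6-pin carries one 8-pin's full draw
print(max(draw_8pin_a, draw_8pin_b) > MINI_6PIN_SAFE_W)  # True: overloaded

# Bridged wiring: both mini 6-pins feed one combined 8-pin and share evenly
per_connector = (draw_8pin_a + draw_8pin_b) / 2
print(per_connector)                          # 95.0 W per mini 6-pin
print(per_connector > MINI_6PIN_SAFE_W)       # False: within the threshold
```

The total card draw is identical in both cases; only the distribution across the two mini 6-pins changes, which is the whole point of the bridge.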
 
Last edited:

Incredibly helpful and informative. Thank you for taking the time on this! I will get one of those 8-pin to dual 6+2 pin adapters.

My use of the card will definitely not stress it to its max capabilities. I've been managing with a 5770 up to this point, so just having CUDA is the primary benefit for me.
 
Hi guys,

I just got an Nvidia GTX 1080 Founders Edition (thank you, god!) and it came with no pin adapter. The card has just one 8-pin socket.

Do I need to get this?
https://www.amazon.com/Graphics-Dou...517976868&sr=8-30&keywords=6+to+8+pin+adapter

Or this?
https://www.amazon.com/6-Pin-Expres...517977277&sr=8-29&keywords=6+to+8+pin+adapter

Ultimately, I don't want my computer to shut down or anything, and I want to play intensive games safely. I currently have an EVGA GTX 680 Mac Edition and am using two 6-pin adapters with no issues so far.

Please let me know!
 
Last edited:

Both cables are wrong. You need this one

https://www.moddiy.com/products/App...tandard-PCI%2dE-8%2dPin-Video-Card-Cable.html

Dual MINI 6pin -> Single 8 pin (MALE)
 
Thanks. Would I not be able to use a single 6-pin to single 8-pin adapter? Would I need more power for the 1080? Maybe the 1070 only requires one 6-pin to one 8-pin?

You cannot; according to this review, your card should draw around 40W from the PCIe slot and the remaining ~130W from the 8-pin (when under stress).

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html

The cMP's mini 6-pin can only deliver up to around 120W, and pulling the whole 8-pin load through a single mini 6-pin is already way above the official 75W limit. Once beyond that, the shutdown protection may kick in at any time. So you had better use both of your mini 6-pins to power the 8-pin.
 
Last edited:

Got it. I remember seeing somewhere that the GTX 1070 just required one 6-pin to one 8-pin... I may be wrong, or the GTX 1070 requires far less than the GTX 1080. Well, I just purchased one off eBay. I hope this is the right one...

https://www.ebay.com/itm/16AWG-Dual...e=STRK:MEBIDX:IT&_trksid=p2057872.m2749.l2649
 

This cable looks good to me.

The 1070 FE is a very different story. Not only does it use less power, it is also programmed to draw the full 75W from the PCIe slot. Therefore, it can easily be powered by only one mini 6-pin (even though it can still occasionally exceed that 75W official limit, that's completely normal for a GPU; as long as it's only for a short period of time, it should not cause any issue).

The 1070 FE draws less than 100W from the 8-pin most of the time, but very close to 75W from the slot.

[Chart: Nvidia 1070 gaming power rails, from the Tom's Hardware review]

The 1080 FE draws less from the slot, but considerably more from the 8-pin.

[Chart: Nvidia 1080 gaming power rails, from the Tom's Hardware review]
 

Just received the cable and it works flawlessly! But now I can't seem to install Nvidia drivers for this card. It says:

Mac OS X version is not compatible. 10.12.6 (16G1212) is required for this update.

I tried using an older driver but no luck! I am using 10.12.6, the final update for Sierra, though.
 

Nice guide. Only one thing I don't understand... I know the mini 6-pin can deliver more than 75 watts and can even reach 150 watts each... so how is it possible to pull more than 150 watts through a single 8-pin? Is there a risk of burning the cable?
 

From experience, each mini 6-pin can only deliver up to ~120W before the Mac shuts itself down, not 150W.

The card shouldn't draw more than 150W (average, not including power spikes) via a single 8-pin. But in the case above there are two 8-pins on the card, and we combine them into one single 8-pin bridge; therefore, the bridge can (and will) draw more than 150W.

The risk is always there, so high-quality cables are always preferable. But in the case of the cMP, I am quite sure the Mac will shut down before the current is large enough to burn the cables.

I checked my cables: under full stress, they are just a little bit warm. However, I haven't worked out whether they are warmed by the hot air coming out of the graphics card or by the high current. It's just a tiny bit warmer, so I have no interest in investigating further.
 
But what if I use...

Cable A: 2x mini 6-pin to get one 8-pin PCIe

Cable B: 4x SATA ports to get one 8-pin, with proper cables (two dual-SATA-power-to-PCIe-6+2 and one dual-female-6+2-pin-to-one-8-pin)

...this way I can distribute the power through a total of four SATA power connectors (two from the motherboard and two from the optical bay) plus both mini 6-pins, and everything should stay under the official wattage limits... or no? :)
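As a rough sanity check of that plan (spec-level figures only; the ~54W per SATA connector assumes the common 3 x 1.5A rating on the 12V pins, which may not match the cMP's actual wiring):

```python
# Nominal budget for the proposed split: one 8-pin fed by both mini
# 6-pins, the other fed by four SATA power connectors. All figures
# are connector ratings, not measurements.

SLOT_W = 75        # PCIe slot
MINI_6PIN_W = 75   # official rating, each
SATA_12V_W = 54    # assumed: 3 pins x 1.5 A x 12 V per SATA connector

cable_a = 2 * MINI_6PIN_W   # dual mini 6-pin -> one 8-pin
cable_b = 4 * SATA_12V_W    # four SATA -> one 8-pin
print(cable_a, cable_b)     # 150 216

# Total nominally available to the card
print(SLOT_W + cable_a + cable_b)  # 441
```

On paper each 8-pin stays within its feed's combined rating, which matches the "should be OK" answer below, though tying up all four SATA ports is the practical downside.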
 

It should be OK, but not everyone is happy to sacrifice all four SATA ports, especially when it's not required.
 

I got a Gigabyte GTX 980 Ti 6GB OC WindForce 3X with 2x 8-pin connectors... that should be more power hungry than a GTX 1080... I've connected it in various ways... all work, but sometimes I get a black screen at startup... with apparently no reason. If I disconnect and reconnect some PCIe cables, it works fine again... So I would like to try this option to be sure I'm not going over the wattage limit... What do you think?
 

This should keep everything within the limits.
 
I just ordered a flashed GTX 780 for my Mac Pro (flashed 4,1). I was worried when I saw that a mini-6-pin-to-6-pin and a mini-6-pin-to-8-pin adapter were included, because of the previously discussed even power distribution. However, as the card is an officially refurbished one, I figured the vendor couldn't be too wrong about this, so I went ahead and measured the Mac Pro's mini 6-pin connectors. It turns out that they are physically connected: each pin has no measurable resistance to its counterpart on the other connector. This means the power is always evenly distributed, and the dual-6-to-8-pin approach is redundant (at least for the 4,1).

Nonetheless, to my knowledge the ports are officially rated at only 75W each, meaning you should only draw 150W altogether (which is essentially the spec of an 8-pin connector). That said, due to the Mac Pro's build quality, they apparently have enough headroom to drive at least 225W from the ports, which enables the use of 300W GPUs including the 75W from the PCIe slot. Additionally, I have seen people driving dual-8-pin GPUs without an additional power supply or any modification, resulting in a power draw of up to 300W on the ports. While this might work, especially as GPUs rarely draw all the power they could for an extended time, it is not recommended at all. I assume the same holds for the 5,1 Mac Pros.

I hope this clarifies the question of how to connect GPUs inside the Mac Pro, especially as it has become more relevant again with the release of Mojave this autumn.
 