Sorry to exhume this thread, but I have a 2019 Mac Pro with a W6800X Duo MPX module. Did anyone find an RX6900XT card that only takes up the remaining two PCIe slots? It'd be a cheaper expansion than adding another Duo.

These two fit:

[Attachment: TinyGrab Screen Shot 6-24-23, 4.16.49 PM.jpg]


Lou
 
Bottom left is a reference 6900 XT card under the Gigabyte name; top right, which I have, is the Gigabyte Gaming 6900 XT card. The difference is the reference card has 2 × 8-pin power connectors and the Gaming card has 3 × 8-pin, but both fit in the 7,1 Mac Pro.

Very happy with my Gaming 6900 XT card.
 
I've got the Sapphire NITRO+ RX 6900 XT. Just make sure not to mix it up with the SE (Special Edition), which has a much larger chassis and won't fit (thank goodness my supplier had a decent returns policy!)
 
OEM AMD 6900XT works great on mine.

I removed the 580X MPX module and put this in there instead. No issues.

Of course, I am not using XDR displays, etc., so I haven't dealt with Thunderbolt display issues.

However, I would like to buy dual 5K Apple displays soon.
Did you encounter any issues with the 6900 XT (with no MPX module) while connecting to an XDR or UltraFine?

My plan is to put a 6900 XT in my 7,1 and remove the W5700X, but I use a 5K UltraFine and I don't know if I'll have issues. I suppose I'll use one of the Thunderbolt ports on the back to connect the display.
 
Does anyone here have the Gigabyte 6900 XT card that fits? I'm wondering what the fan noise is like, since that seems to be one of the few options still available (though usually only used).

Otherwise, while I hate to entertain the thought, has anyone ever cut/removed one of the plastic rails to get a longer card to fit?
 
^^^^ I have the Gigabyte RX 6800 XT, which has the same form factor and cooler. I also have the BIOS switch set to performance mode. It's pretty quiet for me 👍🏻

Lou
 
Does anyone here have the Gigabyte 6900 XT card that fits? I'm wondering what the fan noise is like, since that seems to be one of the few options still available (though usually only used).

Otherwise, while I hate to entertain the thought, has anyone ever cut/removed one of the plastic rails to get a longer card to fit?
I have the Gigabyte RX 6900 XT OC. No noticeable fan noise. I run Creative Cloud apps: Premiere Pro, After Effects, Audition, Illustrator, etc. Of course this will most likely depend on your ambient temperature. The Mac Pro itself has adequate cooling, as you can already see with its three massive fans pushing air through the internals. But I can only guess that if your room is warm, all it's doing is pushing warm air into an already warm environment, and that may trigger higher fan speeds and, therefore, noise.
 
Was there ever a two-slot-wide 6900 XT? I think there was one 6800 pulled from a Dell or something that was only two slots wide, but I don't think there was ever a 6900 that was only two slots wide.
 
Was there ever a two-slot-wide 6900 XT? I think there was one 6800 pulled from a Dell or something that was only two slots wide, but I don't think there was ever a 6900 that was only two slots wide.
The liquid-cooled models are (less than) two slots wide, but the cooler winds up elsewhere. I don't have a 7,1 to figure out where a radiator could go.


This was $700 CAD during the Prime Day sale just past.
 
Bottom left is a reference 6900 XT card under the Gigabyte name; top right, which I have, is the Gigabyte Gaming 6900 XT card. The difference is the reference card has 2 × 8-pin power connectors and the Gaming card has 3 × 8-pin, but both fit in the 7,1 Mac Pro.

Very happy with my Gaming 6900 XT card.
I really regret not knowing about the 3 × 8-pin on mine. I was able to make it work alongside a W5700X, but a) there's a glitch with the 5000 series in the 3D software I use, which makes it basically unusable, and b) I can't fit any more third-party GPUs in, as I'm using 3 of the 4 power cables… 😬
 
I really regret not knowing about the 3 × 8-pin on mine. I was able to make it work alongside a W5700X, but a) there's a glitch with the 5000 series in the 3D software I use, which makes it basically unusable, and b) I can't fit any more third-party GPUs in, as I'm using 3 of the 4 power cables… 😬

Well, this is an issue with 3 × 8-pin, but I use a 4090 with an external PSU: just pass the cable through a PCIe slot cover at the back. It's a flat cable, so it passes through easily, and there's no reason you can't power a second 3 × 8-pin Gigabyte card this way. The PSU for my 4090 sits under the back of my desk, and I have a switch connected to the PSU to power it when needed. A small Corsair SF750 does the trick nicely and is silent. It's also fitted with the flat 4090 cable from Corsair to power the 4090.

There are ways around it. Two reference 6900 XT cards with dual 8-pin would have been ideal for you, or an expensive W6800X card to replace the W5700X. A 6900 XT plus a PSU is much cheaper and faster.
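Rough numbers, assuming the standard PCIe power limits (75 W from the slot, 150 W per 8-pin connector): a 3 × 8-pin card can draw up to 75 + 3 × 150 = 525 W on paper, though a 6900 XT's rated board power is around 300 W and a 4090's around 450 W, so a 750 W unit like the SF750 runs either card with comfortable headroom.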
 
Well, this is an issue with 3 × 8-pin, but I use a 4090 with an external PSU: just pass the cable through a PCIe slot cover at the back. It's a flat cable, so it passes through easily, and there's no reason you can't power a second 3 × 8-pin Gigabyte card this way. The PSU for my 4090 sits under the back of my desk, and I have a switch connected to the PSU to power it when needed. A small Corsair SF750 does the trick nicely and is silent. It's also fitted with the flat 4090 cable from Corsair to power the 4090.

There are ways around it. Two reference 6900 XT cards with dual 8-pin would have been ideal for you, or an expensive W6800X card to replace the W5700X. A 6900 XT plus a PSU is much cheaper and faster.
Thank you - that's really good to know 😊
 
Please advise: I have the 580X MPX module and want to get the Gigabyte RX 6900 XT Gaming OC.

I also have two Apple Thunderbolt Displays connected to Thunderbolt 3 on the rear panel via an I/O card (through Apple's Thunderbolt 2 to Thunderbolt 3 adapter), and one monitor connected via HDMI to the 580X.

How will such a setup behave?

I would not want to remove the 580X; I'd keep using it simply to drive the monitors while the 6900 does all the main work. But what about applications that do not let you select a video card in their settings (Adobe Substance Painter, Unreal Engine 5)?

If you edit video, then in Final Cut Pro and DaVinci Resolve you can choose which GPUs to use, and even some 3D editors allow this (Blender). But what about programs that don't? Maybe which GPU to use can be configured in macOS itself? I have never had two different video cards, so I don't understand how this works at all.
 
When you connect the main display to that video card, macOS will make that your default card. I don't think there's a way to connect all your displays to the 580X and then assign the other GPU as the "main" processing card, outside of an application that lets you make that selection directly. I could be wrong.

What I do is connect my displays across the two video cards. That means you will have to invest in a USB-C to DisplayPort adapter, which I've done. Once you've done that, to switch the "main" video card, you simply re-arrange the menu bar under the Arrange Displays setting. You can verify which "main" video card is being used by clicking "About This Mac."

I'm at the office right now and using my laptop, so the screenshot I attached will be slightly different, but it's the same idea.

[Attachment: Screenshot 2023-08-08 at 10.10.52 AM.png]
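To check the same thing outside the UI, here's a minimal Swift sketch using Metal's public device list (nothing beyond the stock Metal framework is assumed); apps without their own GPU picker will normally end up on the system default device:

    import Metal

    // Print the system default GPU (what apps without a picker will use),
    // then every GPU macOS can see and whether it is driving a display.
    if let defaultGPU = MTLCreateSystemDefaultDevice() {
        print("Default GPU: \(defaultGPU.name)")
    }
    for device in MTLCopyAllDevices() {
        print("\(device.name): \(device.isHeadless ? "headless" : "driving a display")")
    }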
 
When you connect the main display to that video card, macOS will make that your default card. I don't think there's a way to connect all your displays to the 580X and then assign the other GPU as the "main" processing card, outside of an application that lets you make that selection directly. I could be wrong.

What I do is connect my displays across the two video cards. That means you will have to invest in a USB-C to DisplayPort adapter, which I've done. Once you've done that, to switch the "main" video card, you simply re-arrange the menu bar under the Arrange Displays setting. You can verify which "main" video card is being used by clicking "About This Mac."

I'm at the office right now and using my laptop, so the screenshot I attached will be slightly different, but it's the same idea.

View attachment 2243249

I'll add to that that the preboot sequence will choose a display based on how fast it "wakes up" and responds with a live link to the system. For MPX GPUs, only two displays are recognised per GPU during startup and when booted in safe mode.

This is regardless of which display is set as main when booting normally.

I'm not sure how that translates to non-MPX GPUs.
 
When you connect the main display to that video card, macOS will make that your default card. I don't think there's a way to connect all your displays to the 580X and then assign the other GPU as the "main" processing card, outside of an application that lets you make that selection directly. I could be wrong.

What I do is connect my displays across the two video cards. That means you will have to invest in a USB-C to DisplayPort adapter, which I've done. Once you've done that, to switch the "main" video card, you simply re-arrange the menu bar under the Arrange Displays setting. You can verify which "main" video card is being used by clicking "About This Mac."

I'm at the office right now and using my laptop, so the screenshot I attached will be slightly different, but it's the same idea.

View attachment 2243249
Oh, this is very helpful!

That is, if monitors 1 and 2 are connected to the Thunderbolt ports on the rear I/O card (the 580X MPX module will be the controller of those ports), and monitor 3 is connected by HDMI to the RX 6900 XT card and is assigned as the main one in System Settings, then the main video card will be the RX 6900 XT?

Do you use programs like iStat Menus? Tell me, does it show the RX 6900 XT's GPU usage, video memory, and temperature?
 

Attachments

  • Снимок экрана 2023-08-09 в 12.37.54.png
    40.7 KB · Views: 49
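For what it's worth, Metal's public API can report each card's memory budget and current allocation, though temperature isn't exposed there (iStat Menus reads it by other means). A minimal Swift sketch, assuming only the stock Metal framework:

    import Metal

    // Print each GPU's VRAM budget and current allocation in GB.
    // Temperature is not available through the public Metal API.
    for device in MTLCopyAllDevices() {
        let budget = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824
        let used = Double(device.currentAllocatedSize) / 1_073_741_824
        print(String(format: "%@: %.1f GB budget, %.2f GB allocated",
                     device.name, budget, used))
    }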