I upgraded from an RX 580 to a new RX 5700 two days ago. I thought I'd share my experience so far in this thread, so that forum members/readers who are considering this upgrade have some additional data points.
As per my signature -- I have a 4,1>5,1 with dual X5680 and 96GB of RAM. I'm running macOS 12.0.1 from one NVMe blade and Windows 11 from a separate NVMe blade, both on the same dual NVMe PCIe card.
The Radeon RX 5700 (not the XT, just the reference card) was immediately recognized by macOS 12.0.1. Overall, macOS performs just as well as it did with the RX 580. The only thing missing in macOS 12.0.1 is GPU temperature reporting. I use iStat Menus and it does not show the RX 5700 temperature, whereas it did show the RX 580 temperature.
These are some synthetic benchmark (Geekbench 5) comparisons for my system with the RX 580 and RX 5700 running macOS 12.0.1:
GPU | OpenCL | Metal
--- | --- | ---
RX 580 | 46,760 | 52,247
RX 5700 | 58,034 | 62,539
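For what it's worth, those scores work out to roughly a 24% OpenCL and a 20% Metal uplift. A quick sketch of the arithmetic, with the scores copied from the table above:

```python
# Geekbench 5 scores from the table above
rx580 = {"OpenCL": 46_760, "Metal": 52_247}
rx5700 = {"OpenCL": 58_034, "Metal": 62_539}

# Percentage uplift of the RX 5700 over the RX 580 per API
for api in rx580:
    uplift = (rx5700[api] - rx580[api]) / rx580[api] * 100
    print(f"{api}: +{uplift:.1f}%")
# OpenCL: +24.1%
# Metal: +19.7%
```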
The only problem I encountered on macOS 12.0.1 was unexpected random system restarts after I enabled the RadeonBoost kext via OpenCore 0.7.4. Once I disabled RadeonBoost, those random restarts stopped. I am not sure, but it seems to be an issue with that specific kext. I benchmarked (Geekbench 5) the RX 5700 with RadeonBoost both enabled and disabled, but the results were pretty much the same as those listed in the table above, so there seems to be no obvious benefit to enabling it. I will need to do some more research on this.
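For anyone who wants to toggle RadeonBoost the same way: the kext is listed under Kernel > Add in OpenCore's config.plist, and flipping Enabled to false is what stopped the restarts for me. A minimal sketch of the entry (the paths are assumptions -- match them to your own EFI):

```xml
<!-- Under Kernel > Add in config.plist (OpenCore 0.7.4 in my case). -->
<!-- BundlePath/PlistPath below are assumptions; use your own values. -->
<dict>
    <key>BundlePath</key>
    <string>RadeonBoost.kext</string>
    <key>Comment</key>
    <string>Disabled due to random restarts on the RX 5700</string>
    <key>Enabled</key>
    <false/>
    <key>ExecutablePath</key>
    <string></string>
    <key>PlistPath</key>
    <string>Contents/Info.plist</string>
</dict>
```

Note that RadeonBoost is a codeless kext, which is why ExecutablePath is empty.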
The RX 5700 also works properly under Windows 11 on my cMP. I had to update the GPU drivers via AMD's driver software, with a subsequent restart for them to load, as expected. On Windows 11, I tested the system with two GPU-intensive video games, monitoring the RX 5700's temperature and power draw while playing each game for about 1 hour. I set the graphics quality for both games to maximum, not really aiming to make them playable but to stress the system. To my surprise, they were both playable enough at the highest possible settings. Here's some data on this test:
GAME | Average FPS | Peak Power (W) | Peak Temperature (°C)
--- | --- | --- | ---
Cyberpunk 2077 | 45 | 130 | 80
Red Dead Redemption 2 | 55 | 130 | 80
If anything changes in the coming weeks, I will update this post. But so far, it seems to be a successful upgrade. I wish I could monitor the GPU temperature in macOS; I will contact the iStat Menus dev team and ask whether adding this reading is feasible in a future update of their software.