The AMD Radeon RX 580 & RX 570 Review: A Second Path to Polaris
www.anandtech.com
Very interesting article; scroll down to the part about external monitors. It seems there is a basic Radeon limitation with respect to memory clock switching. The GPU can raise or lower the memory clock based on load, and normally the switch happens during the vertical blanking interval so the retraining doesn't cause visible artifacts on screen. When there are multiple monitors with different timings, there is no predictable moment when both are blanked at once, so the driver keeps the memory clock at full tilt at all times.
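To make that concrete, here's a rough back-of-the-envelope sketch (my own illustration, not anything from AMD's driver): it models each display's vertical blanking as the last ~4.5% of its frame period and measures how often two displays happen to be blanked at the same time. The rates, the blanking fraction, and the function name are all assumptions for illustration.

```python
# Rough sketch (not AMD driver code): model each display's vblank as the
# last ~4.5% of its frame period and sample how often two displays are
# both in blanking at the same moment.

def joint_vblank_fraction(rate_a_hz, rate_b_hz, vblank_frac=0.045,
                          window_s=20.0, step_s=2e-5):
    """Fraction of sampled time during which BOTH displays are in vblank."""
    hits = 0
    steps = int(window_s / step_s)
    for i in range(steps):
        t = i * step_s
        # Phase within the current frame (0.0..1.0); blanking sits at the end.
        in_blank_a = (t * rate_a_hz) % 1.0 > 1.0 - vblank_frac
        in_blank_b = (t * rate_b_hz) % 1.0 > 1.0 - vblank_frac
        hits += in_blank_a and in_blank_b
    return hits / steps

if __name__ == "__main__":
    # Matched timings: the blanks coincide every frame, so the driver
    # gets a guaranteed safe window to retrain the memory clock.
    print(joint_vblank_fraction(60.0, 60.0))    # ~0.045
    # Slightly mismatched (60 Hz vs 59.94 Hz): coincidence is rare
    # (roughly vblank_frac**2 on average) and drifts over the beat
    # period, so there's no predictable moment to switch.
    print(joint_vblank_fraction(60.0, 59.94))   # ~0.002
```

With identical timings the blanking windows line up every frame; with even slightly mismatched rates the joint window shrinks to roughly the product of the two blanking fractions and drifts unpredictably, which would match the "keep the memory clock pinned" behaviour described above.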
Damn, yeah, I reckon this is it, you know? Interestingly, if you scroll further down, the article even explains that if you match your monitors to the "correct timings" you'll avoid the issue.
It could certainly explain why certain monitor configurations have been reported to work fine. It could even explain why Apple support is claiming this to be "as intended", if this is indeed a limitation of AMD's architecture.
I'm a recent purchaser in the unique position of still owning my 2017 15" model with a 480. I've been monitoring this thread and others like it for the past week, and I've run direct comparisons with the exact same workload on both machines, each connected to one of my Dell monitors (the same model for both).
As expected, the GPU on the new machine is pegged at 18 W constantly, versus 9 W on the 2017 model. Idle temps are also 20 degrees higher.
This feels like a good itch to scratch. But I'm now past my 14-day return window and pretty miffed that, in a machine with better thermal architecture, Apple would go and negate those improvements with a shoddy GPU. Don't get me wrong, I can see how they aren't left with much choice given their relationship with Nvidia. But still!