Hey guys, I'm diving deep into understanding the power management of my MacBook Pro. I own a mid-2012 Retina model sporting an i7 @ 2.6 GHz, 8 GB RAM, a 512 GB SSD, and a GeForce GT 650M. Since I'm more familiar with Windows and its testing tools, I installed Windows 10 on it.
To put things into perspective, I also have my trusty old Lenovo equipped with an i7 @ 2.3 GHz, 16 GB RAM, a 512 GB SSD, and a GeForce GTX 660M. Both CPUs are 3rd gen, and the GT 650M and GTX 660M are quite similar, differing mainly in VRAM capacity. My MacBook Pro comes with an 85 W charger, while the Lenovo has a 90 W one.
I initially expected similar performance from the two laptops given their comparable hardware, but the MacBook Pro delivers noticeably less. Let me illustrate with a specific example:
One major issue I've encountered is the CPU running at significantly lower clock rates under heavy gaming load. In a CPU-only test like Cinebench it impresses with nearly 40 W power draw at ~3300 MHz, but when the GPU and CPU are both loaded, the CPU only reaches around 1200-1400 MHz.
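A quick back-of-the-envelope estimate of what that frequency drop means for power: if I take my own Cinebench measurement (~40 W at ~3300 MHz) and naively scale package power linearly with frequency, I can guess what the CPU draws at the 1200 MHz it falls to under combined load. This linear scaling is my assumption, not a measured figure (real power drops faster than linear because voltage also falls at lower clocks), so treat it as an upper-bound sketch:

```python
# Naive sketch: estimate CPU package power at a lower clock by scaling my
# measured Cinebench figure (~40 W at ~3300 MHz) linearly with frequency.
# Assumption: linear scaling; real draw is lower since voltage drops too.

def scaled_cpu_power(measured_watts, measured_mhz, target_mhz):
    """Linear power estimate at a different clock rate."""
    return measured_watts * target_mhz / measured_mhz

estimate = scaled_cpu_power(40.0, 3300, 1200)
print(f"Estimated CPU power at 1200 MHz: ~{estimate:.1f} W")  # ~14.5 W
```

So dropping from 3300 MHz to 1200 MHz frees up something like 25 W of headroom, which is roughly what a mid-range mobile GPU of that era needs under load. That is suspiciously convenient timing for the GPU kicking in.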
For example, let's test Dishonored. Despite no overheating, the CPU sticks at 1200 MHz, yielding approximately 40 FPS.
Yet if I force the CPU to higher clocks, say 2300 MHz, via ThrottleStop, the GPU's performance plummets, dwindling to a mere 270 MHz on the video core.
Now let's look at the Lenovo, where the CPU runs at 2300 MHz, resulting in a smooth 60 FPS.
If I manually throttle it down to match the MacBook Pro's 1200 MHz, performance drops to the same 40 FPS.
Conclusion: despite no overheating and similar charger wattages, the MacBook Pro can't sustain a high CPU clock under simultaneous GPU load, while the Lenovo can. So why does the more budget-friendly Lenovo, with near-identical specs, outperform the MacBook Pro? Maybe Apple is forced to cap total power draw so the MacBook performs the same on battery as on the charger, while the Lenovo is allowed to pull full power from its adapter?
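The power-budget hypothesis actually adds up on paper. My 40 W Cinebench figure is measured; the GPU and "rest of system" numbers below are rough assumptions on my part (the ~45 W for the GT 650M is a commonly cited figure, not anything Apple publishes), but even so the combined full-load demand lands above the 85 W adapter rating:

```python
# Sketch of the power-budget hypothesis: can the 85 W adapter cover
# CPU + GPU + everything else at full load? The GPU and "rest" figures
# are rough assumptions, not manufacturer-published numbers.

ADAPTER_W = 85     # MagSafe adapter rating
CPU_FULL_W = 40    # my Cinebench measurement
GPU_FULL_W = 45    # commonly cited GT 650M board power (assumption)
REST_W = 15        # display, SSD, board, fans, etc. (assumption)

demand = CPU_FULL_W + GPU_FULL_W + REST_W
print(f"Full-load demand: {demand} W vs adapter: {ADAPTER_W} W")
print(f"Shortfall: {demand - ADAPTER_W} W")  # power the firmware must cut somewhere
```

If those assumptions are even roughly right, something has to give when both chips are loaded, and on the MacBook it looks like the firmware chooses to cut the CPU clock rather than the GPU.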