
mw_bs

macrumors newbie
Nov 14, 2023
1
1
TL;DR – the CPU is throttled to ~60% during Lightroom exports, no such throttling during heavy load tests with other apps.


Details:

Lightroom Classic is what I primarily use my computer for, so when I received my 16" M3 Max 14c, I was disappointed by the test I ran (a series of 500 Sony A1 images, exported to JPEGs)... it was better than the M1 Max, but not by nearly as much as I was expecting. The M1 Max did the job in 9:37, while the M3 Max took 8:59. Later, I repeated the test and got varied results in the 6-8 minute range. Observing the temperature and CPU/GPU load data, a fairly predictable pattern is visible:
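For context, here's the arithmetic on those times as per-second rates (just a convenience script over the numbers above, nothing more):

```python
# Back-of-the-envelope export throughput for 500 images (times from the runs above).
def rate(images, minutes, seconds):
    """Images per second for a timed export run."""
    return images / (minutes * 60 + seconds)

m1_rate = rate(500, 9, 37)    # M1 Max: 9:37
m3_rate = rate(500, 8, 59)    # M3 Max: 8:59 (slowest M3 run)
cold_rate = rate(500, 4, 39)  # M3 Max: 4:39 (cold-start run)

print(f"M1 Max:  {m1_rate:.2f} images/s")
print(f"M3 Max:  {m3_rate:.2f} images/s")
print(f"Cold M3: {cold_rate:.2f} images/s")
```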

1) It starts out with both the CPU and GPU running strong... images are exported at a rate of about 2 per second

2) Temperature rapidly starts to increase, reaching about 100C after 30-60 seconds

3) Fans start to ramp up, temperature begins to come down

4) CPU usage drops dramatically, to about 20% (though on some runs this stage would be 40-60%), GPU drops a bit, temperature drops to 60-70C, fans spin down, export rate slows to under 1 per second

5) After a few minutes in that state, CPU usage tentatively starts to increase to 70-90%, temperature slowly increases in turn, but no noticeable fan increase

6) As temperature reaches about 85C, CPU usage drops to around 60%, where it seems to reach somewhat of an equilibrium, with the GPU in the 80% range, still no audible fan... export rate has leveled off at about 1 per second
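If anyone wants to log the same pattern on their own machine, here's a rough sketch (it assumes the third-party psutil package; macOS temperatures aren't exposed through it, so you'd still need TG Pro or powermetrics for the thermal side):

```python
import csv, time

try:
    import psutil  # third-party: pip install psutil
except ImportError:
    psutil = None

def log_cpu(path="cpu_log.csv", interval=1.0, duration=600):
    """Sample overall CPU utilization once per `interval` seconds and write a CSV,
    so the ramp / dip / plateau stages above are easy to plot afterwards."""
    if psutil is None:
        raise RuntimeError("psutil is required for sampling")
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["elapsed_s", "cpu_percent"])
        start = time.monotonic()
        while time.monotonic() - start < duration:
            pct = psutil.cpu_percent(interval=interval)  # blocks for `interval`
            w.writerow([round(time.monotonic() - start, 1), pct])

def find_dip(samples):
    """Index of the lowest utilization sample after the initial peak
    (i.e. where stage 4 bottoms out in a logged run)."""
    peak = max(range(len(samples)), key=samples.__getitem__)
    tail = samples[peak:]
    return peak + min(range(len(tail)), key=tail.__getitem__)
```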


Now, one run finished in a mere 4:39(!)... that one was done first thing this morning, when the computer was completely cold, which presumably enabled it to stay at a higher CPU usage for longer (for the other runs, I let it cool down to an idle temperature of 50-60C). So this seems like thermal throttling, right? Heavy load, computer gets hot, CPU throttles.

A few mysteries though. When it reaches that "equilibrium" state, the fans are not audibly running, so it's got plenty of headroom to accommodate more load.

Secondly, I ran another test consisting of a video export from FCPX (mostly GPU) along with another app to put additional load on the CPU, and it behaved more like you would expect... temperature around 100-105C, fans running, CPU and GPU both pegged, and it stayed in this consistent state indefinitely. If the Lightroom scenario was the system throttling, why did similar throttling not also occur during this other test? Or, to put it differently, if it can handle this heavy stress test without significant throttling (or even if it is throttling a bit, it's at least TRYING to keep performance up by running the fans), why didn't it do so during the Lightroom export?
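One way to check whether the OS itself is capping the CPU during these runs is pmset's thermal readout. Here's a sketch (the field names shown are from Intel-era `pmset -g therm` output; I'm assuming they still appear on Apple Silicon, which may not hold):

```python
import re, subprocess

def thermal_state():
    """Return pmset's thermal fields (macOS only).
    A CPU_Speed_Limit under 100 means the OS itself is capping the CPU."""
    out = subprocess.run(["pmset", "-g", "therm"],
                         capture_output=True, text=True).stdout
    return parse_therm(out)

def parse_therm(text):
    """Pull 'Name = number' pairs out of pmset -g therm output."""
    return {m.group(1): int(m.group(2))
            for m in re.finditer(r"(\w+)\s*=\s*(\d+)", text)}

# Example of the Intel-style output this parses (Apple Silicon may differ):
sample = """CPU Power notify
    CPU_Scheduler_Limit = 100
    CPU_Available_CPUs  = 14
    CPU_Speed_Limit     = 60
"""
print(parse_therm(sample))
```

If that limit stays at 100 while utilization drops, the slowdown isn't coming from the OS.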

One other observation... while this export is going on (with LR's CPU usage staying around 60%) if I then initiate a "build previews" in LR, the CPU usage pegs again, and stays there (with fans spinning up). So, the capacity is there, just for some reason LR is not using it continuously during export.

High power mode doesn't make an appreciable difference... if there's any improvement, it's small enough to be lost in the variability of each test.

Lastly, briefly going back to the M1 Max, I do see similar behavior, but not as severe as this (need to run some more tests on that machine though, I've been focused mainly on the M3).
Same here, thanks for the detailed description. Seems like an issue Adobe could address. I was disappointed with export performance as well. Creating previews is as fast as expected. The Lightroom CC export performance proves that the gains could be had. Let's hope Adobe doesn't sleep on this one.
 

NEPOBABY

Suspended
Jan 10, 2023
697
1,688
Same here, thanks for the detailed description. Seems like an issue Adobe could address. I was disappointed with export performance as well. Creating previews is as fast as expected. The Lightroom CC export performance proves that the gains could be had. Let's hope Adobe doesn't sleep on this one.

Always remember: the more CPU/GPU power you get, the more likely users are to throw even larger amounts of data at it. They often do this without realizing it, so naturally processors will be pushed even harder and get hot.

The Jevons paradox is permanent and cannot be avoided.
 

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
Isn't throttling an OS rather than an app issue, unless the app is exhibiting bad behavior?
I feel reasonably confident that I've established it's not an OS-level throttling (because it does not occur with similar loads from other apps), so I'm assuming it's a LRC issue.
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
I feel reasonably confident that I've established it's not an OS-level throttling (because it does not occur with similar loads from other apps), so I'm assuming it's a LRC issue.
If I understand you correctly, you replicated the exact same work in LRCC and didn't see the "sine wave dip" of the P-cores? That would mean LRC is the problem exclusively. Which is odd; normally with a new chip the Cloud version is the one that requires the least optimisation, since its code base is the newest, built with ARM (iOS) in mind.

BTW, I already got my M3 Max 16" binned; later, when I have time to spare, I can test whether I see the same thing you did. So is it just exporting JPEGs, or are there other tasks that also exhibit the behaviour?
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
[Attached: Screenshot 2023-11-15 at 18.23.51.png]


Okay I did 2 tests and my results are even more interesting than yours.

Nikon D850 45MP RAW, 583 of them exporting as JPEG 2000px wide.
GPU export is enabled
Above is LRC (Classic), below is LRCC (Cloud).

1) the task takes almost twice as long to finish in LRC
2) in LRC no core is running at max temp, P-Cores are under-utilised, fans never kick in
3) in LRCC, all cores are running towards or over 100C (TG Pro warnings are popping up), all cores are almost fully utilised especially P-Cores, fans kick in after 30 seconds or so

I think if I increase the number of photos to 1000 or more, the machine may stay hot long enough that throttling begins on the LRCC setup. Comparing these two apps, I gotta say LRCC actually is better optimised for the M3 Max for now; its utilisation is better than LRC's.

Edit: mixed up LRCC and LRC terminology, now fixed.
 
Last edited:

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
Okay I did 2 tests and my results are even more interesting than yours.

Nikon D850 45MP RAW, 583 of them exporting as JPEG 2000px wide.
GPU export is enabled
Above is LRCC (Classic), below is LRC.

1) the task takes almost double as long to finish in LRCC
2) in LRCC no core is running at max temp, P-Cores are under-utilised, fans never kick in
3) in LRC, all cores are running towards or over 100C (TG Pro warnings are popping up), all cores are almost fully utilised especially P-Cores, fans kick in after 30 seconds or so

I think if I increase the number of photos to 1000 or more, the machine may heat up longer to the point of throttling will begin, on the LRC setup. By comparing these two apps I gotta say LRC actually is better optimised for M3 Max for now, its utilisation is better than on LRCC.
Thanks for posting that! At first I was confused, because it seemed like your experience was the exact opposite of mine (full performance in Lightroom Classic, lackluster in Lightroom). But then I read it more carefully, and I see that your results are fairly consistent with mine... just to clarify the terminology and make sure there's no confusion, I believe LRC usually refers to Lightroom Classic, while LRCC refers to the simpler and "cloud-centric" version of Lightroom (the C stands for Classic, while the CC stands for Creative Cloud).
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
You are right, I mixed them up…

The results above, without fans or cores blasting, are from Classic.
The ones below, with everything blasting, are from the Cloud version.
 

Onimusha370

macrumors 65816
Aug 25, 2010
1,039
1,506
Yes, the names are confusing, aren't they! So just to be clear, we think Lightroom Classic (LRC) is having some issues with performance, but Lightroom Creative Cloud (LRCC) appears to be maximising the potential of the M3 Max?
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
Yes, the names are confusing, aren't they! So just to be clear, we think Lightroom Classic (LRC) is having some issues with performance, but Lightroom Creative Cloud (LRCC) appears to be maximising the potential of the M3 Max?
Yes, that's what I saw with my brief tests.

Now that I come to think of it, the source of my confusion actually came from Adobe. Before they rolled out the Cloud version, there was a brief period when the Classic version was called Creative Cloud (CC)… I have been a "Classic" user since version 2 and never really switched to the Cloud version except when working on iOS.
 

Onimusha370

macrumors 65816
Aug 25, 2010
1,039
1,506
Yeah, horrendous naming, haha. I've only started using Lightroom in the last few months and went for the Creative Cloud version, mainly because it looks much more modern and friendly for someone who's never used it before. I noticed a big speed bump with the M3 Max, but it's amazing how quickly you get used to the speed and want more!
 

HDFan

Contributor
Jun 30, 2007
7,290
3,341
I feel reasonably confident that I've established it's not an OS-level throttling (because it does not occur with similar loads from other apps), so I'm assuming it's a LRC issue.

Don't think I've ever heard of an app that monitors system temperatures and changes the CPU clocking. Don't think it's even possible. I've only heard the term "throttling", i.e. slowing down the CPU, in OS performance contexts. As you say, it's likely a Lightroom issue though.
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
I repeated the above test using LRCC (Cloud), this time with 3500 RAWs, so it takes considerably longer to finish. There is no throttling; all E/P/G cores run like in the above test with just 500 photos.
 

Cabin

macrumors member
Oct 11, 2013
54
32
Pa.
So what is your conclusion?
I repeated the above test using LRCC (Cloud), this time with 3500 RAWs, so it takes considerably longer to finish. There is no throttling; all E/P/G cores run like in the above test with just 500 photos.
 

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
Don't think I've ever heard of an app that monitors system temperatures and changes the cpu clocking. Don't think it is even possible. I have only heard the term "throttling", i.e. slowing down the cpu, in OS performance contexts. As you say it is likely a Lightroom issue though.

Yes, I believe the actual definition of the term involves the OS slowing down the clock speed... you can actually see a bit of this in my graph screenshot, with the clock speed dropping slightly as the temperature peaks, and then going back up as the temperature drops. I don't think that's Lightroom; that's the OS. But the big CPU (and to a lesser extent, GPU) dip is likely the app's doing.
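Put into rough numbers (purely illustrative figures, not measurements): OS throttling lowers the clock while the CPU stays pegged, whereas app-side backing off leaves the clock alone but feeds less work.

```python
def effective_throughput(max_ghz, actual_ghz, utilization):
    """Crude proxy for delivered work: fraction of max clock sustained
    times fraction of CPU actually used."""
    return actual_ghz / max_ghz * utilization

# Hypothetical numbers for the two regimes described above:
os_throttle = effective_throughput(4.0, 3.4, 1.00)  # clock dips ~15%, CPU pegged
app_backoff = effective_throughput(4.0, 4.0, 0.60)  # full clock, app only feeds 60%

print(f"OS-level throttling: {os_throttle:.0%} of peak")
print(f"App-side backoff:    {app_backoff:.0%} of peak")
```

Same headline "slowdown" either way, but only the second one leaves the fans idle, which matches what the graphs show.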
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
So what is your conclusion?
You only get to see the sine wave dip when using LR Classic, right? Combining our results, it seems to be a case of worse optimization than the Cloud version.

The M3 Max of course has a point at which throttling kicks in, but this time the ceiling looks to be really high, not easily reachable with "just Lightroom".
 

spinergy

macrumors member
Oct 16, 2016
44
22
Could you, @macphoto861, compare other functions in LR Classic like panorama stitching, AI denoise, and auto-masking subjects, and write up a little comparison between the M1 Max and M3 Max? Thanks in advance!
 

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
could you @macphoto861 compare other functions in LR Classic like Panorama Stitching, AI denoise, Auto masking subjects and write up a little comparison between M1 Max and M3 Max? Thanks in advance!
I don't do panoramas, so I don't have any appropriate source images to try.

To test AI masking, I made a subject mask on the first of 50 images, and synced to the other 49 (so, IOW, it created subject masks on those 49 images)... 51 seconds on the M1 Max, 41 seconds on the M3 Max.

I've never used the AI denoise thing before, but applying it to 10 images, the M1 Max finished in 4:35, while the M3 Max took 4:40 (yes, 5 seconds longer). GPU was fully utilized in both instances, with just a little CPU activity.
 

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
could you @macphoto861 compare other functions in LR Classic like Panorama Stitching, AI denoise, Auto masking subjects and write up a little comparison between M1 Max and M3 Max? Thanks in advance!
A few more tidbits.

I applied the "Subject Pop" AI filter to 353 images... M1 Max 4:01, M3 Max 3:24. Faster, but not mind-blowingly so.

I'm assuming the mechanism behind calculating masks for an AI preset like this (or syncing an AI mask to multiple images) is the same as when you hit the Select Subject button, so similar speed improvements would be expected. Although it "feels" faster on the M3 Max, it's hard to measure that difference. So what I did was just step through a series of 50 images, creating a Subject mask for each, as quickly as I could. Of course that's an imprecise test, but it's the best I could do. I did several runs on each machine, and the average time to complete this was 1:37 on the M1 Max and 1:14 on the M3 Max.

The Denoise result (in my previous post) is puzzling. I ran it again, and got the same time.
 

macphoto861

macrumors 6502
Original poster
May 20, 2021
496
444
By comparing these two apps I gotta say LRCC actually is better optimised for M3 Max for now, its utilisation is better than on LRC.
I did some more extensive testing over the past few days. The "sine wave" behavior (where the temp and CPU/GPU ramp up quickly, then drop drastically, before finally leveling off a few minutes later) is inconsistent... sometimes I can get it to happen, other times not. Still trying to figure out exactly what triggers it, but I have an idea (which I'll mention below). But I'll set that aside for a moment and concentrate on LRC's performance when it's NOT doing that.

LRCC performs MUCH better on exporting, not just on the M3 Max but also on the M1 Max. But the difference is significantly greater on the M3 Max. Example, exporting 1000 images:

LRC M1 Max – 26:58
LRCC M1 Max – 14:54

LRC M3 Max – 20:16
LRCC M3 Max – 9:57

So, LRCC is twice as fast on the M3 Max, and faster (still significantly, but to a lesser degree) on the M1 Max.
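Turning those times into ratios (straight arithmetic on the numbers above):

```python
def to_seconds(mmss):
    """'26:58' -> 1618 seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

# 1000-image export times from the runs above
times = {
    ("LRC",  "M1 Max"): "26:58",
    ("LRCC", "M1 Max"): "14:54",
    ("LRC",  "M3 Max"): "20:16",
    ("LRCC", "M3 Max"): "9:57",
}

for chip in ("M1 Max", "M3 Max"):
    speedup = to_seconds(times[("LRC", chip)]) / to_seconds(times[("LRCC", chip)])
    print(f"{chip}: LRCC is {speedup:.2f}x faster than LRC")
```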

Here's a screen shot that shows the last several minutes of the LRC export and the first several minutes of the LRCC export on the M3 Max:

[Attached: Screenshot 2023-11-17 at 12.43.43 PM.png]


The difference in power and CPU/GPU usage is quite stark. The clock speed drop and fluctuation in the beginning of the LRCC export is, I believe, how the actual OS-level throttling is implemented. The fans began screaming at this point (it was on "high power" mode), and the clock speed stabilizes as an equilibrium is reached.

Regardless, in all the testing I've done over the past few days, LRC seems to behave like a somewhat tentative driver, on a wide open highway out in the middle of Texas with no cops in sight but still afraid to even slightly exceed the speed limit, while LRCC simply floors the gas pedal.

I have to wonder... is this poor optimization in LRC, or is it intentional (to keep fan noise down and/or retain more resources for the user to work while exporting)? The observations I've made lead me to believe that it might very well be the latter. In particular, I don't think it's OS-level throttling, because if it were, why wouldn't the system respond the same way during LRCC exports? With LRCC, clock speed is temporarily reduced a bit as the fans catch up to the surge in activity, but the CPU stays pegged... that's clearly OS-level throttling. With LRC, it seems like the app itself is reducing the workload it's placing on the CPU.

Also, LRC's performance (at least initially) seems to be heavily dependent on the temperature of the CPU when starting. If the computer is well rested and not hot at all, the "sine wave" behavior is more likely to occur... my theory is that LRC is monitoring the computer's temperature, and it says "great, no heat here, step on the gas!", but then it overshoots as the temperature rises quickly, and it slams on the brakes. After a while, it tentatively starts to accelerate again, reaching what appears to be its target of keeping the CPU at around 80-85 degrees. Similarly, if the temperature is already in the 80-90 range when I begin an export, LRC's CPU usage will be moderated from the very beginning. All this makes me think that it's deliberate... because if it were just poor optimization of the export code, it wouldn't matter what the CPU temperature was (as long as it wasn't excessively hot, so as to trigger the OS to throttle)... whether it was 50C or 90C, I would expect to see identical resource usage when exporting the same batch of images over and over.

I'm no programmer, and I have no idea if this kind of active resource management based on temperature is something that would be incorporated into an app like this. But it's the best explanation I can come up with at present. On the other hand, if it IS the case that Adobe is intentionally limiting LRC's use of resources while exporting, why would Adobe not also choose to apply this same "throttling" (for lack of a better term) to LRCC?
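Just to illustrate what I mean (this is purely hypothetical — I have no idea what Adobe's code does, and the sensor function here is a made-up placeholder), a temperature-targeting scheduler could be as simple as:

```python
TARGET_C = 85      # hypothetical temperature target the app tries to hold
MAX_WORKERS = 14   # one worker per core on a 14-core M3 Max

def read_cpu_temp_c():
    """Made-up placeholder: a real app would need a platform-specific sensor API."""
    raise NotImplementedError

def pick_worker_count(temp_c, current):
    """Simple back-off: shed workers fast when hot, add them back slowly."""
    if temp_c > TARGET_C + 5:
        return max(1, current - 2)            # overshot: brake hard
    if temp_c < TARGET_C - 10:
        return min(MAX_WORKERS, current + 1)  # lots of headroom: accelerate gently
    return current                            # near target: hold steady

# The asymmetry (shed 2, add 1) would reproduce the shape I observed:
# fast ramp, sharp dip after the temperature spike, then a slow crawl back up.
```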
 

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
I am more inclined to think that Classic just needs manual optimizations, where Cloud does not. Classic has an inherited codebase from decades ago, when multi-threading on PCs wasn't even mainstream yet. Ever since the split of Classic and Cloud, every time there is a new generation of x86 chips, or in this case Apple Silicon, we stand a good chance of seeing the Classic version not keeping up right off the bat. It looks as if someone has to manually tell Lightroom how many cores are present, or else it will just assume you are on a previous-gen chip. This was more obviously the case a few years ago, when most of Classic's tasks were only taking advantage of 4 cores despite throwing something like 16 cores at them.
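The core-count side of that is at least easy to check from outside the app, since the OS reports the topology directly (the two sysctl keys below are the Apple Silicon ones for P- and E-core counts, to the best of my knowledge):

```python
import os, subprocess

print("logical CPUs:", os.cpu_count())

# macOS-specific: performance vs efficiency core split on Apple Silicon.
# perflevel0 = P-cores, perflevel1 = E-cores.
for key in ("hw.perflevel0.logicalcpu", "hw.perflevel1.logicalcpu"):
    try:
        n = subprocess.run(["sysctl", "-n", key],
                           capture_output=True, text=True, check=True).stdout.strip()
        print(key, "=", n)
    except (FileNotFoundError, subprocess.CalledProcessError):
        pass  # not on macOS, or key unavailable on this machine
```

So the cores are discoverable; if Classic under-uses them, that's its own scheduling, not a discovery problem.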
 