
danano

macrumors member
Original poster
Dec 3, 2017
32
19
Is there a way to get Apple Photos to use GPU acceleration? I suspect there may be some parameter which can be set via the command line.
 

danano

macrumors member
Original poster
Dec 3, 2017
32
19
Why do you think that Photos is not using hardware acceleration?

Watch Activity Monitor. In the main window, the % CPU and % GPU columns indicate which processor is doing the work. The CPU History and GPU History windows confirm over a period of time what the main window reports.

Note that this observation applies to the Mac Pro 7,1 (2019). The same check on a MacBook Pro 18,2 (2021, Apple M1 Max) shows CPU and GPU cores being used as expected.
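As a side note, the %CPU half of that check can be scripted from the terminal. This is only a sketch, nothing Photos-specific: ps has no GPU column, so only the CPU side is covered, and the process name is just whatever you pass in:

```shell
# Sum the %CPU of every process whose command name contains the given string.
# ps has no %GPU column, so this only mirrors Activity Monitor's CPU side.
cpu_of () {
  ps -Ao %cpu,comm | awk -v name="$1" \
    'index($0, name) { s += $1 } END { printf "%.1f\n", s + 0 }'
}
cpu_of Photos   # prints 0.0 if Photos isn't running
```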

Consistent with this observation, I asked Apple Support and was told they could not tell me how to turn on GPU processing in Photos. This contrasts with "professional" software such as Final Cut Pro or DxO PhotoLab, both of which have preference settings that let the user choose which GPU, if any, the app uses.

I suspect some parameter(s) in Photos can be set by command line, which will allow Photos to use the GPU. Probably Apple's official policy is to not discuss this, just as they do not discuss the settings which allow older hardware to run newer versions of macOS.
 

danano

macrumors member
Original poster
Dec 3, 2017
32
19
If Photos can't use GPU acceleration, you can't even scroll smoothly.
GUI scrolling is a function of the Finder, not Photos. Since the Finder clearly does use the GPU, why not Photos? I suspect Apple made this arbitrary choice.
 

Slartibart

macrumors 68040
Aug 19, 2020
3,145
2,819
I'd like to suggest something: if hardware acceleration is enabled system-wide (you can check with e.g. VideoProc), my expectation is that you can play e.g. this HEVC video via Quick Look (or QuickTime Player) with low CPU usage. Could you then kindly add it to Photos, play it there, and check with Activity Monitor?

I find it difficult to imagine that Apple turns off the hardware-accelerated system frameworks for Photos. Your reference to e.g. Final Cut Pro seems to be a bit of a non sequitur - FCP offers an option to decide which GPU to use when rendering or sharing; the real-time effects use the default GPU… something Photos probably does as well, if possible*¹. There is not much you can do in Photos to stress the system - okay, maybe that's a different thing when curating a large photo library - besides face embedding. I am not sure how this is implemented on Intel Macs; maybe you'd like to check? My hypothesis would be that Photos partly uses Metal*¹ - which is GPU accelerated. I mean, Apple has been having a crack at GPU-accelerated compositing for a while (Quartz Composer, anyone? 🤓)

Regarding the possible settings you hope to de/activate, in the form of something like $ defaults write com.apple.Photos GpuAcceleration -bool true or similar (that key name is just a guess) - you can check the XML property lists of Photos, which you should find in ~/Library/Preferences or /Library/Preferences, to see what you can set.
To check the system's global defaults and save them to a text file, use:
defaults read NSGlobalDomain > DefaultsGlobal.txt
and then open DefaultsGlobal.txt in the text editor of your choice. But I doubt there is anything there; otherwise e.g. TinkerTool would probably already provide a GUI for it, but who knows?
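A concrete sketch of that search, assuming the domain name com.apple.Photos. The dump below is simulated (both keys in it are invented for illustration) so the grep can be tried anywhere; on a real Mac you would instead generate the file with defaults read com.apple.Photos > PhotosDefaults.txt:

```shell
# Simulated dump - on a Mac, replace this heredoc with:
#   defaults read com.apple.Photos > PhotosDefaults.txt
cat > PhotosDefaults.txt <<'EOF'
{
    ExampleKeyOne = 1;
    ExampleKeyTwo = 7020;
}
EOF
# Look for anything resembling a GPU/acceleration toggle.
grep -iE 'gpu|metal|accel' PhotosDefaults.txt || echo "no GPU-related keys found"
```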
 
Last edited:

h9826790

macrumors P6
Apr 3, 2014
16,656
8,587
Hong Kong
GUI scrolling is a function of the Finder, not Photos. Since the Finder clearly does use the GPU, why not Photos? I suspect Apple made this arbitrary choice.
Not really. You can try Aperture; it won't give you smooth scrolling. Finder is Finder, Photos is Photos.
Watch Activity Monitor. In the main window the % CPU and % GPU columns indicate which processor is doing the work. The windows which show CPU History and GPU History confirm for a period of time what the main window reports.

Note that this observation applies to the Mac Pro 7,1 (2019). The same observation on a Mac Book Pro 18,2 (2021, Apple M1 Max) shows CPU and GPU cores being used as expected.

That GPU usage figure isn't necessarily accurate. e.g. for GPU decoding, that usage may not be included.

A similar example: macOS can show the RX 580's GPU temperature, but not the Radeon VII's.

The same applies to GPU usage: macOS may include M1 GPU decoding in the Activity Monitor GPU usage figure, but may not do so for the GPU in your 7,1.

Of course, I am not saying your complaint isn't valid. But we'd better have some more solid direct evidence (e.g. no smooth scrolling in Photos). If Photos works as expected, with no apparent performance issue, it's hard to conclude that the GPU isn't working.

Another possible reason is that Photos uses the M1 Neural Engine, which macOS may count as part of the M1 GPU usage - something the 7,1 doesn't have.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Photos will automatically use the GPU that is attached to the monitor your Photos window is on.

Some operations work better on the CPU than the GPU - those will be done through Apple's Accelerate framework. Because it takes time to transfer a photo to the GPU (and back) for processing, sometimes the CPU is best.

Let Photos make the decision that is best for your hardware. It's not worth second-guessing. The assumption that the GPU is always faster for photo work is wrong.
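That transfer-cost point can be put into back-of-envelope numbers. Every figure below is an invented assumption (per-pixel throughputs, bus speed, dispatch overhead), not a measurement; the sketch only illustrates why a fixed round-trip cost can make the CPU the better choice for small images:

```shell
# Toy cost model, all numbers illustrative: CPU at 4 ns/px, GPU kernel at
# 0.5 ns/px, ~12 bytes/ns over the bus with 8 B/px crossing twice
# (upload + download), plus ~100 us fixed dispatch overhead per GPU job.
where_runs_faster () {
  pixels=$1
  cpu_ns=$(( pixels * 4 ))
  gpu_ns=$(( 100000 + 2 * pixels * 8 / 12 + pixels / 2 ))
  if [ "$gpu_ns" -lt "$cpu_ns" ]; then echo GPU; else echo CPU; fi
}
where_runs_faster 24000000   # full 24 MP photo  -> GPU
where_runs_faster 10000      # 100x100 thumbnail -> CPU
```

With linear per-pixel costs alone the winner would never depend on image size; it is the fixed dispatch and transfer overhead that tips small jobs back to the CPU.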
 

danano

macrumors member
Original poster
Dec 3, 2017
32
19
Many thanks to the three of you who responded. Following your suggestions I think I have learned a good bit about my computer, in addition to the particular issues I brought up. I will summarize my current understanding.

I looked everywhere suggested and did not find any preference setting that would allow GPU use to be modified. Probably the Photos code is what it is and is not adjustable in this way.

I downloaded VideoProc and the Swordsmith 4K HEVC video suggested for testing. The Hardware Info Detected by VideoProc Converter tells me acceleration is available for everything involving H.264 video, but not for HEVC. I get the same result whether Use High Quality Engine is checked or not. Among the various outputs which can be selected, it seems hardware acceleration is available for formats which use H.264 but almost nothing else. The one exception listed, output to HEVC video, is so rare as to lead me to suspect this is an error in the information provided.

It is very interesting that, according to VideoProc Converter, my Radeon Pro Vega II, which I thought was fairly current, is not capable of hardware HEVC work. I also found this post: HEVC Hardware Accelerated encoding no longer works in Monterey - so I am not alone in my observation. I suspect Apple might want to obscure this significant limitation.

The face embedding link was very informative. The described new complex algorithm creates and ranks many pair-wise comparisons, and is said to greatly increase the speed and accuracy of face recognition. I believe this is processed at least in part by the Neural Engine present in Apple Silicon/iOS devices, but lacking in Intel-based Macs such as my Mac Pro. This would help explain face recognition proceeding faster on my MacBook Pro than on my Mac Pro. This also suggests that Apple’s software focus has been on Photos in iOS, and not on Photos in macOS.

Perhaps the lack of hardware acceleration I suggested is due to a relatively feeble implementation of Neural Engine functionality in macOS software, plus my GPU's inability to process HEVC images. I do not know the extent of this limitation, but expect it reduces the potential benefit from this software implementation.

This may partially explain the instability I recently observed in Photos. Supposedly Photos analyzes images for faces, then assigns the faces to a Person through background processing. My Mac has been working on this most of the time since February 2020. It has not completed the analysis, and it often appeared to have stopped working in the background, so I gave up waiting for Photos to automatically identify and name faces. For the last month or so I manually tagged faces, and Photos crashed once or twice a day, or more.

My guess now is that the code used in macOS Photos and iOS Photos is similar but differs in its calls for hardware support, since the available hardware differs. For example, relatively simple iOS calls to Apple Silicon's Neural Engine would correspond in macOS to much more complex calls to Intel-based hardware. If Apple is more focused on iOS, we could expect the more complex programming required for macOS on Intel to be less optimized and debugged. This might explain what I saw as poorer performance and lack of hardware acceleration.

A few days ago macOS 12.5 was released, with few changes noted. However, since updating, I have noticed a remarkable change. Photos now runs constantly in the background, with Activity Monitor showing CPU use steady near 100%. My guess is that from 12.4 to 12.5, among other things, Apple debugged and improved Photos. I hope the constant processing I see now reflects a more functional face recognition process.

Lots of speculation and guesses here. Please correct me where I am wrong. Again, thanks to all for your help.
 

danano

macrumors member
Original poster
Dec 3, 2017
32
19
One more note to add.

The new and improved face recognition processes use much information in addition to the face image. Examples are the time, date and location of the photo, and the presence of other factors in adjacent photos, where adjacent essentially means in the same moment. Most of my identified Persons are in photos with a date and time stamp applied when the photo was taken. However, some photos of the same Person are scans of analog photos. For scans, often the time and date are when the scan occurred, not when the photo was taken. If not corrected, the date and location can be inaccurate and off by decades and continents. After I realized the importance of this data, I manually corrected many photos. After correction, Photos has processed these individuals much more thoroughly.

Based on this observation, I believe that before the corrections, Photos was choking on bad data. GIGO is the old-time description: Garbage In, Garbage Out. If the function of Photos seems lacking, make sure all the information in the data sets used for training is accurate.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Perhaps the lack of hardware acceleration which I suggested is due to relatively feeble implementation of Neural Engine function in macOS software, plus my gpu’s inability to process HEVC images. I do not know the extent of this limitation, but expect it reduces potential the benefit from this software implementation.

Correct. The Neural Engine is Apple Silicon only.

The Neural Engine is also not GPU acceleration - as I said before, don't assume the GPU is faster. There are technologies faster than the GPU; on Apple Silicon, the Neural Engine is faster than the onboard GPU.

HEVC video and HEVC images are also two different things.
 