cookies!

macrumors 6502
Original poster
Jul 3, 2011
456
132
I've heard that in CC (or in general) 2D performance isn't really affected by how powerful the GPU is as long as it is supported for hardware acceleration in the program. Is this true, and does it also apply to Lightroom CC?

I kind of want to downsize to a 2016 13" from my mid-2014 15" for portability, but I'm hesitant to give up my discrete GPU until I know for sure.
 

bent christian

Suspended
Nov 5, 2015
509
1,966
I've heard that in CC (or in general) 2D performance isn't really affected by how powerful the GPU is as long as it is supported for hardware acceleration in the program. Is this true, and does it also apply to Lightroom CC?

I kind of want to downsize to a 2016 13" from my mid-2014 15" for portability, but I'm hesitant to give up my discrete GPU until I know for sure.

dGPU is not a requirement. My Iris Pro 6200 is supported. I think Intel GPUs that utilize at least 1GB of RAM (shared or discrete) will work. The Iris 540 and 550 use up to 1.5GB of system RAM, so they should work for acceleration.

I can't speak to the importance. I am still on LR5. Some people have had issues. I know that.
 

MCAsan

macrumors 601
Jul 9, 2012
4,587
442
Atlanta
I've heard that in CC (or in general) 2D performance isn't really affected by how powerful the GPU is as long as it is supported for hardware acceleration in the program. Is this true, and does it also apply to Lightroom CC?

I kind of want to downsize to a 2016 13" from my mid-2014 15" for portability, but I'm hesitant to give up my discrete GPU until I know for sure.


What does Apple say?
 

thekev

macrumors 604
Aug 5, 2010
7,005
3,343
Lightroom doesn't benefit greatly from your gpu. Some people are paranoid. It has been bottlenecked by everything other than the gpu since its inception, yet people still spread paranoid nonsense suggesting otherwise.
 

thekev

macrumors 604
Aug 5, 2010
7,005
3,343
What do you mean by this?

I mean that the time you spend waiting in Lightroom and even lag you may encounter has very little to do with the GPU. People asked whether they needed a powerful gpu for lightroom as soon as Apple supported OpenCL. The gpu never provided any benefit prior to recent versions. Now it provides minimal benefit. It's not worth anyone's concern. In fact there have been complaints about certain things lagging when gpu acceleration is enabled.
 
  • Like
Reactions: phrehdd

Ray2

macrumors 65816
Jul 8, 2014
1,170
489
I'm totally with thekev on this one. On a max-spec'd rMBP 13" I have GPU acceleration turned off (imports are actually slower with it turned on). Perhaps on the 15" it might make a difference, but given Lightroom's lethargic approach to just about anything, I doubt one would ever notice it. Certainly nothing worth paying for. Adobe still has a long way to go with writing quick, efficient apps.
 
  • Like
Reactions: thekev

Padaung

macrumors 6502
Jan 22, 2007
470
104
UK
From the research I've done in the past, the optimal cost-vs-speed configuration for LR is a quad-core CPU, 8GB+ of memory, and an SSD (if you are only processing JPEG files then a regular HDD would be more than fine too, although LR will take an age to actually start!). The GPU would seem to have minimal effect on performance.

More than 4 cores does increase performance, but only minimally, and for many people the increase in cost would offset the benefit.

The performance increase from 2 cores to 4 cores was significant though.

I'm fairly sure I found this all out by reading multiple articles on http://barefeats.com/
I did this research a year or so ago, so unless Adobe has made a significant change to how LR handles images since then I doubt much has changed...
 
  • Like
Reactions: phrehdd

Kidago

macrumors newbie
Apr 21, 2014
2
1
I run Lightroom daily on a Mac Pro 2013 8-core with the highest-end GPUs. Turning on the switch that tells Lightroom to use the GPUs makes the experience unbearable: it takes a few extra seconds to load previews and to see changes. LR runs much faster without GPU support (for now) and on this specific setup.
 
  • Like
Reactions: phrehdd

Ray2

macrumors 65816
Jul 8, 2014
1,170
489
More than 4 cores does increase performance, but only minimally, and for many people the increase in cost would offset the benefit.

The performance increase from 2 cores to 4 cores was significant though....

Hmmm. Lightroom 6 still employs a single core for most operations, including preview generation, one of the most time consuming tasks. Unless one is doing a lot of CPU intensive background processing (nothing to do with Lightroom), I'd rather have 2 big cores than 4 smaller ones with 2 to 3 of them sitting idle most of the time.

Filed in the "Lightroom needs RAM" bin.
 

monokakata

macrumors 68020
May 8, 2008
2,063
605
Ithaca, NY
Hmmm. Lightroom 6 still employs a single core for most operations, including preview generation, one of the most time consuming tasks. Unless one is doing a lot of CPU intensive background processing (nothing to do with Lightroom), I'd rather have 2 big cores than 4 smaller ones with 2 to 3 of them sitting idle most of the time.

Filed in the "Lightroom needs RAM" bin.
Lightroom CC 2017 (and earlier versions, as I remember) uses all 4 cores on my 5k iMac, when importing and preparing 1:1 previews. Perhaps LR6 used only one core (I can't remember) but CC will use multiple cores.

For me, that's the phase I want to have completed ASAP.
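
That import/1:1 preview phase also parallelizes naturally, since each photo can be rendered on its own. Here's a rough sketch of the idea in Python (nothing to do with Adobe's actual code; render_preview is just a stand-in for the expensive render step):

Code:
# Illustrative only: each raw file is an independent job, so a process
# pool can keep all cores busy during 1:1 preview generation.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def render_preview(raw_path: Path) -> Path:
    # Stand-in for the expensive decode/render/encode step.
    out = raw_path.with_suffix(".preview.jpg")
    # ... decode the raw, apply edits, write a JPEG preview to `out` ...
    return out

def build_previews(raw_files):
    # One worker per core by default; each file maps to one preview job.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(render_preview, raw_files))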
 

thekev

macrumors 604
Aug 5, 2010
7,005
3,343
I'm totally with thekev on this one. On a max-spec'd rMBP 13" I have GPU acceleration turned off (imports are actually slower with it turned on). Perhaps on the 15" it might make a difference, but given Lightroom's lethargic approach to just about anything, I doubt one would ever notice it. Certainly nothing worth paying for. Adobe still has a long way to go with writing quick, efficient apps.

Most of these things have a few very expensive calls that actually do the work. Metal's goal was to reduce driver overhead, so it might improve if they go that route. Lightroom was written long before Metal, so it's not as though they could have anticipated it. Lightroom also broke a lot of old Photoshop raw-processing conventions, because Adobe learned from past development.
 

MCAsan

macrumors 601
Jul 9, 2012
4,587
442
Atlanta
All of this raises serious questions about Lr's very poor performance. Why does Lr need to build previews at all? Luminar does not. Photo RAW, released next week, does not. Does CiP or Affinity or even Photos need to do that? Lr is well past due for a ground-up rebuild... or replacement.
 

bent christian

Suspended
Nov 5, 2015
509
1,966
All of this raises serious questions about Lr's very poor performance. Why does Lr need to build previews at all? Luminar does not. Photo RAW, released next week, does not. Does CiP or Affinity or even Photos need to do that? Lr is well past due for a ground-up rebuild... or replacement.

Lightroom has always been a taped-together meld of ACR and Bridge.
 

thingstoponder

macrumors 6502a
Oct 23, 2014
916
1,100
All of this raises serious questions about Lr's very poor performance. Why does Lr need to build previews at all? Luminar does not. Photo RAW, released next week, does not. Does CiP or Affinity or even Photos need to do that? Lr is well past due for a ground-up rebuild... or replacement.

Don't they all do previews? Unless I'm wrong, a preview is just a temporary JPG from the edits you've done to a RAW file, as once you edit a RAW file it ceases to be a RAW file.
 

MCAsan

macrumors 601
Jul 9, 2012
4,587
442
Atlanta
once you edit a RAW file it ceases to be a RAW file.

You can use the raw file to create raster bitmaps such as JPEG or TIFF that you can edit at the bitmap level. The raw file is still in the file system unless you delete it. Each time you make a new version of the image, you make a new JPEG, TIFF, or other bitmap file.

or

you use the raw file plus sidecar files, where all the editing instructions are stored in the sidecar. You never touch the raw file; you just use it as a data source. You can have any number of small sidecars, each for a separate version of the image. All the sidecars point back to the same raw file, which you do not need to duplicate.
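
If it helps to picture it, here's a rough sketch of the sidecar idea (made-up JSON fields purely for illustration; not Lightroom's actual XMP/catalog format):

Code:
import json
from pathlib import Path

RAW = Path("IMG_0042.CR2")  # the raw file itself is never modified

def save_version(name: str, edits: dict) -> Path:
    # Each version is a tiny sidecar of editing instructions that
    # points back at the same raw file.
    sidecar = RAW.with_suffix(f".{name}.json")
    sidecar.write_text(json.dumps({"source": RAW.name, "edits": edits}))
    return sidecar

# Two versions of the same image, with no duplicated raw data:
save_version("bw", {"saturation": -100, "contrast": 20})
save_version("warm", {"temperature": 500, "exposure": 0.3})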
 

Cheese&Apple

macrumors 68010
Jun 5, 2012
2,004
6,606
Toronto
Don't they all do previews? Unless I'm wrong, a preview is just a temporary JPG from the edits you've done to a RAW file, as once you edit a RAW file it ceases to be a RAW file.

You can use the raw file to create raster bitmaps such as JPEG or TIFF that you can edit at the bitmap level. The raw file is still in the file system unless you delete it. Each time you make a new version of the image, you make a new JPEG, TIFF, or other bitmap file.

or

you use the raw file plus sidecar files, where all the editing instructions are stored in the sidecar. You never touch the raw file; you just use it as a data source. You can have any number of small sidecars, each for a separate version of the image. All the sidecars point back to the same raw file, which you do not need to duplicate.

I understand that a RAW file remains as a RAW file unless deleted. But, I'm confused...without a preview, what are you seeing? What is an application presenting to the user on screen if not a preview?

~ Peter
 

MCAsan

macrumors 601
Jul 9, 2012
4,587
442
Atlanta
I understand that a RAW file remains as a RAW file unless deleted. But, I'm confused...without a preview, what are you seeing? What is an application presenting to the user on screen if not a preview?

~ Peter
Either the small JPEG preview that is embedded inside the raw file, or a new, larger one that the computer has to generate, usually at better quality. That preview can be built by the CPU (old school) or the GPU (new school). A good GPU should be able to generate the large preview very quickly while the CPU does other stuff. As time goes on, having a good GPU will be more and more important for image-processing apps.
Well, if you have a 2K to 5K monitor or multiple monitors, then you're probably going to need a great GPU like the RX 480.

Or if you want a large, high-quality preview and don't want spinning beach balls while waiting for the preview to appear.
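
If you want to see that embedded preview for yourself, a raw decoder can pull it straight out of the file without rendering anything. A quick sketch using the third-party rawpy library (my choice here; any raw library with thumbnail extraction would do):

Code:
# Pulls the small camera-embedded JPEG preview out of a raw file
# without demosaicing or rendering the raw data at all.
import rawpy

def extract_embedded_preview(raw_path: str, out_path: str) -> None:
    with rawpy.imread(raw_path) as raw:
        thumb = raw.extract_thumb()
        if thumb.format == rawpy.ThumbFormat.JPEG:
            # The embedded preview is already a JPEG; just write the bytes.
            with open(out_path, "wb") as f:
                f.write(thumb.data)
        else:
            raise RuntimeError("Embedded preview is not a JPEG")

extract_embedded_preview("IMG_0042.CR2", "IMG_0042_preview.jpg")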
 