
TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
Weird, over the years my Windows laptops with high-end i7 processors, plenty of RAM, and Nvidia GPUs have handled heavy loads over time just fine. And my RTX 4060, i7-13700HX, and 32GB of RAM (which I paid a pittance for and upgraded myself) do fantastic over time. It definitely outperforms my M1 MBP 13" with 16GB of RAM. Granted, it uses like 10X the power to do so. LOL.
The 13" MBP is not the one to compare with higher-end laptop GPUs. I have tried the ones with a 3090, 3080, or 3070, and even the ones with a 4080 or 4090. My M1 Max 64 GB MBP may be slower, but it doesn't throttle or run out of memory like those gimped 16 GB VRAM Nvidia laptop GPUs. Like I said, they do OK for gaming, but for sustained GPU loads they are duds. For my use case an i7 won't cut it, and those 12th- and 13th-gen i9s are heaters.
Oddly enough, my workstation with an RTX 4090 (24 GB) runs out of memory, and I am forced to use my M1 Max with 64 GB of unified memory. It's a lot slower but gets the job done. I am hoping that by the M5 or M6, Apple can get out a laptop with 256 GB of unified memory and a GPU comparable to an RTX xx80-series GPU.
 
  • Like
Reactions: Ethosik

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Well, I think I can partly see where he is coming from. Specs can be better, but to me, most of the time the software on the Mac side just beats out the Windows side. Especially the OS. I don't even mind paying $500 more just for the ability to use macOS. Final Cut Pro, Logic Pro, Transmit, and many other Mac apps are so beautifully designed and are great compared to their Windows equivalents, which more often than not are open-source recommendations with 2000-era UIs.
MacBooks may have intangible advantages over PC laptops that are difficult to explain in a review. So, I firmly believe that it is up to Apple's marketing department to explain those advantages. Anyway, what should the comparison between MacBooks and PC laptops look like to better reflect the intangible advantages of one over the other?
 
  • Like
Reactions: TechnoMonk

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
MacBooks may have intangible advantages over PC laptops that are difficult to explain in a review. So, I firmly believe that it is up to Apple's marketing department to explain those advantages. Anyway, what should the comparison between MacBooks and PC laptops look like to better reflect the intangible advantages of one over the other?
I think Apple does a good job when it comes to notebooks. Apple is never gonna win the spec war. They rarely get into spec comparisons; most of their marketing is about other factors. The advantages are a toss-up when it comes to desktop/workstation Macs.
 
  • Like
Reactions: Xiao_Xi

Basic75

macrumors 68020
May 17, 2011
2,099
2,446
Europe
So your argument is that MacBooks and Windows laptops shouldn't be compared, because most consumers have already decided on Mac vs. PC, and therefore comparisons within the Mac range and within the PC range are more useful.

I would disagree. Most people I know who buy laptops have not decided on Mac or Windows and go into the market looking at both. Ultimately it's exactly these Mac vs. PC reviews that help them decide which route to go.
Here's my experience. I know a lot of people that have been buying laptops over multiple decades. Not a single one of them has ever chosen between Mac or PC based on the hardware specs. It always came down to a choice of operating system.
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
The 13" MBP is not the one to compare with higher-end laptop GPUs. I have tried the ones with a 3090, 3080, or 3070, and even the ones with a 4080 or 4090. My M1 Max 64 GB MBP may be slower, but it doesn't throttle or run out of memory like those gimped 16 GB VRAM Nvidia laptop GPUs. Like I said, they do OK for gaming, but for sustained GPU loads they are duds. For my use case an i7 won't cut it, and those 12th- and 13th-gen i9s are heaters.
Oddly enough, my workstation with an RTX 4090 (24 GB) runs out of memory, and I am forced to use my M1 Max with 64 GB of unified memory. It's a lot slower but gets the job done. I am hoping that by the M5 or M6, Apple can get out a laptop with 256 GB of unified memory and a GPU comparable to an RTX xx80-series GPU.
What are you doing that involves those kinds of workloads? I would venture to say that your situation is not that of basically the rest of the population...
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
Here's my experience. I know a lot of people that have been buying laptops over multiple decades. Not a single one of them has ever chosen between Mac or PC based on the hardware specs. It always came down to a choice of operating system.
This is a great point. I looked at the fact that I would always have a gaming PC/laptop and decided that I can use that for rendering or anything of that nature, as well as gaming.

This means I can definitely get by with an M1 13" MacBook Pro with 16GB RAM for everything else (and even for creating videos and the like; it's a great processor).

But never did I think to use a Mac for gaming, and there is a lot of software that is better on, or exclusive to, the Mac (such as Vellum)... so the software definitely comes first.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
What are you doing that involves those kinds of workloads? I would venture to say that your situation is not that of basically the rest of the population...
I do a lot of ML/DL inference on custom models, and occasionally upscale videos to 4K/8K with custom models. I wouldn't call it a unique situation; lots of folks have started using ML/DL for upscaling work and other tasks. Of course, any model training needs a high-end Nvidia GPU in the cloud. Things have gotten a lot better with Apple silicon in the past year or so. It's not perfect, but it's a good complement to my AMD/Nvidia workstation, and it saved me a bunch of money on a more expensive 48 GB or higher GPU.
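For anyone curious what splitting work between an Nvidia box and Apple silicon looks like in code, here is a minimal device-fallback sketch. The `pick_device` helper is hypothetical; the real PyTorch availability checks it stands in for are named in the comment.

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Prefer an Nvidia GPU, fall back to Apple's Metal (MPS) backend, else CPU."""
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"  # Apple silicon GPU via the Metal Performance Shaders backend
    return "cpu"

# In actual PyTorch code the two flags would come from
# torch.cuda.is_available() and torch.backends.mps.is_available().
```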
 
  • Like
Reactions: eltoslightfoot

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I am hoping that by the M5 or M6, Apple can get out a laptop with 256 GB of unified memory and a GPU comparable to an RTX xx80-series GPU.
I don't think that will ever happen, because Apple does not have a significant efficiency advantage in GPUs. 256 GB RAM could be possible, but the GPU will be weaker than what Nvidia is offering.

The M2 Ultra GPU is comparable to a mobile 4090, which is effectively a downclocked desktop 4080. And if you clock the mobile 4090 close to the M2 Ultra, it should not use much more power. The reasonable expectation is that the situation will remain roughly the same and an Mx Max will offer one third of the GPU performance of a desktop RTX XX80.
 
  • Like
Reactions: eltoslightfoot

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
I don't think that will ever happen, because Apple does not have a significant efficiency advantage in GPUs. 256 GB RAM could be possible, but the GPU will be weaker than what Nvidia is offering.

The M2 Ultra GPU is comparable to a mobile 4090, which is effectively a downclocked desktop 4080. And if you clock the mobile 4090 close to the M2 Ultra, it should not use much more power. The reasonable expectation is that the situation will remain roughly the same and an Mx Max will offer one third of the GPU performance of a desktop RTX XX80.
I would be happy with 50% of the performance of an xx90-series Nvidia GPU if a future MBP Mx Max can support 256 GB of unified memory. It's no good to have a powerful GPU that is capped at 24 GB and runs out of memory.
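A back-of-the-envelope sketch of why the 24 GB cap bites: model weights alone can blow past it at common precisions. The 13B-parameter figure below is illustrative, not any specific model.

```python
def weights_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights alone (fp16 = 2 bytes/param).
    Activations, caches, and framework overhead come on top of this."""
    return n_params * bytes_per_param / 1e9

# A hypothetical 13B-parameter model:
print(weights_gb(13e9))     # 26.0 GB in fp16: already over a 24 GB card
print(weights_gb(13e9, 4))  # 52.0 GB in fp32: still fits in 64 GB unified memory
```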
 
  • Like
Reactions: ArkSingularity

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I would be happy with 50% of the performance of an xx90-series Nvidia GPU if a future MBP Mx Max can support 256 GB of unified memory. It's no good to have a powerful GPU that is capped at 24 GB and runs out of memory.
That would require a laptop with Mx Extreme, because the Ultra is too small for that kind of performance.

The GPU market is really weird right now. Nvidia intentionally cripples its consumer GPUs with limited memory and low FP64 performance, because the data center market is far too lucrative. Meanwhile Apple makes compromises with weak GPUs, because it's more interested in making consumer devices. Nobody wants to make powerful but cost-effective GPUs with enough memory anymore.
 
  • Like
Reactions: eltoslightfoot

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
That would require a laptop with Mx Extreme, because the Ultra is too small for that kind of performance.

The GPU market is really weird right now. Nvidia intentionally cripples its consumer GPUs with limited memory and low FP64 performance, because the data center market is far too lucrative. Meanwhile Apple makes compromises with weak GPUs, because it's more interested in making consumer devices. Nobody wants to make powerful but cost-effective GPUs with enough memory anymore.
Not really. The M3 has closed the gap much more than the M1 and M2 did. Apple has a great opportunity with unified memory; now, with more optimizations starting with the M3, they can focus on improving the GPU further. Not to mention that library support for Apple silicon is night and day compared to 1-2 years ago. The M5 is still a couple of years away; it's not far-fetched if they keep improving.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Not really. The M3 has closed the gap much more than the M1 and M2 did. Apple has a great opportunity with unified memory; now, with more optimizations starting with the M3, they can focus on improving the GPU further. Not to mention that library support for Apple silicon is night and day compared to 1-2 years ago. The M5 is still a couple of years away; it's not far-fetched if they keep improving.
Apart from ray tracing acceleration, the jump in GPU performance from M2 to M3 was not that significant. GPU performance is not something Apple can really improve, as it's mostly about brute force. If you want a faster GPU, you add more cores and run them at a higher clock rate. It's more about TSMC than Apple. Architectural improvements do matter, but they are less important than on the CPU side.
 
  • Like
Reactions: eltoslightfoot

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
Dell specifically tends to build generic boxes, especially if you buy below the XPS line.

I’d say it’s really up to people and their use case. I value quality of life, high performance, long lasting battery life, not having performance drop because it’s not plugged in, and most importantly the software stack and ecosystem.

I think the vast majority of Windows users boot up Windows and start using their browser like it's a Chromebook (probably why Chromebooks are popular). That's not meant as a dig at Microsoft or Google. Consumer app development for Windows specifically seems to have dried up and turned into Progressive Web Apps. Even corporations and enterprises can get access to bread-and-butter apps like Office through a browser now.

The only exception I see for general consumers is gaming, and gaming without a PC is much less expensive thanks to consoles.

To be clear, I don't hate Microsoft or Google. On the enterprise side, especially with Azure, I think they have a very compelling story. But for probably 60%+ of what Windows users need, they'll end up running things in a browser, with no deep integration like Continuity and virtually no support for the machine-learning-accelerated tasks that, again, 60% of the consumer base needs.

No one is “wrong” for buying a Windows or ChromeOS system. It’s just different priorities.
 
  • Like
Reactions: TechnoMonk

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
Apart from ray tracing acceleration, the jump in GPU performance from M2 to M3 was not that significant. GPU performance is not something Apple can really improve, as it's mostly about brute force. If you want a faster GPU, you add more cores and run them at a higher clock rate. It's more about TSMC than Apple. Architectural improvements do matter, but they are less important than on the CPU side.
Not true at all. Apple has added a lot of optimizations. The GPU focus so far has been on increasing TFLOPS and core counts; optimizing the GPU is literally low-hanging fruit at the moment. The Dynamic Caching Apple introduced with the M3 is a basic implementation with a lot more scope for improvement. I have a friend running workflows similar to mine, and his M3 Max is comparable to an M2 Ultra and a 4060/70 in some use cases. I am excited to see more optimizations in the next 2-3 iterations. Hopefully Apple can keep increasing unified memory every iteration.
Yep. It's a good start, along with Apple's patents on those optimizations.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Not true at all. Apple has added a lot of optimizations. The GPU focus so far has been on increasing TFLOPS and core counts; optimizing the GPU is literally low-hanging fruit at the moment. The Dynamic Caching Apple introduced with the M3 is a basic implementation with a lot more scope for improvement.
Apple added a lot of marketing material about it. Actual GPU performance obviously depends on the software, the drivers, and the firmware, and it has been like that for decades. Everyone has been optimizing everything related to GPUs as long as GPUs have existed, because that's just basic competence. There is no reason to believe Apple would have a lasting advantage there. Software just doesn't work like that.

In the end, GPUs are just massively parallel processors. If you want to make them faster, you have to make them even more massively parallel.
 
  • Like
Reactions: pdoherty

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Apart from ray tracing acceleration, the jump in GPU performance from M2 to M3 was not that significant. GPU performance is not something Apple can really improve, as it's mostly about brute force. If you want a faster GPU, you add more cores and run them at a higher clock rate. It's more about TSMC than Apple. Architectural improvements do matter, but they are less important than on the CPU side.

I really disagree. The changes to the GPU are substantial, and the compute engine behaves very differently. It has more internal parallelism, better execution efficiency, and, what is probably most important, it sets a solid foundation for future improvements.


In the end, GPUs are just massively parallel processors. If you want to make them faster, you have to make them even more massively parallel.

That's what they did with the G16. But pure parallelism is just one part of the equation; another is how easy that parallelism is to harness. And Apple is focusing on that second part with their architecture.
 

ChrisA

macrumors G5
Jan 5, 2006
12,917
2,169
Redondo Beach, California
What if one doesn't need a gaming rig? What if one is, say, a software developer who needs a fast laptop to build some non-OS-specific code? Or a data scientist? Or a video editor? Or an artist? Or a researcher who works with LaTeX and Python? You can use different tools to solve the same problem. It makes perfect sense to evaluate different options and their pros and cons.
That would be me. Buy a used, off-lease Xeon workstation with 16 cores and 64GB of RAM for like $550. For embedded software development, a 3-to-5-year-old top-tier Intel system that runs Linux is perfect. I'm typing this on an M2 Pro Mini because the Mac is quite a lot better at everything else.

Most users, though, would do best on a Chromebook.

I agree that what is best depends on what you do, and in some cases you might end up with multiple computers.
 

Basic75

macrumors 68020
May 17, 2011
2,099
2,446
Europe
But pure parallelism is just one part of the equation; another is how easy that parallelism is to harness.
That is very true, and examples abound. Take the Cell from the PS3: a lot of horsepower, but nearly impossible to fully harness. Or GPUs that hardly ever run anywhere near their theoretical peak performance in real-world scenarios. Or 4/6/8-wide processor cores that are lucky to average more than 1, let alone 2, IPC over a wide range of loads.
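That gap between peak and achieved performance is easy to put a number on: utilization is just sustained throughput divided by theoretical peak. A trivial sketch, with made-up numbers for illustration:

```python
def utilization(achieved_tflops: float, peak_tflops: float) -> float:
    """Fraction of theoretical peak a workload actually sustains."""
    return achieved_tflops / peak_tflops

# A kernel sustaining 4 TFLOPS on a nominally 16 TFLOPS GPU:
print(utilization(4.0, 16.0))  # 0.25, i.e. only a quarter of peak
```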
 
  • Like
Reactions: TechnoMonk

Rafterman

Contributor
Apr 23, 2010
7,267
8,809
For what it's worth, the 38-core M2 Max runs at about 13 TFLOPS of graphics performance. The Nvidia 4060 (found in the Dell XPS 15) is about 15 TFLOPS. The 40-core M3 Max is about 16 TFLOPS.
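For context, those headline figures follow from a simple formula: cores × FP32 ALUs per core × 2 (a fused multiply-add counts as two FLOPs) × clock. The ALU count and clock below are commonly cited estimates for the M2 Max, not Apple-published specs:

```python
def peak_tflops(cores: int, alus_per_core: int, clock_ghz: float) -> float:
    """Theoretical peak, assuming each ALU retires one FMA (2 FLOPs) per cycle.
    cores * alus * 2 gives FLOPs/cycle; times GHz gives GFLOPS; /1000 -> TFLOPS."""
    return cores * alus_per_core * 2 * clock_ghz / 1000.0

# Estimated M2 Max figures: 38 cores, 128 ALUs/core, ~1.4 GHz
print(round(peak_tflops(38, 128, 1.4), 1))  # ~13.6, in line with the ~13 TFLOPS above
```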
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
For what it's worth, the 38-core M2 Max runs at about 13 TFLOPS of graphics performance. The Nvidia 4060 (found in the Dell XPS 15) is about 15 TFLOPS. The 40-core M3 Max is about 16 TFLOPS.

This used to mean at least something for the M1 and M2 series, but now that the M3 can execute different types of instructions concurrently, these figures are much less useful. E.g. I measure 20 TFLOPS on my 30-core M3 Max by interleaving FP32 and FP16 operations.

And to make things even more complicated, the M3 dramatically changes the hardware-utilisation side of the equation. It needs fewer concurrent kernels than the M1/M2 to effectively hide execution latency, AND it can fit more concurrent kernels thanks to its new resource allocation scheme. The net effect is that the new hardware is less likely to waste cycles. E.g. if you look at Blender 3.6 benchmarks, the M3 series manages 40-60% higher performance than the M2 with the same nominal TFLOPS figures, and that's without any hardware ray tracing.
 

Rafterman

Contributor
Apr 23, 2010
7,267
8,809
This used to mean at least something for the M1 and M2 series, but now that the M3 can execute different types of instructions concurrently, these figures are much less useful. E.g. I measure 20 TFLOPS on my 30-core M3 Max by interleaving FP32 and FP16 operations.

And to make things even more complicated, the M3 dramatically changes the hardware-utilisation side of the equation. It needs fewer concurrent kernels than the M1/M2 to effectively hide execution latency, AND it can fit more concurrent kernels thanks to its new resource allocation scheme. The net effect is that the new hardware is less likely to waste cycles. E.g. if you look at Blender 3.6 benchmarks, the M3 series manages 40-60% higher performance than the M2 with the same nominal TFLOPS figures, and that's without any hardware ray tracing.

Oh, I agree that measurements like TFLOPS are no final word on anything, as there are a lot of factors that go into graphics performance. But it's an interesting starting point for discussion.
 
  • Like
Reactions: 3Rock