
Philip Turner

macrumors regular
Dec 7, 2021
170
111
Do you ever just move your cursor really fast on your MBP screen, to flex your 120 Hz on battery? Or click back and forth between Stage Manager windows as fast as physically possible, to flex the tight integration between the GPU and disk storage?
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
Oof.

ABCD
EFGH
Doing so on Reddit takes an extra middle step by copy pasting to http://tableit.net/

Again... if the whole Mac system has superior performance per watt, posts the same or similar benchmark results, spews out far less waste heat even with the heatsink fan at low RPM, and draws very little power at the wall outlet, then it beats PC parts that require more power than the entire Mac system.

If this was 2002 when chips were 180nm I'd probably buy that time's equivalent of a Mac Studio M1 Ultra. But today M2 Pro 32GB is more than sufficient because it's 5nm.
 

Philip Turner

macrumors regular
Dec 7, 2021
170
111
because it's 5nm
It's not actually 5 nm - that's misleading marketing from semiconductor companies. Transistors stopped scaling once their smallest features reached 22 nm; since then it's been clever techniques to improve density, with node names that could theoretically shrink "smaller than a silicon atom". TSMC N3 effectively killed SRAM scaling, and logic area scaled much less than 2x versus 5 nm, so Moore's Law is on its last stretches.

Meanwhile, we're still completely ignoring a radical rethink of computing. Molecular mechanical computing, scrapping those inefficient crowds of electrons, could improve power efficiency by 1,000,000x compared to modern computers! I'd love to see Apple use it when it becomes feasible (though probably just for matrix multipliers). It's so efficient that the Landauer limit becomes your I/O bottleneck.
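The Landauer-limit claim can be sanity-checked with a few lines of arithmetic. The ~1 fJ CMOS switching energy below is a rough assumption for illustration only; real figures vary widely by process node and circuit:

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T is k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
landauer_J = k_B * T * math.log(2)   # ~2.87e-21 J per bit

# Assumed switching energy of a modern CMOS gate (~1 fJ) for comparison.
cmos_J = 1e-15
headroom = cmos_J / landauer_J       # gap between today's CMOS and the thermodynamic floor

print(f"Landauer limit at 300 K: {landauer_J:.2e} J/bit, headroom: {headroom:.1e}x")
```

Under that assumption there are five to six orders of magnitude between today's switching energy and the thermodynamic floor, which is the headroom the 1,000,000x figure appeals to.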
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
It's not actually 5 nm - that's misleading marketing from semiconductor companies. Transistors stopped scaling once their smallest features reached 22 nm; since then it's been clever techniques to improve density, with node names that could theoretically shrink "smaller than a silicon atom". TSMC N3 effectively killed SRAM scaling, and logic area scaled much less than 2x versus 5 nm, so Moore's Law is on its last stretches.

Meanwhile, we're still completely ignoring a radical rethink of computing. Molecular mechanical computing, scrapping those inefficient crowds of electrons, could improve power efficiency by 1,000,000x compared to modern computers! I'd love to see Apple use it when it becomes feasible (though probably just for matrix multipliers). It's so efficient that the Landauer limit becomes your I/O bottleneck.
I'll stick to industry terms so I do not have to do a TED Talk. ;-)
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Better question... can you run a PC with just a GPU part or CPU part?

Erm... yes?
GPU acceleration only arrived much later in the PC's history, and many systems still run with software rendering alone.

What's trickier is having a GPU-only system, but it's also possible. GPUs are actually CPUs with special instructions for video rendering and acceleration. They have become increasingly powerful and complex – so much so that you CAN run a whole operating system on them. It's just that people don't normally try that.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,673
Molecular mechanical computing, scrapping those inefficient crowds of electrons, could improve power efficiency by 1,000,000x compared to modern computers!

Wow, that's incredibly cool stuff! Would be really funny if we indeed went back to mechanical stuff... still, transistors are likely to be more area-efficient, no?

What's trickier is having a GPU-only system, but it's also possible. GPUs are actually CPUs with special instructions for video rendering and acceleration. They have become increasingly powerful and complex – so much so that you CAN run a whole operating system on them. It's just that people don't normally try that.

I am not sure if I'd agree with this categorisation. GPUs are parallel processors optimised to run programs that operate on multiple data elements simultaneously, and run multiple of those programs concurrently. I don't think a modern GPU can run a full-fledged OS on its own, but if one adds some means to deal with external devices that should be possible. But it would also be fairly dumb as the performance would be very very bad.
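leman's description of the GPU execution model (one program run in lockstep across many data elements) can be sketched as a toy SIMT loop. The `run_warp` helper and the 4-wide "warp" below are purely illustrative assumptions, not how any real driver or scheduler works:

```python
def run_warp(kernel, data, warp_size=4):
    # Execute `kernel` in lockstep across warp_size "threads" (lanes),
    # each lane operating on its own data element - the SIMT model.
    results = []
    for base in range(0, len(data), warp_size):
        # every lane in the warp runs the same instruction stream
        results.extend(kernel(lane, x)
                       for lane, x in enumerate(data[base:base + warp_size]))
    return results

# Toy kernel: the same scale-and-offset program applied to every element.
scaled = run_warp(lambda lane, x: 2 * x + 1, [0, 1, 2, 3, 4, 5])
```

The point of the sketch is that the "program" is scalar but the hardware amortizes it over many elements at once; branchy, single-threaded OS code gets none of that amortization, which is why performance would be so poor.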
 

kasakka

macrumors 68020
Oct 25, 2008
2,389
1,074
How about we quantify the power and price, so we have an honest answer comparing Mac to PC?

Device             Max Power   Compute Power       Cost
M2 SoC             ~35 W       0.8 + 3.6 TFLOPS    $600 (Mac Mini)
M2 Pro SoC         ~67 W       1.2 + 6.8 TFLOPS    $1,300 (Mac Mini)
M2 Max SoC         ~96 W*      1.2 + 13.6 TFLOPS   ~$2,800 (Mac Studio)
M2 Ultra SoC       ~200 W      2.4 + 27.2 TFLOPS   ~$5,000 (Mac Studio)
RTX 4050           ~130 W      ~13 TFLOPS          ~$300
RTX 4060           ~170 W      ~16 TFLOPS          ~$450
RTX 4070           ~220 W      ~29 TFLOPS          ~$700
RTX 4080           320 W       48.7 TFLOPS         $1,200
RTX 4090           450 W       82.6 TFLOPS         $1,600
Intel i3-13300HE   65 W        0.9 TFLOPS          ~$150
Intel i5-13600K    181 W       2.0 TFLOPS          $320
Intel i7-13700K    253 W       2.4 TFLOPS          $410
Intel i9-13900K    253 W       3.6 TFLOPS          $590
* ~96 W (entire SoC), ~50 W (GPU only)

Next, we need the power and price of SSD, CPU RAM, thermal system, and case. That's already included with a Mac, but not a PC. For a fairer comparison, let's use the pre-built Alienware x17 R2 with an i9-12900HK and RTX 3080 Ti Mobile. To avoid generational advantage, use the M1-series Macs.

                   17" Alienware x17 R2   16" M1 Max MBP
Price              $3,799                 $3,500
Max power          300 W                  ~96 W
GPU compute        18.71 TFLOPS*          10.61 TFLOPS
* The "mobile" RTX 3080 Ti is half as powerful as the desktop version.
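The perf-per-watt and perf-per-dollar comparisons these tables invite can be computed directly from the quoted figures (marketing TFLOPS, so treat the output as rough):

```python
# GPU TFLOPS per watt and per $1,000, using rounded figures from the table above.
gpus = {
    "M2 Ultra (GPU)": (27.2, 200, 5000),   # TFLOPS, max watts, USD
    "RTX 4080":       (48.7, 320, 1200),
    "RTX 4090":       (82.6, 450, 1600),
}
for name, (tflops, watts, usd) in gpus.items():
    print(f"{name}: {tflops / watts:.3f} TFLOPS/W, "
          f"{tflops / usd * 1000:.1f} TFLOPS per $1,000")
```

Note the M2 Ultra's ~200 W covers the whole SoC rather than the GPU alone, which understates its TFLOPS/W somewhat in this comparison.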
TFLOPS is honestly a poor metric for performance because each architecture is best suited for specific things and has different design goals. We have seen this thrown around with PC vs consoles and it never ends up being effective for comparison.

It doesn't help that it's often not easy to compare PC vs Mac when your available comparable options are e.g. video/photo editing suites and synthetic benchmarks. Macs don't usually dominate the synthetics but do very well considering their perf/watt. Meanwhile they tend to excel at the video/photo editing stuff, which is also why you see so many of those comparisons on YT: surprise surprise, YouTubers are video editors, so it matters a lot to them.

In Digital Foundry's Resident Evil Village tests, the Mac Studio with M1 Ultra performed similarly to a Windows laptop with a mobile RTX 3080. A Mac system that currently costs way more than my PC with a 4090 doesn't reach even a fraction of its performance, at least in GPU-heavy gaming tasks. Hell, even my previous, much cheaper 3700X + 2080 Ti system outperforms it.

"But think of the power usage!" Honestly, that's not a real concern for a desktop system for most users, even with the very high electricity prices we have been having in the EU. For the record, the 4090 often doesn't get anywhere close to 450 W in actual use either; a more common figure is around 300 W, even while pushing native 4K at 200 fps. In general the Nvidia 40 series is a pretty power-efficient architecture; it's just pushed to the max by default to look good in benchmarks, and you can reduce its power limit by 20-30% without a significant drop in performance.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
I am not sure if I'd agree with this categorisation. GPUs are parallel processors optimised to run programs that operate on multiple data elements simultaneously, and run multiple of those programs concurrently. I don't think a modern GPU can run a full-fledged OS on its own, but if one adds some means to deal with external devices that should be possible. But it would also be fairly dumb as the performance would be very very bad.
Here is a proof of concept of an OS that runs directly on a GPU: https://os-projects.eu/dawn-os

It's a toy at this stage, but it proves it can be done.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,673
Does it run on all GPUs or is it CUDA-only (like most GPU software)? If the former, I'd love to try it with OpenCL.

There is a post where the author announced an OpenCL emulator, but the entire project website is down, so who knows.
 

Philip Turner

macrumors regular
Dec 7, 2021
170
111
Wow, that's incredibly cool stuff! Would be really funny if we indeed went back to mechanical stuff... still, transistors are likely to be more area-efficient, no?
Strangely, it's because sufficiently precise molecules can avoid any friction, just rotating around a single chemical bond. I imagine it's very fragile, but redundancy makes up for that. Hence, use it for neural networks, where one broken weight isn't the end of the world. Alternatively, it's the only choice for any sort of computer the size of a cubic micron.

It's probably less area (2D) efficient but more volume (3D) efficient. Current semiconductors don't stack well into the 3rd dimension, and when they do, there's no escaping heat dissipation.
 

Manzanito

macrumors 65816
Apr 9, 2010
1,189
1,954
It bugs me that some (actually, many) people condemn or martyr a computer depending on how well it plays games. I'm not telling you what you should do in your spare time, but frankly that's a ridiculous metric. Grow up.
For people interested in playing games, it’s not a ridiculous metric.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
It bugs me that some (actually, many) people condemn or martyr a computer depending on how well it plays games. I'm not telling you what you should do in your spare time, but frankly that's a ridiculous metric. Grow up.


PCs are usually GENERAL purpose machines. This means they can be used for many things, from saving lives to entertainment (consoles, by contrast, are specialized computers nowadays).

If I'm buying a PC for the purpose of gaming, of course I'm going to judge it based on that criterion.
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,174
3,825
Lancashire UK
For people interested in playing games, it’s not a ridiculous metric.
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,663
OBX
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.
It is just another form of entertainment, no different from folks that watch tv/movies/sports.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.


Humans are artistic beings. Just because you don't appreciate a form of art doesn't mean other people won't. You wouldn't tell someone who watches movies to grow up, even though watching a movie is just a more elaborate version of listening to a bedtime story.

Your comment is an insult not only to games as a medium for telling stories, but also to the many professionals working in an industry that made US$100 billion in 2015. There are concept artists, renowned composers, and even choreographers all working on AAA games.
 

Philip Turner

macrumors regular
Dec 7, 2021
170
111
Don't trash talk projects like Unreal Engine!

To be honest, if something can play Minecraft, that's the only game that matters ;)
 

Manzanito

macrumors 65816
Apr 9, 2010
1,189
1,954
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.
There’s a market out there for virtually everything.
 

seek3r

macrumors 68030
Aug 16, 2010
2,560
3,771
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.
"Games are for kids" is a pretty lame dismissal of not just a massive set of industries but also our very nature. Play is an essential component of mental stimulation, for us and other animals, and no one grows out of it. Just because your methods of play, whatever they are, differ from those of folks who like video games doesn't make video games any less valid. I love strategy games like Civilization, personally; the challenge is, in the end, a puzzle that keeps me engaged through a long game, both against the computer and in regular games with friends. It's also often a social activity: all through the pandemic closures I had a regular gaming group every week, where we'd play online while on a Discord call, which helped keep me engaged with my friends.

So if computer games don't appeal to you that's fine, but if you think they're for kids and you outgrow the need for play you're very very very wrong.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
It bugs me that some (actually, many) people condemn or martyr a computer depending on how well it plays games. I'm not telling you what you should do in your spare time, but frankly that's a ridiculous metric. Grow up.
A lot of people here depend on games for their source of income whether they be artists, programmers, managers, merchandisers, top management, business owners or shareholders.

I myself only play DOSBox games from 1991 and I do not mind other people playing up to date titles.

For my sib and the people I care about, I wish they'd reduce their gaming hours by over 80% and take care of their physical, mental, and spiritual health, as I do not want them to suffer and die prematurely from an NCD (non-communicable disease).

I've observed that entertainment, whether games, social networks/media, streaming video, etc., eats into time that would otherwise be used for

- sleeping earlier and longer
- making whole-food plant-based meals that are macro- and micro-nutritionally complete
- increased physical activity
- mindfulness
- quality in-person time with people in the household and others who actually mean something to you
- sun exposure to generate Vitamin D, but not to the point of tanning
 

AirpodsNow

macrumors regular
Aug 15, 2017
224
145
Windows on ARM has been rather stagnant. They aren't moving at the pace Apple did when moving from Intel to Apple silicon.

When Microsoft/Qualcomm/ARM figure out how to accelerate to Apple's pace, Intel/AMD will end up depending on legacy support as their selling point.

Android chips are on a more advanced node than AMD's.
I do wonder how they 'can' move to ARM. Even if Microsoft were able to port everything to ARM by tomorrow, they don't have the proper ARM silicon story/product roadmap to support it. It seems Intel might develop something, and Qualcomm seems to 'struggle' to compete with the Apple A-series chips. Currently Apple really is a few years ahead of the competition in being able to provide this platform. I think they have played their unique core strength well in this market: having the very profitable iPhone business support their other product ranges.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
I do wonder how they 'can' move to ARM. Even if Microsoft were able to port everything to ARM by tomorrow, they don't have the proper ARM silicon story/product roadmap to support it. It seems Intel might develop something, and Qualcomm seems to 'struggle' to compete with the Apple A-series chips. Currently Apple really is a few years ahead of the competition in being able to provide this platform. I think they have played their unique core strength well in this market: having the very profitable iPhone business support their other product ranges.
The Qualcomm/ARM Android chips at 5nm/7nm/10nm are superior to any Intel/AMD Windows chip on a performance-per-watt basis.

They just aren't given the marketing, supply chain, distribution, and software support that Apple gave the Mac.

PC laptops and desktops are essentially smartphones with a hardware keyboard/mouse/trackpad and larger batteries, or power from a wall socket. Screen resolutions of Windows and Android devices are roughly the same; what differs is pixel size and the physical dimensions of the screens. They're less thermally constrained but have more physical I/O ports to contend with.

And you are correct: the iPhone business paid for >90% of the R&D spend on Apple chips. Likewise, the Android business would effectively pay >90% of the R&D spend behind an ARM Windows chip.

Any Mac-specific tech for Apple silicon was added on top of the iPhone chip platform so it would do better in macOS.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,673
It's not something I've any hope of understanding because I grew out of playing computer games when my pubes sprouted. I just have to try to understand there's a huge market out there of adult-sized kids.

You must be fun at parties. Games are for kids, sports are for hooligans, movies are for losers, books are for nerds, art galleries are for decadents. A real adult knows only work and, occasionally, a good old traditional newspaper complaining about young people.
 