
jjcs

Cancelled
Oct 18, 2021
317
153
Ok. What was the non “low-end” GPU hardware of 2002?
One example was SGI's InfiniteReality series. Consumer-grade hardware (and I'd include the Nvidia Quadro series in that) caught up in the early 2000s, but no, a card you could fit in a PowerBook wasn't the "fastest in the world" at the time. Yes, these were expensive machines, but so is a modern Mac Pro.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
One example was SGI's InfiniteReality series. Consumer-grade hardware (and I'd include the Nvidia Quadro series in that) caught up in the early 2000s, but no, a card you could fit in a PowerBook wasn't the "fastest in the world" at the time. Yes, these were expensive machines, but so is a modern Mac Pro.

Oh, come on. When I talk about GPUs I mean, well, GPUs: hardware available to consumers, supporting the usual consumer systems and priced in the consumer range, not niche professional multi-board workstation systems that easily add up to half a million dollars once you take inflation into account.

In the consumer hardware space, GPUs were a novelty in the early 2000s and their power consumption was very low by today's standards. Making a laptop with a GPU almost as fast as a desktop's was not a big deal back then.
 

jjcs

Cancelled
Oct 18, 2021
317
153
Oh, come on. When I talk about GPUs I mean, well, GPUs: hardware available to consumers, supporting the usual consumer systems and priced in the consumer range, not niche professional multi-board workstation systems that easily add up to half a million dollars once you take inflation into account.
It wasn't just desksides. Workstation systems (desktops) had everything from MXI to VPro to Sun's Elite3D. Sure, I'm being pedantic, but the claim that "the most powerful GPUs could be in a laptop back then" is nonsense. Plus, a 50W TDP on top of the rest of the system is a lot to have in a PowerBook. The G4 got hot enough on its own.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
It wasn't just desksides. Workstation systems (desktops) had everything from MXI to VPro to Sun's Elite3D. Sure, I'm being pedantic, but the claim that "the most powerful GPUs could be in a laptop back then" is nonsense. Plus, a 50W TDP on top of the rest of the system is a lot to have in a PowerBook. The G4 got hot enough on its own.

Ok, then let’s limit the discussion to consumer (gaming) hardware. Not to mention that, if I remember correctly, mid-range professional workstation solutions were hardly faster than contemporary Nvidia GPUs. You keep referring to the “graphics workstations” of the late nineties and early two thousands, but those systems quickly became obsolete due to rapid improvements in consumer graphics. The “graphics workstation” is just not a thing anymore.

So no, I don’t see how your interjection invalidates my point. Twenty years ago, top consumer GPUs used less power than modern-day mobile GPUs, and modern-day desktop GPUs approach the power usage of a microwave oven. So talking about how Apple used to have industry-leading graphics in its laptops back in the day is misleading at best.
 

jjcs

Cancelled
Oct 18, 2021
317
153
Ok, then let’s limit the discussion to consumer (gaming) hardware. Not to mention that, if I remember correctly, mid-range professional workstation solutions were hardly faster than contemporary Nvidia GPUs. You keep referring to the “graphics workstations” of the late nineties and early two thousands, but those systems quickly became obsolete due to rapid improvements in consumer graphics. The “graphics workstation” is just not a thing anymore.

Gamers aren't buying systems with Quadro RTX cards, but whatever.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Gamers aren't buying systems with Quadro RTX cards, but whatever.

Quadro RTX cards are basically the same GPUs as the gaming ones and aren’t any faster. You pay more for drivers that support legacy features, as well as for better supporting circuitry. That’s about it. Of course, there are also GPUs geared towards deep learning, but that’s a different story.

P.S. Some of these things are probably going to change with Grace and similar products (if they ever end up in customers’ hands).
 

jjcs

Cancelled
Oct 18, 2021
317
153
Quadro RTX cards are basically the same GPUs as the gaming ones and aren’t any faster. You pay more for drivers that support legacy features, as well as for better supporting circuitry. That’s about it. Of course, there are also GPUs geared towards deep learning, but that’s a different story.
My point is that there is, in fact, such a thing as a "graphics workstation". Never mind.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
My point is that there is, in fact, such a thing as a "graphics workstation". Never mind.

Which is very, very different from the SGI boxes of the past. Let’s focus on the essence of things and not the labels. The nature of GPUs has changed a lot in the last 20 years, as has the technology surrounding them. Things like IRIS GL firmly belong in the past; even professional 3D software uses consumer (gaming) APIs nowadays, and it got better and faster in return.
 

jjcs

Cancelled
Oct 18, 2021
317
153
Which is very, very different from the SGI boxes of the past. Let’s focus on the essence of things and not the labels. The nature of GPUs has changed a lot in the last 20 years, as has the technology surrounding them. Things like IRIS GL firmly belong in the past; even professional 3D software uses consumer (gaming) APIs nowadays, and it got better and faster in return.

I don't see a lot of professional software using something like DirectX or other more gaming-oriented graphics APIs.

Most of it is still OpenGL. Whatever. Claim whatever you want to claim.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
I don't see a lot of professional software using something like DirectX or other more gaming-oriented graphics APIs.

Most of it is still OpenGL. Whatever. Claim whatever you want to claim.

Autodesk, for example, moved to DirectX a couple of years ago and achieved major performance and stability improvements. Using OpenGL post-2020 is like using a steam engine.
 

Homy

macrumors 68030
Jan 14, 2006
2,507
2,459
Sweden
The good news is that the Mac Studio Max 24c seems to perform closer in gaming to the MBP 16" Max 32c than to the MBP 14" 24c, according to Miani. It performs like a 29.5-core MBP 16". :)

[Attached screenshot: Skärmavbild 2022-03-20 kl. 01.46.41.png]
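
For anyone wondering where a fractional figure like "29.5 cores" comes from: presumably a linear interpolation between the two measured configurations. A minimal sketch in Python, assuming frame rate scales linearly with GPU core count between the 24-core and 32-core data points (the fps values are made-up placeholders, not Miani's measurements):

Code:
def effective_cores(fps_studio, fps_24c, fps_32c, cores_lo=24, cores_hi=32):
    # Interpolate a core-count equivalent from measured frame rates.
    fraction = (fps_studio - fps_24c) / (fps_32c - fps_24c)
    return cores_lo + fraction * (cores_hi - cores_lo)

# Hypothetical example: 24c MBP at 80 fps, 32c MBP at 96 fps, Studio at 91 fps
print(round(effective_cores(91.0, 80.0, 96.0), 1))  # -> 29.5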
 

-narcan-

macrumors regular
Sep 29, 2011
178
213
Interestingly, I’m finding very little difference compiling projects in Xcode between my M1 Ultra and my M1 Mac mini. My current project has the Ultra compiling in 90 seconds and the M1 mini in 97.
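
If anyone wants to run the same comparison, timing a clean build is enough. A minimal sketch, assuming a hypothetical scheme named "MyApp" (substitute your own project and scheme):

Code:
import subprocess, time

# Time a clean xcodebuild run; "MyApp" is a hypothetical scheme name.
start = time.monotonic()
subprocess.run(["xcodebuild", "-scheme", "MyApp", "clean", "build"], check=True)
print(f"clean build: {time.monotonic() - start:.1f} s")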
 

jeanlain

macrumors 68020
Mar 14, 2009
2,461
954
Don't know why, since he's not testing the M1 8c, but it's still interesting. I hope it will be the same at higher resolutions and in other games. It would be like a 29.5-core MBP 16".
The Verge tested the game at 1440p and the M1 Ultra gave good results (a bit below an RTX 3090), despite the game being released before Macs had Apple GPUs. They even tested at 4K and the M1 Ultra did very well. Unfortunately, they didn't test the 3090 at that resolution.

Luke Miani also tested Basemark GPU (a tool that no one uses, but which has an Apple silicon version). The 48-core Ultra scores a bit higher than a typical RTX 3070 (https://powerboard.basemark.com/top...ult-os/default-environment/median/20/1?s=3070). I say "typical" because, as always, results in these databases can be all over the place, with overclocked cards and uncontrolled conditions.
I expect the big M1 Ultra to be about as fast as an RTX 3080 in this test.
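
To illustrate the "typical" point (with made-up numbers, not Basemark's actual data): a few overclocked outliers pull the mean up while the median stays near a stock card, which is why the median view in that link is the number worth quoting.

Code:
from statistics import mean, median

# Made-up scores: five stock cards plus two overclocked outliers.
scores = [9800, 9900, 10000, 10100, 10200, 13500, 14000]
print(f"mean:   {mean(scores):.0f}")    # ~11071, pulled up by the outliers
print(f"median: {median(scores):.0f}")  # 10100, stays near a stock card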
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
The Verge tested the game at 1440p and the M1 Ultra gave good results (a bit below an RTX 3090), despite the game being released before Macs had Apple GPUs. They even tested at 4K and the M1 Ultra did very well. Unfortunately, they didn't test the 3090 at that resolution.

Luke Miani also tested Basemark GPU (a tool that no one uses, but which has an Apple silicon version). The 48-core Ultra scores a bit higher than a typical RTX 3070 (https://powerboard.basemark.com/top...ult-os/default-environment/median/20/1?s=3070). I say "typical" because, as always, results in these databases can be all over the place, with overclocked cards and uncontrolled conditions.
I expect the big M1 Ultra to be about as fast as an RTX 3080 in this test.
I thought the results The Verge got were kind of low for a 3090. Maybe they were hitting CPU limits?

Guru3D was seeing 95 fps at 4K with a Founders Edition (the slowest) card…
 