> Ok. What was the non “low-end” GPU hardware of 2002?

The 3DLabs cards I suppose...
> No, not these ones. And that's the problem. I'm talking about PC/console game developers.

You probably could make a PC game on the Mac. Not sure about making a console game on the Mac, though.
> Ok. What was the non “low-end” GPU hardware of 2002?

One example was the InfiniteReality series. Consumer-grade (to include the Nvidia Quadro series in that) caught up in the early 2000s, but no, a card you could fit in a PowerBook wasn't the "fastest in the world" at the time. Yes, these were expensive machines, but so is a modern Mac Pro.
> Oh, come on. When I am talking about GPUs I mean, well, GPUs: stuff available to consumers, supporting the usual consumer systems and priced in the consumer range. Not niche professional workstation multi-board systems that easily add up to half a million dollars once you take inflation into account.

It wasn't just desksides. Workstation systems (desktops) had everything from MXI to VPro to Sun's Elite3D. Sure, I'm being pedantic, but the "most powerful GPUs could be in a laptop back then" claim is nonsense. Plus, a 50 W TDP on top of the rest of the system is a lot to have in a PowerBook. The G4 got hot enough on its own.
Ok, then let’s limit the discussion to consumer (gaming) hardware. Not to mention that, if I remember correctly, mid-range professional workstation solutions were hardly faster than contemporary Nvidia GPUs. You keep referring to the “graphics workstations” of the late nineties and early two thousands, but those systems quickly became obsolete due to rapid improvements in consumer graphics. A “graphics workstation” is just not a thing anymore.
Gamers aren't buying systems with Quadro RTX cards, but whatever.
> Quadro RTX are basically the same GPUs as the gaming ones and aren’t any faster. You pay more for drivers that support legacy stuff as well as better supporting circuitry. That’s about it. Of course, there are also GPUs geared towards deep learning, but that’s a different story.

My point with that is that there is, in fact, such a thing as a "graphics workstation". Never mind.
Which is very, very different from the SGI boxes of the past. Let’s focus on the essence of things and not the labels. The nature of GPUs has changed a lot in the last 20 years, as has the technology surrounding them. Things like IRIS GL firmly belong in the past; even professional 3D software uses consumer (gaming) APIs nowadays, and it got better and faster in return.
I don't see a lot of professional software using something like DirectX or other more gaming-oriented graphics APIs.
Most of it is still OpenGL. Whatever. Claim whatever you want to claim.
> Good news: the Mac Studio Max 24c seems to perform closer to the MBP 16" Max 32c in gaming than to the MBP 14" 24c, according to Miani.

Testing a game at 1080p medium?
> You probably could make a PC game on the Mac. Not sure about making a console game on the Mac, though.

Of course you could; for example, Ultimate Chicken Horse was created in Unity for PlayStation & Xbox.
> Testing a game at 1080p medium?

Don't know why, since he's not testing the M1 8c, but it's still interesting. I hope it will be the same at higher resolutions and in other games. It would be like a 29.5-core MBP 16".
> Interestingly, I’m finding very little difference compiling projects in Xcode between my M1 Ultra and my M1 Max Mini. The current project has the Ultra compiling at 90 seconds, the M1 Mini at 97.

M1 Max Mini?
> M1 Ultra MacBook Pro!

There is no Ultra in laptops, just like there is no Max Mini.
> Don't know why since he's not testing M1 8c but still interesting. I hope it will be the same at higher res and in other games. It would be like a 29.5-core MBP 16".

The Verge tested the game at 1440p, and the M1 Ultra gave good results (a bit less than an RTX 3090) despite the game being released before Macs had Apple GPUs. They even tested at 4K, and the M1 Ultra did very well. Unfortunately, they didn't test the 3090 at that resolution.
> The Verge tested the game at 1440p and the M1 Ultra gave good results (a bit less than an RTX 3090) despite the game being released before Macs had Apple GPUs. They even tested at 4K and the M1 Ultra did very well. Unfortunately, they didn't test the 3090 at that resolution.

I thought the results that The Verge got were kind of low for a 3090. Maybe they were hitting CPU limits?
Luke Miani also tested Basemark GPU (a tool that no one uses, but which has an Apple silicon version). The 48-core Ultra scores a bit higher than a typical RTX 3070 (https://powerboard.basemark.com/top...ult-os/default-environment/median/20/1?s=3070). I say "typical" because, as always, results in these databases can be all over the place, with overclocked cards and uncontrolled conditions.
I expect the big M1 Ultra to be about as fast as an RTX 3080 in this test.