
Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
Because no one buys Nintendo hardware for the fancy graphics (which has been true since the Nintendo 64 days) and at this point Nintendo would rather not lose money even trying (they have admitted as much before).

I can’t say that I agree. Nintendo have looked for other ways to make their hardware products interesting, but they have also paid attention to performance; why else go for an Nvidia Tegra rather than something more standard such as a Qualcomm SoC? All these aspects of a hardware device are important, and contribute to making an impact in the market. Something like this might be an opportunity.

But I do wonder if the ray tracing has enough oomph to fill a smartphone screen. The iPhone 13 is 2532 x 1170 pixels, which is a pretty sizeable screen area. The latest iPad Air is 2360 x 1640 pixels. It might be necessary to use an upsampling solution.
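Rough back-of-the-envelope numbers (the pixel counts are the ones above; one ray per pixel and 60 fps are just illustrative assumptions, not measurements):

Code:
// Rough ray-budget arithmetic: primary rays per second needed at native
// resolution vs. rendering at half resolution per axis and upscaling.
// 1 ray/pixel and 60 fps are illustrative assumptions only.
let iPhone13Pixels = 2532 * 1170      // ~2.96 M pixels
let iPadAirPixels  = 2360 * 1640      // ~3.87 M pixels
let raysPerPixel   = 1
let fps            = 60

for (name, pixels) in [("iPhone 13", iPhone13Pixels), ("iPad Air", iPadAirPixels)] {
    let native = pixels * raysPerPixel * fps
    let half   = (pixels / 4) * raysPerPixel * fps   // half resolution per axis = 1/4 the pixels
    print("\(name): \(native / 1_000_000) Mrays/s native, \(half / 1_000_000) Mrays/s at half res + upscale")
}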
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
I can’t say that I agree. Nintendo have looked for other ways to make their hardware products interesting, but they have also paid attention to performance; why else go for an Nvidia Tegra rather than something more standard such as a Qualcomm SoC? All these aspects of a hardware device are important, and contribute to making an impact in the market. Something like this might be an opportunity.
Qualcomm's GPU performance (compared to Nvidia's in 2017) sucks.
But I do wonder if the ray tracing has enough oomph to fill a smartphone screen. The iPhone 13 is 2532 x 1170 pixels, which is a pretty sizeable screen area. The latest iPad Air is 2360 x 1640 pixels. It might be necessary to use an upsampling solution.
You can always render the rays at lower resolution, but TBDR should make resolution not as important, right?
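Something like this is what I have in mind — just a sketch of the upscale step in Swift with MetalPerformanceShaders, assuming the ray pass already wrote into a half-resolution texture (sizes and texture names are placeholders; the RT pass itself is left out):

Code:
import Metal
import MetalPerformanceShaders

// Sketch: bilinearly upscale a half-resolution ray-traced image to display size.
// The ray tracing / shading passes that fill `halfRes` are elided here.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else { fatalError("No Metal device") }

func makeTexture(width: Int, height: Int) -> MTLTexture {
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Float,
                                                        width: width, height: height,
                                                        mipmapped: false)
    desc.usage = [.shaderRead, .shaderWrite]
    return device.makeTexture(descriptor: desc)!
}

let halfRes = makeTexture(width: 1266, height: 585)    // rays traced at half of 2532x1170
let fullRes = makeTexture(width: 2532, height: 1170)   // display-sized output

let upscale = MPSImageBilinearScale(device: device)
let commandBuffer = queue.makeCommandBuffer()!
// ... encode the ray tracing / shading passes writing into halfRes here ...
upscale.encode(commandBuffer: commandBuffer, sourceTexture: halfRes, destinationTexture: fullRes)
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

(In a real renderer you'd probably want something smarter than bilinear, e.g. a temporal upscaler, but it shows the shape of it.)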
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
That would have been low hanging fruit for Blizzard to implement with their native port of World of Warcraft.

You think so? WoW RT is targeting GPUs with hardware RT support as far as I understand, it's a very different performance expectation. And they would need to port the DX12 RT path to Metal or implement a Metal RT path from scratch. Not really a low hanging fruit IMO if you already know that the result is going to suck.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
You think so? WoW RT is targeting GPUs with hardware RT support as far as I understand, it's a very different performance expectation. And they would need to port the DX12 RT path to Metal or implement a Metal RT path from scratch. Not really a low hanging fruit IMO if you already know that the result is going to suck.
But they already ported everything else to Metal, so using the Metal RT API shouldn't have been a bridge too far IMO. Especially since it is just shadows (last time I looked).
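For reference, the host-side setup for a Metal RT path is roughly this — a sketch only (not Blizzard's actual code), assuming a single triangle mesh already sitting in a vertex buffer; the shadow-ray intersection shader itself is omitted:

Code:
import Metal

// Sketch: build a primitive acceleration structure over an existing triangle
// mesh, which a compute or intersection shader could then query for shadow rays.
// `vertexBuffer` and `triangleCount` are assumed to exist already.
func buildAccelerationStructure(device: MTLDevice, queue: MTLCommandQueue,
                                vertexBuffer: MTLBuffer, triangleCount: Int) -> MTLAccelerationStructure {
    // Describe the triangle geometry.
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // Allocate the acceleration structure and a scratch buffer of the sizes Metal reports.
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize)!
    let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize, options: .storageModePrivate)!

    // Encode the build on the GPU and wait for it.
    let commandBuffer = queue.makeCommandBuffer()!
    let encoder = commandBuffer.makeAccelerationStructureCommandEncoder()!
    encoder.build(accelerationStructure: accel, descriptor: descriptor,
                  scratchBuffer: scratch, scratchBufferOffset: 0)
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    return accel
}

Once the acceleration structure exists, a shader can fire one shadow ray per pixel against it, which is more or less what a shadows-only option amounts to.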
 

throAU

macrumors G3
Feb 13, 2012
9,198
7,348
Perth, Western Australia
Interesting, so per watt it seems like they would be processing more rays than an equivalent Nvidia solution.

Because no one buys Nintendo hardware for the fancy graphics (which has been true since the Nintendo 64 days) and at this point Nintendo would rather not lose money even trying (they have admitted as much before).

Nintendo did have the most powerful console around at the time with the Gamecube.

Look how well that did them.
 
  • Like
Reactions: JMacHack

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
Nintendo did have the most powerful console around at the time with the Gamecube.

Look how well that did them.
 
  • Like
Reactions: JMacHack and throAU

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Not really surprising. The performance is simply not there for RT to be practically viable. I mean, you could probably sprinkle some soft shadows here and there and it will work well enough on M1-class hardware, but why would you bother?
That would have been low hanging fruit for Blizzard to implement with their native port of World of Warcraft.

Yeah, that paper is interesting; I wonder what changed for Ampere to be so much faster than Turing at RT, then.
You think so? WoW RT is targeting GPUs with hardware RT support as far as I understand, it's a very different performance expectation. And they would need to port the DX12 RT path to Metal or implement a Metal RT path from scratch. Not really a low hanging fruit IMO if you already know that the result is going to suck.
But they already ported everything else to Metal, so using the Metal RT API shouldn't have been a bridge too far IMO. Especially since it is just shadows (last time I looked).

The entirety of RT in WoW (on Windows) IS soft shadows...?
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
What would be the power draw implications of raytracing? Do we have any idea of how this affects the Nvidia cards?
On AMD/Nvidia cards there doesn't seem to be a difference in power draw (assuming you were already GPU limited).
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
What would be the power draw implications of raytracing? Do we have any idea of how this affects the Nvidia cards?
Dedicated hardware almost always reduces the power required to perform a task, but then obviously you use that headroom to do more work, so essentially the power draw would stay the same while you'd get a lot more done.
 
  • Like
Reactions: JMacHack

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Because no one buys Nintendo hardware for the fancy graphics (which has been true since the Nintendo 64 days) and at this point Nintendo would rather not lose money even trying (they have admitted as much before).
Incorrect, the Nintendo 64 had one of the most powerful graphics chips in the world at the time and was developed in partnership with SGI.

The Gamecube was the most powerful console of its gen.

What hurt Nintendo was that their president at the time pissed off Sony and the other third-party developers who made the PlayStation successful in those gens.

Nintendo found their modern niche with the Wii, having a less powerful console but having strong first party IP with wide appeal. But before that they were definitely playing the power game.
 
  • Like
Reactions: T'hain Esh Kelch

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
What would be the power draw implications of raytracing? Do we have any idea of how this affects the Nvidia cards?
The power draw typically comes from clock speeds rather than any dedicated hardware. Nvidia cards still use raster circuitry similar to Maxwell's, iirc, but since Turing they’ve added more AI cores and RT cores (which I believe they said supplement each other: the AI fills in any gaps so they don’t have to cast as many rays).

The real power problem comes from a similar issue to Intel, in that they’re unwilling to give up the performance crown, efficiency be damned.

Vega was a ******** for AMD, so I think they underestimated RDNA, which was a huge improvement over GCN. So they pushed their cards more. (Ironically, GCN as a whole got its problems with heat and power draw from being pushed far past its efficiency point.)

Apple Silicon kinda “cheats” with its architecture. It doesn’t have to copy any data from RAM to VRAM, and uses an aggressive tile-based renderer (iirc, anyone more knowledgeable feel free to correct me) to run with lower clocks. Likewise it has robust media encoders, which give it an edge for applicable tasks.

In short, I’d expect any RT cores that Apple adds to contribute a negligible amount of power consumption.
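To illustrate the no-copy point, a minimal sketch (nothing Apple-specific beyond stock Metal; the vertex data is just a placeholder triangle):

Code:
import Metal

// Sketch: a .storageModeShared buffer lives in the single unified memory pool,
// so the CPU fills it and the GPU reads it with no staging copy to VRAM.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

let vertices: [Float] = [0, 0, 0,  1, 0, 0,  0, 1, 0]
let buffer = device.makeBuffer(bytes: vertices,
                               length: vertices.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can keep touching the same allocation the GPU will read:
buffer.contents().assumingMemoryBound(to: Float.self)[2] = 0.5
// ... bind `buffer` to a render/compute encoder; no blit to device memory needed ...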
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
Incorrect, the Nintendo 64 had one of the most powerful graphics chips in the world at the time and was developed in partnership with SGI.

The Gamecube was the most powerful console of its gen.

What hurt Nintendo was that their president at the time pissed off Sony and the other third-party developers who made the PlayStation successful in those gens.

Nintendo found their modern niche with the Wii, having a less powerful console but having strong first party IP with wide appeal. But before that they were definitely playing the power game.
IIRC the Xbox was more powerful than the GameCube. The N64 was powerful (I didn't say it wasn't); I am just saying that Nintendo basically let Sony/Microsoft take over the "haz moar powa" fight while they focused on "new experiences" and fun-to-play 1st party titles (I guess really starting from the GameCube). That, and the resistance to CD-based games at the time (remember when consoles had no load times), was a huge issue.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
IIRC the Xbox was more powerful than the GameCube. The N64 was powerful (I didn't say it wasn't); I am just saying that Nintendo basically let Sony/Microsoft take over the "haz moar powa" fight while they focused on "new experiences" and fun-to-play 1st party titles (I guess really starting from the GameCube).
Fair enough, I probably misunderstood.

Truth be told, I’m a fan of how Nintendo does things. They focus on the end experience rather than pure specs.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
Fair enough, I probably misunderstood.

Truth be told, I’m a fan of how Nintendo does things. They focus on the end experience rather than pure specs.
Yeah, that was their big pitch for why staying with cartridges was better than going to discs. It didn't pay off until the Switch (for "home consoles"); with portables, flash cartridges were always better, as Sony figured out (I still loved my PSP and I have a Vita).
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
The power draw typically comes from clock speeds rather than any dedicated hardware. Nvidia cards still use raster circuitry similar to Maxwell's, iirc, but since Turing they’ve added more AI cores and RT cores (which I believe they said supplement each other: the AI fills in any gaps so they don’t have to cast as many rays).

The real power problem comes from a similar issue to Intel, in that they’re unwilling to give up the performance crown, efficiency be damned.

Vega was a ******** for AMD, so I think they underestimated RDNA, which was a huge improvement over GCN. So they pushed their cards more. (Ironically, GCN as a whole got its problems with heat and power draw from being pushed far past its efficiency point.)

Apple Silicon kinda “cheats” with its architecture. It doesn’t have to copy any data from RAM to VRAM, and uses an aggressive tile-based renderer (iirc, anyone more knowledgeable feel free to correct me) to run with lower clocks. Likewise it has robust media encoders, which give it an edge for applicable tasks.

In short, I’d expect any RT cores that Apple adds to contribute a negligible amount of power consumption.
I think they do work on efficiency (at least from Vega -> RDNA). Admittedly RDNA2 seems to get most of its boost from increased clocks over RDNA.

Truth be told, for the kinds of workloads Apple users run, GCN is better than RDNA. It is a shame AMD decided not to bring CDNA (the GCN successor for GPGPU) "down market" and chose to keep it in the data center.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Yeah, that was their big pitch for why staying with cartridges was better than going to discs. It didn't pay off until the Switch (for "home consoles"); with portables, flash cartridges were always better, as Sony figured out (I still loved my PSP and I have a Vita).
From what I’ve read about the N64 (it’s got a lot of oddities in its architecture), it would never have worked with discs. Some games (Ocarina of Time, for example) used cart memory as swap to compensate for the low amount of RAM. Pretty smart in my opinion.

And of course there’s the fact that it helped curtail piracy and such, but that was only to the benefit of Nintendo.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I think they do work on efficiency (at least from Vega -> RDNA). Admittedly RDNA2 seems to get most of its boost from increased clocks over RDNA.
From a deep dive I read (I find the failure of Vega fascinating), the reason RDNA is more efficient at faster clocks is that it uses a narrower and longer instruction bus, like VLIW.

I recall that GCN was ridiculously wide, so wide that the hardware scheduler had trouble keeping up in some tasks. Most notably, raster was slow on GCN even though it had insane compute power.

For example, GCN was super happy as the RX400 series. Nicely efficient. But when clocked higher for the 500 and eventually Vega series it just went off the rails for power draw.

On stage, before introducing Vega, Raja Koduri said that “everyone said discrete was going away and that integrated was the future”. Reading too much into it, I think he meant that GCN was always designed to be very wide so it would work nicely as an integrated GPU.
Truth be told, for the kinds of workloads Apple users run, GCN is better than RDNA. It is a shame AMD decided not to bring CDNA (the GCN successor for GPGPU) "down market" and chose to keep it in the data center.
True, but considering most people use GPUs for graphics rather than compute, it’s a sound decision.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,661
OBX
From what I’ve read about the N64 (it’s got a lot of oddities in its architecture), it would never have worked with discs. Some games (Ocarina of Time, for example) used cart memory as swap to compensate for the low amount of RAM. Pretty smart in my opinion.

And of course there’s the fact that it helped curtail piracy and such, but that was only to the benefit of Nintendo.
It never would have worked with a disc the same way Ratchet and Clank Rift Apart would never work on the PS4? ;)
 

satcomer

Suspended
Feb 19, 2008
9,115
1,977
The Finger Lakes Region
The Xbox was in the planning stages at Macworld 2001 at the NYC Javits Center, where Bungie was showing a Mac game in a booth right across from the Microsoft booth! I was there with my VMUG that Friday and noticed that ALL the techs at Microsoft were staring at that Bungie Mac game called Halo running on new Mac OS X Macs! Not long after, Microsoft aggressively bought out Bungie to get that game for the Xbox they would release just before the Christmas season!
 
  • Like
Reactions: T'hain Esh Kelch

throAU

macrumors G3
Feb 13, 2012
9,198
7,348
Perth, Western Australia
From a deep dive I read (I find the failure of Vega fascinating), the reason RDNA is more efficient at faster clocks is that it uses a narrower and longer instruction bus, like VLIW.
Vega isn't a bad architecture at all (I have two of them), however it does depend on what you're using it for.

Crypto or general parallel GPGPU processing? They're still efficient cards today, and mine have paid for themselves and for the 6900XT I use as my primary card today (I mined with them a bit while I wasn't using them for desktop use).

AMD just did what AMD has done a lot: they clocked them too high to try and compete with the 1080+ at 3D rendering without enough ROPs to do so, and they were also kinda hindered by the 12nm GloFo process, which wasn't great. If you run a Vega 64 (for example) in balanced mode you get like 95% of the performance at much less power. AMD just pushed them too hard with a crappy blower cooler, and they weren't as optimised for 3D graphics, didn't have the developer support, etc.

HBM was also expensive, etc.
 
  • Like
Reactions: JMacHack