> Have you guys seen the Hot Chips presentation by MS for the XSX?

That's relevant to this conversation how?
> That's relevant to this conversation how?

Earlier in either this thread or another, there was talk about Apple being able to match the TFLOP numbers of the new stuff coming out just by scaling how many GPU cores are present. Never mind actual gaming performance (Nvidia tends to have a lower TFLOP count but better rendering performance than AMD).
> Earlier in either this thread or another, there was talk about Apple being able to match the TFLOP numbers of the new stuff coming out just by scaling how many GPU cores are present. Never mind actual gaming performance (Nvidia tends to have a lower TFLOP count but better rendering performance than AMD).

I guess we'll see if Apple is willing to make/pay for that much silicon for the GPUs in their products, considering that chip is basically 3x the size of their current largest Apple Silicon chip, the A12X...
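Since the thread keeps coming back to TFLOP scaling, the arithmetic behind those headline numbers is worth sketching. Peak FP32 throughput is just ALU count × 2 FLOPs per FMA × clock; the XSX figures below are the publicly quoted ones (52 CUs × 64 ALUs at 1.825 GHz), and the doubling case is purely illustrative:

```python
def peak_fp32_tflops(alus: int, clock_ghz: float) -> float:
    """Theoretical peak: each ALU retires one FMA (2 FLOPs) per cycle."""
    return alus * 2 * clock_ghz / 1000.0  # GFLOPS -> TFLOPS

# Xbox Series X: 52 CUs x 64 ALUs at 1.825 GHz (Microsoft's public figures)
xsx = peak_fp32_tflops(52 * 64, 1.825)
print(f"{xsx:.2f} TFLOPS")  # 12.15 TFLOPS

# The "just add cores" argument: doubling ALUs doubles the theoretical peak...
print(f"{peak_fp32_tflops(2 * 52 * 64, 1.825):.2f} TFLOPS")  # 24.29 TFLOPS
# ...which is exactly why peak TFLOPS says little about delivered frame rates.
```

This is also why the Nvidia-vs-AMD caveat above matters: the formula counts theoretical FLOPs, not how efficiently an architecture turns them into rendered frames.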
I know this forum isn't as graphics geeky as Beyond3D is, but it seemed like some here would appreciate the die shot.
For Apple executives, 30 FPS is enough. The end of dedicated GPUs = the end of any hopes for AAA gaming. Sorry, you know this is true. But no one can forbid you from living in your fantasy world.
Just to give you an idea: quite a few CAD developers are actually considering transitioning from OpenGL to either D3D12 or Vulkan because they are interested in using mesh shaders, while Metal is not on their radar since it doesn't have a competing feature ...
Metal is not really suitable for machine learning or data science either. If you look at RadeonML and the models supported by MPS (Metal Performance Shaders), it only supports models that do either denoising or upscaling. Meanwhile, with DirectML you can run inference on all the models in RadeonML, and with some form of TensorFlow you can even train models using DirectML ...
What do you guys think about this quote/post in B3D Forum?
Apple (PowerVR) TBDR GPU-architecture speculation thread
They're unlikely to license any microarchitecture the recent press release talked about, due to its lack of "multi-use" language; it's probably again a wide-range architecture and IP license as an extension of what they've already had for half a decade. Also note people saying Apple broke...
forum.beyond3d.com
CAD developers who care about performance transitioned to DX a long time ago. Metal is unlikely to support mesh shaders any time soon, as they don't do much for a TBDR architecture (although Apple could hack around it, I suppose), but Metal is perfectly capable of generating geometry on the GPU.
Regarding ML or HPC, of course it's capable of doing it. There is just no tooling. Apple's ML accelerators are state of the art, but they focus on the common app developer's needs, so the API is limited. There is no doubt that frameworks like CUDA offer a vastly superior infrastructure, simply because there are so many more people using it.
I believe that the most important adoption factor is going to be performance. If Apple manages to pack a mid-range GPU equivalent in their popular laptops across the board, third party support will follow.
As a whole, B3D seems to be down on TBDR for desktop. I wish someone there could give better insights as to why.

I think that professional users and tech enthusiasts do a great job of convincing regular users to use computers.
If they are left out of the process, the idea that Apple devices are a toy will only be strengthened.
Apparently, even with the new GPUs, Apple devices are not as impressive as Apple paints them to be, so...
Mid range, like RTX 3070 midrange, or midrange like GTX 1070?
> Earlier in either this thread or another, there was talk about Apple being able to match the TFLOP numbers of the new stuff coming out just by scaling how many GPU cores are present. Never mind actual gaming performance (Nvidia tends to have a lower TFLOP count but better rendering performance than AMD). I know this forum isn't as graphics geeky as Beyond3D is, but it seemed like some here would appreciate the die shot.

Thanks for the reply. I appreciate the die shot too; I'm into all that stuff also. I just wanted to know how it related to the topic at hand, which you answered quite nicely.
> What do you guys think about this quote/post in B3D Forum?

It seems to be a decent look at why Macs have fallen behind in 3D applications. Even if a Mac had a capable GPU, if the frameworks that this software needs aren't there, then the point is moot.
> Well, let us hope that by the rumored October event Apple decides to reveal a list of software developers they are working with, to ensure there is a vibrant selection of software to select from, and that it includes game companies, both those that have games on the Mac now and those that do not. I don't need a faster Chromebook, and that is what this will be to me if I have to have a separate machine to play games when I want to relax.

Feral is probably still going to do ports for as long as Rosetta 2 is a thing.
> It seems to be a decent look at why Macs have fallen behind in 3D applications. Even if a Mac had a capable GPU, if the frameworks that this software needs aren't there, then the point is moot.
Maybe this is why Apple seems to have a renewed focus on expanding Metal. Who knows whether anyone will adopt it, but it's at least a strategy: expand your own API with frameworks.
> What do you guys think about this quote/post in B3D Forum?

I don't think the comments from this poster are unbiased. He appears to dismiss TBDR and fails to note its advantage in perf/W or perf/TFLOP. He says that the direct renderers already work optimally, but they don't; that's why TBDR exists. Apple seems to think that TBDR is simply better and is the future, and I suppose they know better than this guy.
He implies that it would be hard for Apple to implement ray tracing, but I can easily see Apple adding ray-tracing hardware.
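To make the perf/W and perf/TFLOP point above concrete, here is a toy overdraw model (all numbers are illustrative assumptions, not measurements of any real GPU): an immediate-mode renderer without perfect early-z can end up shading a pixel once per opaque layer covering it, while a tile-based deferred renderer resolves visibility in on-chip tile memory first and shades each visible pixel once.

```python
# Toy model of fragment-shading work under overdraw; illustrative only.

def immediate_mode_shades(depth_complexity: int, pixels: int) -> int:
    """Worst case: every covered pixel is shaded once per opaque layer."""
    return depth_complexity * pixels

def tbdr_shades(depth_complexity: int, pixels: int) -> int:
    """Hidden-surface removal before shading: each pixel is shaded once."""
    return pixels

pixels = 1920 * 1080
layers = 4  # assumed average opaque overdraw for this sketch
imr = immediate_mode_shades(layers, pixels)
tbdr = tbdr_shades(layers, pixels)
print(imr // tbdr)  # prints 4: same frame, 4x the shading work
```

Early-z and other tricks narrow this gap on immediate-mode hardware in practice; the sketch only shows why comparing the two architectures on raw TFLOPS alone is not apples-to-apples.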
If I have to buy a separate machine to game, I will just buy a different solution entirely; frankly, I think anyone who suggests that and still buys a Mac is more caught up in image than reality. A computer with these supposed capabilities should be something everyone wants to write software for, and that people want to buy for said software, because its price and performance are on par with or better than the competition.
> If I have to buy a separate machine to game ... I will just buy a different solution entirely.

I don't necessarily disagree with that, but I think everyone here buys Macs for a myriad of other purposes, and gaming is a bonus. It all depends on how important gaming is to you: if you're a big gamer and can only afford one machine, then a PC is for you. That's kinda the way it's always been, for better or worse.