There is no way that Apple would allow Nvidia to have CUDA on the Mac. Apple is investing heavily in its own GPGPU ecosystem, and CUDA is poison to that.
Why do you say this? I've always assumed that Apple were pissed at Nvidia because of the endless GPU failures that cost them a fortune in repairs to generations of MacBooks.
CUDA is mostly for heavy-duty number crunching, isn't it?
As far as I know, Apple doesn't have an equivalent to bind together GPUs into computing clusters.
They want you to use Metal and CoreML, which are optimized for their hardware. But given how many readily available tools utilize CUDA, if Apple gave Nvidia access to the Mac, nobody would bother using Apple's frameworks (and hence Apple's hardware).
Nvidia's RTX 2060 tensor cores offer over 50 TFLOPS (not sure if this is 32bit or 16bit).
You can use Metal for this. Not for the distributed case though, that is a logic layer you will have to implement yourself. I am not too familiar with the capabilities of modern CUDA in this regard, it's been a while since I last had a look.
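For what it's worth, basic GPGPU number crunching is quite approachable in Metal. Here's a minimal sketch in Swift of dispatching a compute kernel over a float buffer; the kernel name square_numbers and the buffer layout are made up for illustration (you'd ship the kernel in a .metal file compiled into the app's default library):

```swift
import Metal

// Assumed kernel, in a .metal file compiled into the app's default library:
//
// kernel void square_numbers(device const float *inData  [[buffer(0)]],
//                            device float       *outData [[buffer(1)]],
//                            uint id [[thread_position_in_grid]]) {
//     outData[id] = inData[id] * inData[id];
// }

let input: [Float] = (0..<1024).map(Float.init)

guard let device   = MTLCreateSystemDefaultDevice(),
      let queue    = device.makeCommandQueue(),
      let library  = device.makeDefaultLibrary(),
      let function = library.makeFunction(name: "square_numbers"),
      let pipeline = try? device.makeComputePipelineState(function: function)
else { fatalError("Metal setup failed") }

// Buffers visible to both CPU and GPU
let byteCount = input.count * MemoryLayout<Float>.stride
let inBuffer  = device.makeBuffer(bytes: input, length: byteCount, options: [])!
let outBuffer = device.makeBuffer(length: byteCount, options: [])!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(inBuffer,  offset: 0, index: 0)
encoder.setBuffer(outBuffer, offset: 0, index: 1)

// One thread per element, grouped into threadgroups of 256
let groupSize  = MTLSize(width: 256, height: 1, depth: 1)
let groupCount = MTLSize(width: (input.count + 255) / 256, height: 1, depth: 1)
encoder.dispatchThreadgroups(groupCount, threadsPerThreadgroup: groupSize)

encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

let results = outBuffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(results[3])   // 9.0
```

The multi-GPU / cluster part is exactly what Metal doesn't give you out of the box, which is the gap compared to CUDA's ecosystem that people keep pointing out.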
Agree that Navi will arrive and maybe support the Intel Macs.. but can't imagine Apple suddenly releasing a dGPU as great as the ones from Nvidia and AMD.. and definitely not an iGPU that competes with dGPUs.. would probably take them a few more years.. of course, that being said, it would be nice if they could come out with their own dGPU so soon
Anyhow, if the iGPU is 2x more powerful than the one on crappy Intel, it will be most welcome
The “crappy” Intel iGPU is faster than anything Apple currently has available.
And next year, Intel is launching Tiger Lake which has an iGPU that is supposed to be 3 times faster than the Ice Lake iGPU (that currently already beats everything Apple has).
Might want to check your information. The iPad Pro GPU (a 2-year-old chip at this point) is about twice as fast as the current Iris Plus. Leaked benchmarks of Tiger Lake G7 would place it around 20-30% faster than the A12Z GPU. I very much doubt it will outperform the final Mac chip running at higher power.
Curious if anyone has information on the TDP for the latest Intel Iris and the A12Z..? Just want to imagine how much further the Apple GPU can push before it takes a big hit on battery life.. like the 5600M in the MBP16 running at a 50W TDP..
That is not true. The iGPU in the 2020 13" MBP scores over 10,000 in the Metal benchmark, which is higher than the iPad Pro.
But it would also mean Apple would have the fastest integrated GPU on the market.
I'm sorry, but this doesn't seem to agree with the data I have seen. Looking at Geekbench 5 results for OpenCL and Metal, the A12Z exceeds any Intel iGPU (Iris Plus seems to be the best one) by a healthy margin.
The Anandtech reviews of the 2018 iPad Pro also show the A12X doing very well against laptops, even beating those with an MX150 dGPU, and not that far behind those with a GTX 1060: https://www.anandtech.com/show/13661/the-2018-apple-ipad-pro-11-inch-review/6
Which benchmarks led you to conclude otherwise? Please provide some evidence for your claim - I'm open to learning something new!
Tiger Lake iGPUs will no doubt be better, but then so will whatever Apple produces in their new SoCs.
The argument will be academic soon in any case, because what is certain is that after 2 years no Macs will be using Intel iGPUs. I'm confident that Apple will make Apple Silicon competitive with Intel CPU / iGPU performance to ensure its market position.
The more interesting question is how long Apple will take to match current discrete mobile and desktop GPUs. Will they even try to go up against Nvidia and AMD in the desktop market? It would shake things up if they did!
I hope my research has helped you!
Gizmodo seems to disagree with you
"So Just How Powerful Are Apple's New Laptop Chips Gonna Be" (gizmodo.com)
You are looking at a review from 2018. In 2020, Intel released new 10nm chips with a new Ice Lake iGPU. That one scores 10241 in Geekbench 5 (Metal); the high-end 13" 2020 MBP uses this 10nm chip from Intel. The iPad Pro is around 8900.
Considering that Apple ARM will be released next year, it makes sense to compare it to Intel and AMD products in 2021 at least.
Thanks.. understand that this is just guessing till we see the product.. but well.. the process of guessing and hoping is fun in its own right. 3 times the performance of the A12Z.. hmm.. this should be awesome, right? The A12Z is rated at 15W TDP. So I guess 3 times the current performance if they increase the TDP to 50W? (assuming performance scales linearly with TDP)
Again, the iPad Pro with a 2-year-old GPU scores 11k-12k in Geekbench 5 (Metal), more than that Ice Lake G7. And it's 50-60% faster in graphical benchmarks.
Tiger Lake is undoubtedly a big jump for Intel. But an 80% performance increase over Iris Plus is not enough.
Apple ARM laptops are out this year.
Seriously, please stop for a moment and check your data. Your last couple of posts were just you making random claims.
There are benchmarks indicating that Intel iGPUs are even more powerful.
I'm sure they are aware of it, yet they come to this conclusion. Wouldn't that tell you something about your data?
In fact, I have never found any publisher claiming otherwise (i.e., that an Intel iGPU is faster than the A12Z).
I'd say, let's wait for Tiger Lake then. It is supposed to be released this year (not sure when it will actually end up in laptops). It will also be interesting to see what AMD will do on the iGPU side.
It does not. Performance scales linearly with the frequency, but power dissipation grows roughly cubically with the frequency, because in order to increase the frequency you typically also have to increase the voltage (which by itself causes the power to grow quadratically). In other words, performance scales roughly with the cube root of the power.
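To put a rough number on it, here is a back-of-envelope sketch assuming dynamic power P ≈ C·V²·f with voltage scaling roughly with frequency, so P ∝ f³ and performance ∝ P^(1/3); the 15W and 50W figures are just the TDPs quoted above:

```swift
import Foundation

// Back-of-envelope: if P ∝ f³ and performance ∝ f, then performance ∝ P^(1/3).
let currentTDP = 15.0   // W, the A12Z figure quoted above
let targetTDP  = 50.0   // W, the 5600M-class budget quoted above

let speedup = pow(targetTDP / currentTDP, 1.0 / 3.0)
print(String(format: "Speedup from extra power alone: ~%.2fx", speedup))
// ~1.49x, nowhere near 3x
```

Of course real chips don't scale that cleanly (you'd add GPU cores rather than just crank the clock), but it shows why tripling the TDP doesn't mean tripling the performance.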
It is also not at all clear how far Apple's CPUs will actually be able to scale (i.e. up to which frequencies and power levels they are able to work reliably), since so far Apple has only made low-power variants. It's not as simple as slapping a bigger cooler on the CPU and pumping in more power.