I will worry about tomorrow when it comes.
For now, I will enjoy today.

The RTX 4060 will perform like an RTX 3080. Sure, the RTX 4060 will use 180-350 watts, but it will be like $450, and Apple charges $1000 to get the full 64 cores.

> Nvidia's current GPUs are based on Samsung's 8nm node; they will move to TSMC's 5nm this year. AMD will be doing the same. Both Lovelace and RDNA3 will be really powerful. Both will destroy the M1 Ultra GPU. The Achilles' heel is that the Mac Studio does not have PCIe slots allowing for AMD dGPU support. Apple will likely update the Mac Studio with an M2 Ultra in two years; by then AMD and Nvidia will have another major GPU architecture update.

Who cares?!

> The RTX 4060 will perform like an RTX 3080. Sure, the RTX 4060 will use 180-350 watts, but it will be like $450. …

At 350 W, besides the $450 you will also have to count the energy bill difference between the M1 Ultra and this. ))

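To put a rough number on that energy-bill point, here is a back-of-the-envelope sketch. Every input (hours of load per day, working days, electricity price, and both wattages) is an illustrative assumption, not a measured figure:

```python
# Back-of-the-envelope yearly electricity cost -- all inputs are assumptions.
HOURS_PER_DAY = 6      # assumed hours of GPU-heavy work per day
DAYS_PER_YEAR = 250    # assumed working days per year
PRICE_PER_KWH = 0.30   # assumed electricity price, $/kWh

def annual_cost(load_watts: float) -> float:
    """Yearly electricity cost of a component drawing `load_watts` under load."""
    kwh_per_year = load_watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

dgpu = annual_cost(350)      # the hypothetical 350 W card from the post above
m1_ultra = annual_cost(100)  # rough guess for the M1 Ultra's GPU under load

print(f"350 W dGPU:  ${dgpu:.0f}/yr")             # ≈ $158
print(f"M1 Ultra:    ${m1_ultra:.0f}/yr")         # ≈ $45
print(f"difference:  ${dgpu - m1_ultra:.0f}/yr")  # ≈ $112
```

Over a 3-4 year lifespan that is real money, though on these assumptions it does not come close to closing a large gap in purchase price on its own.
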
> An RTX 3090 already has about 36 TFLOPS of computational power.

I read somewhere that the RTX 3090's TFLOPS number is computed from the number of ALUs and the clock rate to arrive at the 36 TFLOPS. But due to the design of the architecture, some of the ALUs are dual-function: they can do either integer or floating-point math, but not both at the same time. So the theoretical 36 TFLOPS cannot be achieved.

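For reference, that is exactly how the headline figure is derived: shader ALUs × 2 FLOPs per clock (a fused multiply-add counts as two) × boost clock. The sketch below reproduces the peak numbers and then applies a deliberately crude model of the dual-function ALUs; the 25% integer mix is an assumed workload, roughly in line with the instruction mix Nvidia has cited for games:

```python
def peak_fp32_tflops(alus: int, clock_ghz: float) -> float:
    """Theoretical peak: every ALU retires one FMA (2 FLOPs) each clock."""
    return alus * 2 * clock_ghz / 1000

print(peak_fp32_tflops(10496, 1.70))  # RTX 3090:   ~35.7 TFLOPS (the marketing number)
print(peak_fp32_tflops(5120, 2.25))   # RX 6900 XT: ~23.0 TFLOPS

def effective_fp32_tflops(alus: int, clock_ghz: float, int_mix: float) -> float:
    """Crude model of Ampere's dual-function ALUs: half the FP32 lanes share a
    datapath with INT32, so each integer instruction issued takes a slot that
    could have done a floating-point op (reasonable while int_mix <= 0.5)."""
    return peak_fp32_tflops(alus, clock_ghz) * (1 - int_mix)

# Assumption: ~25% of instructions are integer work (address math, loop
# counters), roughly the mix Nvidia described in its Turing material.
print(effective_fp32_tflops(10496, 1.70, 0.25))  # ~26.8 TFLOPS actually usable
```

That is consistent with the observation a couple of posts below that a ~23 TFLOPS RDNA2 card can trade blows with the 3090 in raster work.
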
> I read somewhere that the RTX 3090 TFLOPS numbers are computed based on the number of ALUs and clock rates … so the theoretical 36 TFLOPS cannot be achieved.

On top of that, of course, the integrated memory of Apple's M1 is an advantage. 64GB or 128GB of GPU memory is quite a statement, especially at this price.

I read that this is why the AMD RX 6900 XT, with only 24 TFLOPS, can beat the RTX 3090 in some raster tasks. So it seems that comparing TFLOPS across architectures may not tell the whole picture. Add in the fact that Apple's SoC uses TBDR (tile-based deferred rendering) versus IMR (immediate-mode rendering) for the RTX 3090, and I'll not be surprised if the M1 Ultra comes out ahead in raster workloads. More important, I think, is that software is being optimised for the current generation of Apple silicon Macs to make it more performant. There is a lot of hidden potential even in the M1 SoC (e.g. efficiently using the TBDR architecture of the GPU) that has yet to be unlocked by developers.

> What I find amazing is that the M1 architecture is basically Apple's first shot at creating a desktop-class GPU, and they've done a pretty awesome job at supplying performant hardware, competing with the best discrete GPUs out there. If your use case requires this kind of power, you can buy it. Apple's GPUs are still gaining in performance significantly with each generation, and they've done a new deal with Imagination for a very nice set of raytracing technology, which I am sure we will see incorporated soon.

Yes, and think about if the M2 Pro, Max, and Ultra gain the GPU core counts and the per-core improvements from the A15.

> Both Lovelace and RDNA3 will be really powerful. Both will destroy the M1 Ultra GPU. …

Newsflash! In 7 years computers will be way faster!

> Newsflash! In 7 years computers will be way faster!

Thing is, you can't upgrade the Mac Studio; a PC or a Mac Pro you can. The Studio will be a paperweight in 3-4 years.

> The Studio will be a paperweight in 3-4 years.

It won't. It will be more like 5-7 years from initial launch, like all other non-upgradable Macs, in line with when the newest macOS can no longer legitimately be installed on it without some DosDude patch. But in any case, the Ultra version is certainly aimed at professional customers who can justify replacing it with whatever the latest thing is every three to four years.

> Both Lovelace and RDNA3 will be really powerful. Both will destroy the M1 Ultra GPU. …

Powerful at what power consumption? I believe Apple is still banking on having more performance per watt.

> Powerful at what power consumption? I believe Apple is still banking on having more performance per watt.

Power consumption does not matter nearly as much on the desktop side of things. Users who want/need more performance won't mind paying for it with more watts.

> Power consumption does not matter nearly as much on the desktop side of things. …

I'll agree that perf/watt matters the least on desktops/workstations as compared to laptops and servers, buuut it still affects things like chassis size, cooling, noise, (electricity costs), etc.

> Power consumption does not matter nearly as much on the desktop side of things. …

It does, as electricity is not free. There's also the cost of managing the heat in the room.

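To give the perf-per-watt argument a rough scale, here is a tiny sketch. The TFLOPS and board-power figures for the Nvidia and AMD cards are their published peak/TDP numbers; the M1 Ultra GPU's ~100 W draw is an estimate, since Apple does not publish one:

```python
# Peak FP32 throughput per watt of load power.
# The M1 Ultra GPU wattage is an estimate; Apple publishes no official figure.
gpus = {
    "RTX 3090 (350 W)":          (35.7, 350),
    "RX 6900 XT (300 W)":        (23.0, 300),
    "M1 Ultra 64-core (~100 W)": (21.0, 100),
}

for name, (tflops, watts) in gpus.items():
    print(f"{name}: {tflops / watts * 1000:.0f} GFLOPS/W")
# -> roughly 102, 77, and 210 GFLOPS/W: the M1 Ultra trails badly on peak
#    throughput but leads by roughly 2x on throughput per watt.
```

And, as discussed above, peak TFLOPS flatters the Ampere card, so the real-world per-watt gap will vary with the instruction mix; every watt drawn also ends up as heat the room has to absorb.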