Holy crap. Not only did the i9 obliterate the M2 Ultra, the i7 beat it. And am I understanding that right, the previous-gen i9 is basically even with the M2 Ultra?
My 12th-gen i9 [12900KS] is noticeably faster than my M1 Studio Ultra when utilising the cores [on the same apps]. Not that the Studio is a slouch.
At some point Apple will have to design a chip that is allowed to use more power, as nobody seems to care about performance per watt on desktop computers.
Except when it's a choice between graphics cards with nearly the same performance; then, for some reason, it matters.
You know where Apple could use performance/watt to some benefit? If they went into making something like a Synology. ...
Along with a low-power TV app, it could stream to Apple TV without requiring a Mac to be turned on.
That's because Cinebench has optimized SSE/AVX code, but the (ARM) NEON code is translated by a library and is not optimized.
It is to be expected.
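For anyone curious what that translation looks like: headers in the sse2neon family reimplement each x86 SSE intrinsic as an inline wrapper over a NEON intrinsic, so hand-tuned x86 SIMD compiles on ARM without a rewrite, but also without ARM-specific tuning. A minimal sketch of the mechanism, with made-up wrapper names (illustrative only, not Cinebench's actual code):

    // Sketch of an sse2neon-style translation layer (illustrative only).
    // On ARM the "SSE" operation is re-implemented with NEON; on x86 the
    // native intrinsic is used. A 1:1 mapping like this compiles and is
    // correct, but it cannot exploit NEON-first idioms or scheduling.
    #include <cstdio>

    #if defined(__aarch64__) || defined(__ARM_NEON)
      #include <arm_neon.h>
      typedef float32x4_t vec4f;
      static inline vec4f vec4_load(const float* p)     { return vld1q_f32(p); }
      static inline vec4f vec4_mul(vec4f a, vec4f b)    { return vmulq_f32(a, b); }
      static inline void  vec4_store(float* p, vec4f v) { vst1q_f32(p, v); }
    #else  // assume x86 with SSE
      #include <xmmintrin.h>
      typedef __m128 vec4f;
      static inline vec4f vec4_load(const float* p)     { return _mm_loadu_ps(p); }
      static inline vec4f vec4_mul(vec4f a, vec4f b)    { return _mm_mul_ps(a, b); }
      static inline void  vec4_store(float* p, vec4f v) { _mm_storeu_ps(p, v); }
    #endif

    int main() {
        float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8}, out[4];
        vec4_store(out, vec4_mul(vec4_load(a), vec4_load(b)));
        std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);  // 5 12 21 32
    }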
The shared memory of the M1 did a lot to boost many benchmarks. However, Apple cannot bend the laws of physics and come out with both low power usage AND top performance at heavy load. To think Apple could beat Intel, AMD and Nvidia in both areas would be unrealistic. Computer design is still the art of compromise.
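Roughly why: dynamic power scales as P = C x V^2 x f, and reaching higher clocks generally requires higher voltage, so the last GHz is disproportionately expensive. A back-of-the-envelope sketch; the capacitance constant and the voltage/frequency pairs are invented for illustration, not measurements of any of these chips:

    // Dynamic power P = C * V^2 * f: frequency costs power linearly, but the
    // voltage needed to sustain that frequency costs power quadratically.
    // All constants below are made up to show the shape of the curve.
    #include <cstdio>

    int main() {
        const double C = 2e-9;  // effective switched capacitance (arbitrary)
        struct Point { double volts, ghz; };
        const Point points[] = {{0.80, 2.0}, {1.00, 3.5}, {1.30, 5.5}};
        for (const Point& p : points) {
            double watts = C * p.volts * p.volts * (p.ghz * 1e9);
            std::printf("%.2f V @ %.1f GHz -> %.1f W/core\n", p.volts, p.ghz, watts);
        }
        // 2.75x the clock ends up costing about 7x the power per core here:
        // that is the trade desktop x86 parts embrace and Apple declines.
    }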
Good to see the competitors are kicking…
If by kicking, you mean kicking @$$, then I agree. I think the real comparison is between the M2 Ultra in the Mac Pro and the new Xeons and Threadripper CPUs that Apple could've used instead. I think there's no comparison in that situation.
Apple’s departure from Intel’s ecosystem dates back to the introduction of the T2 chip, perhaps earlier. I don’t think Apple is going back.
When it comes to workstations/raw power: NOBODY CARES ABOUT LOW POWER DRAW, and this is why Mac Silicon fails (currently).
This is still a segment where computing power is more valuable than energy efficiency.
NOBODY CARES ABOUT LOW POWER DRAW
You know where Apple could use performance/watt to some benefit? If they went into making something like a Synology. Call it a StoragePod or something like that. It could be a Home Hub, but also provide Time Machine storage. Give you your own private iCloud on your hardware with limitless iCloud@home Photos. Put some Wi-Fi AP stuff in there.
It would be a hit, and there the low power serves a function, as people would keep it on 24/7...
Of course, it's so practical there is zero chance they would do something like that.
I do. My Studio saves me ~$20 a month in electric bills compared to my iMac Pro.
So in 3 years you can put that saving of $720.00 towards a new Studio M3 or M4. I will upgrade my CPU, mem and GFX card in my workstation and save a lot more than a maxed-out new M4 Studio will cost.
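For what it's worth, the arithmetic is in the right ballpark under plausible assumptions; the ~100 W average draw gap and the electricity rate below are assumptions, not measured numbers for either machine:

    // Rough check of the "$20/month, so $720 over 3 years" claim.
    #include <cstdio>

    int main() {
        const double delta_watts  = 100.0;     // assumed extra average draw, 24/7
        const double hours_per_mo = 24 * 30;   // 720 h
        const double usd_per_kwh  = 0.30;      // assumed local rate
        double kwh_per_mo = delta_watts * hours_per_mo / 1000.0;  // 72 kWh
        double usd_per_mo = kwh_per_mo * usd_per_kwh;             // ~$21.60
        std::printf("~%.0f kWh/month -> $%.2f/month, $%.2f over 3 years\n",
                    kwh_per_mo, usd_per_mo, usd_per_mo * 36);
    }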
I will upgrade my CPU, mem and GFX card in my workstation and save a lot more than a maxed-out new M4 Studio will cost. ... Power efficiency is not a priority in the workplace.
For a Pro machine, it needs to be upgradable.
That just is not how real professionals work. They purchase or lease machines with support. No one who is getting paid to use a computer can afford to sit around while they diagnose a problem with a machine. That is someone else’s problem.
This is still a segment where computing power is more valuable than energy efficiency.
Data centers tend to look for ways to cut down on overhead, one of which is power consumption via performance per watt. This is especially important in locations with very high $/kWh.
When you move up to HPC systems, energy efficiency rules. The fastest systems on the Top500 list use the most power, more than 20 MW, and at the same time they are among the most energy efficient. Energy efficiency has improved more than 20-fold in the last 10 years: in 2013 the best system got 3 Gflops/W; today that has increased to 65 Gflops/W. No one in their right mind buys an old HPC system; the electricity bill is horrific.
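Putting those two efficiency figures in money terms shows why: the Gflops/W numbers are the ones above, and the $0.10/kWh industrial rate is an assumption:

    // Electricity cost of sustaining 1 Pflop/s year-round at 2013-class
    // (3 Gflops/W) versus current (65 Gflops/W) efficiency.
    #include <cstdio>

    static double annual_usd(double gflops_per_watt) {
        const double target_gflops = 1e6;   // 1 Pflop/s = 1e6 Gflops/s
        const double usd_per_kwh   = 0.10;  // assumed industrial rate
        double kw = target_gflops / gflops_per_watt / 1000.0;
        return kw * 24 * 365 * usd_per_kwh;
    }

    int main() {
        std::printf("3 Gflops/W:  about $%.0f/year\n", annual_usd(3.0));   // ~$292,000
        std::printf("65 Gflops/W: about $%.0f/year\n", annual_usd(65.0));  // ~$13,500
    }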
Slightly out of context, but applies as well to upgrades.
Absolutely. None of the companies with whom I work buys third-party RAM or does CPU/GPU upgrades over the life of a machine. What good would AppleCare+ be if, in order to use it, one needed to first remove the third-party RAM? While adding RAM would not void the coverage, Apple requires it be removed before service. Just not worth the hassle/risk.
At some point Apple will have to design a chip that is allowed to use more power, as nobody seems to care about performance per watt on desktop computers.
At some point these desktop chips are not going to be able to take the shortcut of increasing heat to increase performance. All the heatsink technology in the world won't prevent them from hitting a thermal wall. They will have to reengineer, and Apple is well ahead in that game.
At some point these desktop chips are not going to be able to take the shortcut of increasing heat to increase performance. ...
Apple's limited to the Mac segment of the whole PC market.
Data centers tend to look for ways to cut down on overhead, one of which is power consumption via performance per watt. ...
Yes, but it is a hard area to enter; it is so much more than CPU efficiency: interconnect, coprocessors, storage, and not least how well it handles the type of calculation being made. It also must run very well on Linux. Usually it is a question of how much data the system can crunch per hour for X M$, including overhead.
A hope I had for Apple was for them to enter the server space with Apple Silicon. The power savings it brings would be enticing.