GPU drivers are still being worked on.

No GPU acceleration, sound, video decode, etc. The GPU driver isn't trivial to reverse engineer, so how many more years will it take? It's not as complete or as open as the Raspberry Pi. A VM is second-class compared to bare metal due to limitations or a lack of hardware passthrough. And what happens if Apple decides to slap them with a cease and desist, as it did with Corellium?

https://github.com/AsahiLinux/docs/wiki/Tasks
 
When Apple cranked the entry price of the Mac Pro up 100% (MP 2013 entry -> MP 2019 entry), Apple paid a price in unit volume. Your claim that all of the Apple customers in a product segment are perfectly elastic in their ability to pay doesn't match reality.
I did not claim anything even remotely like that.

I claimed that if Apple orders components for N devices, it tries to set the prices to the level where there is barely enough demand for N devices. If Apple gets the components for free but cannot order more in the near term, the prices will still be just as high, because someone is willing to buy the devices for that price. And if the costs are suddenly 50% higher due to an unpredictable event that affects Apple but not its competitors, Apple cannot pass the costs to customers. The prices are already as high as they can be and further price hikes would reduce the demand to below N devices.

Prices depend on supply and demand, and indirectly on the competition. Costs only determine whether the product is commercially viable at the price you are able to sell it for.
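The pricing argument above can be put in a minimal sketch: with a fixed component order for N units, the seller picks the highest price at which demand still clears N units, and costs only gate viability. The demand curve and all numbers below are made up for illustration, not Apple's actual figures.

```python
# Illustrative sketch of the argument above: if the component order
# covers N devices, the price is set as high as demand allows, and
# costs never enter the price, only the viability check.

def demand(price):
    """Hypothetical linear demand curve: units buyers take at a given price."""
    return max(0, 10_000_000 - 4_000 * price)

def clearing_price(n_units, prices=range(0, 3000)):
    """Highest price at which demand is still >= n_units."""
    return max(p for p in prices if demand(p) >= n_units)

N = 2_000_000
price = clearing_price(N)   # identical whether the components were free or not

def viable(unit_cost):
    """Costs only decide if the product makes money at the clearing price."""
    return price > unit_cost

print(price)        # market-clearing price for N units
print(viable(500))  # commercially viable at that price?
```

Note that raising `unit_cost` by 50% changes `viable`, not `price`: a further price hike would push demand below N, which is exactly the point being made.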
 
No GPU acceleration, sound, video decode, etc. The GPU driver isn't trivial to reverse engineer, so how many more years will it take? It's not as complete or as open as the Raspberry Pi. A VM is second-class compared to bare metal due to limitations or a lack of hardware passthrough. And what happens if Apple decides to slap them with a cease and desist, as it did with Corellium?

https://github.com/AsahiLinux/docs/wiki/Tasks
A GPU driver is already available.


And that was in May of last year.
 
Nope. The M1 Max uses 0.2 watts at idle, and it only has 2 E cores. The M1 and A15 should have even better idle usage, since they have 4 E cores.

To be expected, since it's stripped of the native x64 compatibility jewels and access to the largest software ecosystem, but is being ~0.2 W lighter worth it?

If you look at Apple's history, Macs at the end of 2006 faced dire market share and recovered with the switch to x64. Why risk repeating that instead of just continuing to offer two product lines, a refreshed x64 MacBook Pro(fessional) and an M1 MacBook Pro(sumer)?
 
To be expected, since it's stripped of the native x64 compatibility jewels and access to the largest software ecosystem, but is being ~0.2 W lighter worth it?

You don’t get it. Those 0.2 watts include the RAM and the SSD controller. That’s how an M1 Mac can reach 15+ hours of battery life despite a power-hungry display, while ADL struggles to get half of that with a full-HD panel.

If you look at Apple's history, Macs at the end of 2006 faced dire market share and recovered with the switch to x64. Why risk repeating that instead of just continuing to offer two product lines, a refreshed x64 MacBook Pro(fessional) and an M1 MacBook Pro(sumer)?

Because their professional-series debut is already more professional than anything on the x86 market. Why would I trade my M1 for a slower (for my usage, at least) and more power-hungry Alder Lake?
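The battery-life point above can be checked with back-of-the-envelope arithmetic. The battery capacity and display draw below are assumed round numbers for illustration, not Apple's published specs; only the 0.2 W idle figure comes from the discussion.

```python
# Rough check of the 15+ hour battery-life claim: at a 0.2 W SoC idle
# (including RAM and SSD controller), the display dominates the draw.

battery_wh = 70.0   # assumed ~70 Wh pack (illustrative, not a spec)
soc_idle_w = 0.2    # SoC idle draw quoted above
display_w = 4.0     # assumed draw of a bright high-res panel

avg_draw_w = soc_idle_w + display_w   # light-load average draw in watts
hours = battery_wh / avg_draw_w
print(f"{hours:.1f} h")               # the display, not the SoC, sets the floor
```

With these assumed numbers the result lands in the high teens of hours, which is why a competing SoC that idles even a few watts higher cannot get close, regardless of its peak performance.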
 
Why would I trade my M1 for a slower (for my usage, at least) and more power-hungry Alder Lake?
inb4 he brings out the Cinebench R24 screen from HWUB.

Also, inb4 some mental gymnastics about “muh games” or “muh compatibility with Windows”. It’s old, it’s wrong, and it’s annoying.
 
I think he’s trying to insult me by comparing me to Keller or something, but given that both Jim and I worked on the same products, I think I come out strong.
Somehow he knows better, he just does, that x86 is superior to anything Apple came up with.

We, AnandTech, the tech press, are just too blinded by our fanboyism to see it.
 
Will be interesting to see what happens in 10 years.
I'll still be running mostly Windows machines that run now-old software, and I'll be happy about it.

Meanwhile, after another Apple processor change (maybe deprecating 64-bit?), all old Mac software will be irrelevant once again and nobody will care; after all, it's all about advancing the hardware (same as 10 years ago).

:)



Actually, I want a direct mind/computer interface and AI software assistants to write software for any task I might want to do.
 
That's the OpenGL API, not a GPU-accelerated driver, and OpenGL is being superseded by the Vulkan API anyway. Look at the driver-status link, updated 12 days ago.
"There's no Vulkan API" is a long way from "Reverse engineering a GPU is impossible; it will take years!"

Also, it is a GPU-accelerated driver. That's why it's an AGX driver.

You can read the blog post here, which explains how it's a GPU-accelerated driver:
 
I'll still be running mostly Windows machines that run now-old software, and I'll be happy about it.

Meanwhile, after another Apple processor change (maybe deprecating 64-bit?), all old Mac software will be irrelevant once again and nobody will care; after all, it's all about advancing the hardware (same as 10 years ago).
Hopefully, in 10 years, whether you're running Windows or some other OS, the hardware will be capable of emulating any older platform flawlessly (or, with all the inherent flaws of the original hardware), via sheer compute power or through the use of reprogrammable hardware (e.g., integrated FPGAs).
 
Intel acquired FPGA maker Altera, while AMD acquired Xilinx. What's Apple's FPGA roadmap? An FPGA doesn't synthesize away software memory leaks, though.
 
Hopefully, in 10 years, whether you're running Windows or some other OS, the hardware will be capable of emulating any older platform flawlessly (or, with all the inherent flaws of the original hardware), via sheer compute power or through the use of reprogrammable hardware (e.g., integrated FPGAs).
That would certainly be nice!
 
Why does Apple need FPGA on their roadmap?

Performance-wise, ASIC (non-reprogrammable) > FPGA > GPU > CPU. Plus, why would you not want, for example, reconfigurable hardware such as a video decoder/encoder that can move from H.264 to VP9 to AV1 to AV2 and future codecs without buying new hardware? Never mind, I answered my own question: Apple's FPGA roadmap is just "buy new hardware."
 
Performance-wise, ASIC (non-reprogrammable) > FPGA > GPU > CPU. Plus, why would you not want, for example, reconfigurable hardware such as a video decoder/encoder that can move from H.264 to VP9 to AV1 to AV2 and future codecs without buying new hardware? Never mind, I answered my own question: Apple's FPGA roadmap is just "buy new hardware."

So Apple would buy a COTS FPGA and put it in a product like its Afterburner card, maybe? Why would they need to manufacture them?
 
Performance-wise, ASIC (non-reprogrammable) > FPGA > GPU > CPU. Plus, why would you not want, for example, reconfigurable hardware such as a video decoder/encoder that can move from H.264 to VP9 to AV1 to AV2 and future codecs without buying new hardware? Never mind, I answered my own question: Apple's FPGA roadmap is just "buy new hardware."

FPGAs are not faster than GPUs or CPUs. What are you saying?

We use FPGAs to logically verify GPUs and CPUs, and they operate at a fraction of the speed.
 