
R!TTER

macrumors member
Jun 7, 2022
58
44
Neither AMD nor Intel has that level of knowledge or flexibility.
Yet, they're both getting dedicated accelerators for the next generation, with Xilinx IP being used in AMD and something similar for Intel. By the way, how's that AV1 hardware encoder/decoder working out for Apple? Oh wait!
Well, without a price or release date, it doesn’t look like they’re coming ANYWHERE anytime soon :)
They'll be here eventually once the actual chips hit retail. As for dates, they're already coming for enterprise customers -

Prices will definitely be very high because they'll be rather niche as of now -
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
I see no evidence for this. If that were true, why bother with the prosumer chips, scalable interconnects, miniLED technology, and Thunderbolt? They are literally spending billions to cater to those users you say they don’t care about.
Anyone can use Thunderbolt. Anyone can appreciate miniLED (as they do in the iPad). Anyone can appreciate systems with more performance. One does not have to be a “Mac developer developing non-Mac apps” for those features to appeal to the vast majority of buyers. Apple is providing purchasers of the lowest end M2 MacBook Air with hardware-accelerated video enabling multiple streams of high resolution video editing for the cost of the machine. Do those users even understand what that means? No, but it’s still there for their benefit.

They haven’t even deemed the Mac Pro important enough to release an Apple Silicon version of it, that’s just how focused they are on your average buyer.

It doesn’t matter how their revenue is structured, what matters is their culture and the brand perception. The Mac is central to all of that. And they are very much a computing company at heart.

I mean, by that logic you must conclude that Microsoft doesn’t care about Windows since it’s just 15% of their revenue.
Ask someone born within the last 12 years to “name a product made by Apple.” The answer may well shock you! The Mac isn’t as central as you think it might be. Does Microsoft not care about Windows? No. Is Windows “not as important to them as it once was”? Absolutely. Same with the Mac.

Apple never lost the PC war.
An ex-Apple employee might disagree on that point.
Back in 1996, Steve Jobs declared, "The PC wars are over. Done. Microsoft won a long time ago." Winning a majority of the PC market from that point on was never Apple’s goal; the goal was to be profitable on the sliver of the market they have. And they HAVE been: profitable enough to continue to manufacture and sell 20-30 million Macs a year, mostly laptops.
 
  • Like
Reactions: eltoslightfoot

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
Yet, they're both getting dedicated accelerators for the next generation, with Xilinx IP being used in AMD and something similar for Intel. By the way, how's that AV1 hardware encoder/decoder working out for Apple? Oh wait!
By knowledge I mean that there’s no way for AMD or Intel to know exactly which systems their solutions are expected to go in. Apple’s chip makers know the dimensions of the systems (and even have a say in them), they know ALL the compilers that will be used to write software (and have a say in that), and they are even aware of the specific APIs their solution is expected to support (again, with input into that as well). AMD and Intel will never have that level of knowledge. And they’ll never have the flexibility to do something like announce “no more 32-bit instructions.” They will always have to support the years and years of cruft that are out there. It doesn’t matter what they try to shoehorn in; it’ll be crippled by the same adherence to backwards compatibility that stifles mobile innovation.

They'll be here eventually once the actual chips hit retail. As for dates, they're already coming for enterprise customers -

Prices will definitely be very high because they'll be rather niche as of now -
Neither story lists costs, so it’s a stretch to say they’re “coming” when prospective buyers don’t even know how much they’ll cost. So, Apple won’t have access to super-hot, heatsink-required, super-expensive memory? I think they’ll survive. Especially as, for those who need it, Apple has 400 GB/s memory on the low-end Mac Studio and 800 GB/s on the high-end Mac Studio. PCIe 5 comes close, though. At 13 and 14 GB/s. :)
 

pshufd

macrumors G4
Oct 24, 2013
10,147
14,573
New Hampshire
By knowledge I mean that there’s no way for AMD or Intel to know exactly which systems their solutions are expected to go in. Apple’s chip makers know the dimensions of the systems (and even have a say in them), they know ALL the compilers that will be used to write software (and have a say in that), and they are even aware of the specific APIs their solution is expected to support (again, with input into that as well). AMD and Intel will never have that level of knowledge. And they’ll never have the flexibility to do something like announce “no more 32-bit instructions.” They will always have to support the years and years of cruft that are out there. It doesn’t matter what they try to shoehorn in; it’ll be crippled by the same adherence to backwards compatibility that stifles mobile innovation.


Neither story lists costs, so it’s a stretch to say they’re “coming” when prospective buyers don’t even know how much they’ll cost. So, Apple won’t have access to super-hot, heatsink-required, super-expensive memory? I think they’ll survive. Especially as, for those who need it, Apple has 400 GB/s memory on the low-end Mac Studio and 800 GB/s on the high-end Mac Studio. PCIe 5 comes close, though. At 13 and 14 GB/s. :)

Apple makes iMovie and FCP and can add silicon that specifically benefits those two programs and others that take advantage of whatever they use for an API.
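The pattern described here can be sketched roughly as follows. None of these names are real Apple APIs; it's only a hypothetical illustration of how a framework can route the same call to dedicated silicon when present and fall back to software otherwise, so apps like FCP benefit without changing their code.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical sketch (invented names, not Apple's real APIs) of how a
 * media framework can expose one entry point that uses a dedicated
 * media engine when the SoC has one, with a software fallback. */

typedef struct { int width, height; } frame_t;

static bool hw_encoder_present(void) { return true; }  /* pretend probe */

static int encode_sw(const frame_t *f) {
    printf("software encode %dx%d\n", f->width, f->height);
    return 0;
}

static int encode_hw(const frame_t *f) {
    printf("media-engine encode %dx%d\n", f->width, f->height);
    return 0;
}

/* Apps call this; the routing decision is the framework's problem. */
int encode_frame(const frame_t *f) {
    return hw_encoder_present() ? encode_hw(f) : encode_sw(f);
}
```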
 

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
Apple could find out via analysis that there’s one process that’s in a lot of the code compiled for the app store that, if sped up, could be an improvement across the board, design the part for that process, and have it released in the next iteration.

AAUI, they did in fact already, probably long ago, add some kind of feature in there that makes Objective-C object-method calls happen almost as fast as a straight subroutine call. For macOS and iOS, that may be a massive performance gain that someone like Intel or AMD would simply not put into a mainstream CPU.
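The effect described can be illustrated (very loosely, and not as Apple's actual mechanism) with a toy one-entry inline cache in C: the call site remembers the last method it looked up, so repeated dynamic "sends" skip the lookup and cost roughly a direct call. `send_msg`, `lookup`, and the method table are all invented for the sketch; real Objective-C uses uniqued selector pointers rather than string compares.

```c
#include <string.h>

typedef int (*imp_t)(int);
struct method { const char *name; imp_t imp; };

static int double_it(int x) { return x * 2; }
static int negate(int x)    { return -x; }

static struct method table[] = {
    { "double", double_it },
    { "negate", negate },
};

/* Slow path: linear search, a stand-in for a real method lookup. */
static imp_t lookup(const char *name) {
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].name, name) == 0)
            return table[i].imp;
    return 0;
}

/* "Message send" with a one-entry cache at the call site. */
int send_msg(const char *name, int arg) {
    static const char *cached_name;
    static imp_t cached_imp;
    if (!cached_name || strcmp(cached_name, name) != 0) {
        cached_imp  = lookup(name);   /* miss: do the full lookup */
        cached_name = name;
    }
    return cached_imp(arg);           /* hit: ~the cost of a direct call */
}
```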

My kink would be hardware-based memory resource management, where malloc(), realloc(), free(), etc, would be handled by a logic block, with almost no software involvement. That could lead to really big performance gains, I think.
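To give a rough sense of what such a logic block would have to do, here is a minimal software bump allocator: the pointer arithmetic and bounds check below are exactly the kind of fixed, simple bookkeeping that could plausibly move into hardware. A real allocator (software or hardware) would also need free lists, size classes, and concurrency handling.

```c
#include <stddef.h>
#include <stdint.h>

/* Minimal bump allocator over a static pool. Sketch only. */
#define POOL_SIZE 4096

static uint8_t pool[POOL_SIZE];
static size_t next_free = 0;

void *bump_alloc(size_t n) {
    n = (n + 15) & ~(size_t)15;              /* round up to 16-byte alignment */
    if (next_free + n > POOL_SIZE) return NULL;
    void *p = &pool[next_free];
    next_free += n;
    return p;
}

void bump_reset(void) { next_free = 0; }     /* "free everything" at once */
```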

You have to look around at the little stuff. Just sweep the floor and go through the dustpan, you will find something good.
 
  • Like
Reactions: Unregistered 4U

altaic

Suspended
Jan 26, 2004
712
484
AAUI, they did in fact already, probably long ago, add some kind of feature in there that makes Objective-C object-method calls happen almost as fast as a straight subroutine call. For macOS and iOS, that may be a massive performance gain that someone like Intel or AMD would simply not put into a mainstream CPU.

My kink would be hardware-based memory resource management, where malloc(), realloc(), free(), etc, would be handled by a logic block, with almost no software involvement. That could lead to really big performance gains, I think.

You have to look around at the little stuff. Just sweep the floor and go through the dustpan, you will find something good.
WTFDAAUIM? Apple Attachment Unit Interface? Automobile Association of Upper India?
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Neither AMD nor Intel has that level of knowledge or flexibility.
Hardware accelerators make more sense in phones than in laptops or desktops, which is why Apple started using them first. But others are catching up very quickly. For example, Intel has developed oneAPI, probably the best API for heterogeneous computing, and will soon include some purpose-built hardware accelerators.

[Image: Intel-Meteor-Lake-VPU.png]

 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Which one do you think is the best?

I have no idea. Personally, I think these kinds of initiatives are a fool’s errand. Common abstractions end up costing a lot. I’m in favor of having high-level frameworks that fall back on vendor-specific stuff. But that’s just my opinion; I don’t have a horse in this race, as it’s hardly my field.
 

pshufd

macrumors G4
Oct 24, 2013
10,147
14,573
New Hampshire
I have no idea. Personally, I think these kinds of initiatives are a fool’s errand. Common abstractions end up costing a lot. I’m in favor of having high-level frameworks that fall back on vendor-specific stuff. But that’s just my opinion; I don’t have a horse in this race, as it’s hardly my field.

Apple really has a big advantage in owning the whole stack in a good chunk of cases or dictating development rules where they have frameworks. But you shouldn't diss Intel. They come up with some truly fantastic technologies like Optane to provide efficiency and performance to their customers for the next decade.

Oh.

Ooops.

Intel's Q2 2022 earnings report today was uncharacteristically disappointing, but it also hid a new announcement: Intel is ending its Optane business entirely. During the earnings call, Intel CEO Pat Gelsinger clarified the vaguely worded announcement in the earnings documents, confirming that Intel will wind down its Optane business. The move incurs a $559 million inventory impairment/write-off. We reached out to Intel for comment on the matter:


This has always been a tech-company earnings strategy. If you're going to have a really bad quarter, just take every loss that you can; your stock will get buried anyway, and then you don't have that fantastic-technology albatross hanging around.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Apple really has a big advantage in owning the whole stack in a good chunk of cases or dictating development rules where they have frameworks. But you shouldn't diss Intel. They come up with some truly fantastic technologies like Optane to provide efficiency and performance to their customers for the next decade.

Oh.

Ooops.

Intel's Q2 2022 earnings report today was uncharacteristically disappointing, but it also hid a new announcement: Intel is ending its Optane business entirely. During the earnings call, Intel CEO Pat Gelsinger clarified the vaguely worded announcement in the earnings documents, confirming that Intel will wind down its Optane business. The move incurs a $559 million inventory impairment/write-off. We reached out to Intel for comment on the matter:


This has always been a tech-company earnings strategy. If you're going to have a really bad quarter, just take every loss that you can; your stock will get buried anyway, and then you don't have that fantastic-technology albatross hanging around.
Intel seems to think CXL can take its place. I just assume they couldn't get OEM buy-in for Optane to flourish...
 

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
Oracle was a user of Optane, and they would have had a customer base of Oracle Cloud systems for it. I guess it wasn't enough, though.

In theory, Optane could be used as a plug-in replacement for a NAND flash SSD, with some improvement in speed and durability. But Intel was more interested in establishing the paradigm of memory = storage, which most systems are, I think, not quite ready to implement. Add to that the fact that Optane raises your power budget, which makes it less appealing for anything that spends significant time not plugged into the wall, which is a lot of things.
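The memory = storage paradigm can be sketched with POSIX `mmap`: persistent data lives in the address space and is updated with ordinary stores rather than `read()`/`write()` calls. The sketch below uses a regular file for portability; on a persistent-memory DAX mount, the same code would touch the media directly.

```c
#include <fcntl.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

/* Map a file into memory, mutate it with plain stores, flush, unmap.
 * Returns 0 on success, -1 on any failure. */
int persist_demo(const char *path) {
    int fd = open(path, O_RDWR | O_CREAT, 0644);
    if (fd < 0) return -1;
    if (ftruncate(fd, 4096) != 0) { close(fd); return -1; }

    char *mem = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { close(fd); return -1; }

    strcpy(mem, "durable state");   /* a plain store, no write() call */
    msync(mem, 4096, MS_SYNC);      /* flush to the backing media */

    munmap(mem, 4096);
    close(fd);
    return 0;
}
```

On byte-addressable persistent memory, the `msync` step shrinks to cache-line flushes, which is where the power and heat trade-offs mentioned above come in.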
 

altaic

Suspended
Jan 26, 2004
712
484
In theory, Optane could be used as a plug-in replacement for a NAND flash SSD, with some improvement in speed and durability. But Intel was more interested in establishing the paradigm of memory = storage, which most systems are, I think, not quite ready to implement. Add to that the fact that Optane raises your power budget, which makes it less appealing for anything that spends significant time not plugged into the wall, which is a lot of things.
Curious, how does Optane compare with RAM or SSDs power-wise? Both of those get bloody hot under high throughput. Is it actually worse?

I always imagined Optane being great for data center applications, so I’m surprised they killed it during the great chip shortage— all while they apparently had excess stock to claim a big loss for tax purposes.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,892
Singapore
Maybe they mean it is missing feature(s) that the competition has?
The only feature I can think of is gaming, which we all know Apple has made a conscious decision not to support. The end result is that people instead get a laptop with long battery life that is also capable of great, sustained performance even when not plugged into an external power source. For the target market, I'd say the tradeoff is more than acceptable.

Which is more than can be said for pretty much every other Windows laptop out there.
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
It's not gimped? Oh, my bad then, guess it handles games perfectly fine, right?
Games designed for Apple Silicon, then yes, perfectly fine. However, I suspect you're tongue-in-cheek referring to game availability, not the functions of the actual silicon. If the hardware were gimped, then they'd be in the same situation that Intel is apparently facing with Alchemist.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Games designed for Apple Silicon, then yes, perfectly fine. However, I suspect you're tongue-in-cheek referring to game availability, not the functions of the actual silicon. If the hardware were gimped, then they'd be in the same situation that Intel is apparently facing with Alchemist.
I don't think Arc's hardware is gimped. They certainly have driver problems, though.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
The only feature I can think of is gaming, which we all know Apple has made a conscious decision not to support.

Apple has invested considerable resources into building gaming-grade GPUs and APIs. I think you are confusing gaming capability with game availability. There are not many high-quality games available for the Mac, that is true, but even a passively cooled M1 (not to mention the M2) is a fairly capable everyday gaming machine.
 