
koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
No, I understand. I'm just keeping my feet on the ground. You can dream all you like but you need to keep perspective, this is in no way definitive of any real purchasable product to be available in the near term.
And who said that there is?

I think people missed the most important part of my posts. We will get an AMD CPU-based Mac when they have USB4, at the earliest. Not before then.

But the Van Gogh is still problematic for this view.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Threadripper meant to compete with HEDT. EPYC would be comparable to Xeon tho.

Intel's Xeon lineup extends well past the Xeon SP variants: Xeon W, Xeon E, and Xeon D. Xeon SP does pair up with EPYC, but Threadripper doesn't play in the zone that Xeon W 2200 and 3200 cover. Xeon E is basically the mainstream Core i CPUs with ECC turned on (just as the high-core-count HEDT -X parts are W chips with ECC turned off). Xeon D is for network and edge servers.

Intel just makes a wider variety of stuff than AMD does. AMD has basically outsourced even chipset development (they are smaller and weren't financially profitable over the last 4-6 years, so that made sense, but it illustrates that they don't do anywhere near the scope of what Intel does).


Also, the macOS Catalina 10.15.2 beta added AMD APU code names, which means Macs will eventually use AMD CPUs.
...

Those lists included Picasso (a 2018 chip). Do you really think Apple is coming out with a 2018 AMD chip in 2020? Extremely unlikely. There is more than a decent chance they are screwing with Intel: they moved some stuff off a "skunk works" branch and onto the mainline as a negotiating point. ("Really serious about leaving, so we want a much bigger discount to stay.")

It could be that some work done on Picasso is a foundational layer for what is in Van Gogh, so an APU that will never see the light of day gets dragged in. That would be sloppy code practice, but macOS Catalina was also a "hot mess" on release, so it is possible.

AMD is possible, and Apple could be leaning that way for 2H 2020 (or later) products. But it isn't the slam dunk that folks are trying to spin it as. Van Gogh isn't necessarily solely commissioned by Apple. Aspects of Van Gogh look like they would suit just a single Mac product, and single Mac products aren't a particularly good match for significantly custom CPUs. (And Apple would probably be more than happy to share the R&D costs as long as they were "first in line" to get as many as they wanted.)
2) APPLE ORDERED THE DESIGN OF VAN GOGH! Nobody else could, and it won't appear anywhere else if it is a product for Apple, which everything points to. Do you get it now, what Van Gogh is and what is happening?

Nobody else could? There is approximately zero evidence of that so far, other than lots of hand waving.
 
  • Like
Reactions: thisisnotmyname

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Nobody else could? There is approximately zero evidence of that so far, other than lots of hand waving.
Who else could order the Vega M GPU, other than Intel? And in which products, other than Intel's, did it appear?
Who else could order the Xbox and PS4 APUs, other than Microsoft and Sony? And in which products have we seen them, apart from Microsoft's and Sony's?
Who else could order the semi-custom APU for that Chinese console/computer from a few years ago? And in whose products have we seen it?

Semi-custom means the product appears ONLY in products offered by the vendor that ordered it. That is why it is called semi-custom.

If it was Apple who ordered Van Gogh, then kexts for it in the macOS Catalina beta mean one thing: Apple has working hardware and needed an OS to test it. The fact that other APUs are in Catalina means Apple is testing how the OS and apps run on AMD hardware, particularly the CPUs.

Denying that is at the very least ridiculous. Ordering the design of this APU and then not putting it in the computers would mean that roughly $200 million in physical design and manufacturing costs was flushed down the toilet by Apple. Good luck proving that's the natural way Apple works.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
....

AMD aren't standing still on process tech either. They're using TSMC, who have 7nm+ at an advanced stage and are working on 5nm right now. By the time Intel gets 10nm shipping in volume, AMD will be on TSMC 7nm+ or 5nm.

Intel didn't stop 7nm work when 10nm went sideways. As they made adjustments to 10nm for being overzealous with density targets, those probably got folded into 7nm. However, they are substantially concurrent efforts (different fab tools). TSMC's 5nm "right now" work is right up there with Intel's 7nm "right now" work.

Basically intel need to do a ground up re-design (like AMD did with Zen, which took 4-6 years) to fix the various problems they have with core count scaling and being able to build many smaller dies instead of monolithic dies to improve yield to be able to compete.

Chuckle. Back in February 2017:

"..Update 1: On speaking with Diane Bryant, the 'data center gets new nodes first' is going to be achieved by using multiple small dies on a single package. But rather than use a multi-chip package as in previous multi-core products, Intel will be using EMIB as demonstrated at ISSCC: an MCP/2.5D interposer-like design with an Embedded Multi-Die Interconnect Bridge (EMIB). ..."
https://www.anandtech.com/show/1111...n-core-on-14nm-data-center-first-to-new-nodes

That was almost three years ago, so 4 years (if started before talking about this in 2017) would be around 2H 2020. The notion that AMD is more than a couple of years ahead here because Intel was "too dumb to notice" is misguided. Intel certainly has some things to fix, but it wasn't as if they were totally off the road and in the swamp. They coupled 2-3 too many things to 10nm (not just a shrink but other "insanely great" ideas, and it got too complicated). Throwing out the decoupling of complexity that was the foundation of "tick-tock" was a bad idea.

The notion that Intel can't build a very large number of smaller dies is comical. The volume of mainstream Core i parts Intel ships is way higher than AMD's. If packaging the dies were slow and cumbersome, that would be a problem, but cranking out smaller dies in volume is not a huge "ask" for Intel.


The Skylake/Cascade Lake high-core-count die had 18 cores. I suspect they hoped to get those 18 cores onto a smaller 10nm die. One recent roadmap slide for Ice Lake SP maxes out at 26 cores, so Intel probably slid back to two 14-core dies with 13 active each (2 spares per package; 2 × 13 = 26). [Outside chance that this is three 10-core dies with more spares (4 spares across three dies).] But 2-3 dies would fit the plan that Intel laid out a couple of years ago.
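The arithmetic behind that guess can be sketched quickly. Note the die and active-core counts below are the speculation from this post, not confirmed Intel specs:

```python
# Enumerate uniform multi-die layouts that reach the rumored 26-core
# Ice Lake SP package. Die/core counts are speculative, per the post above.
TARGET_CORES = 26

fits = [
    (dies, active)
    for dies in (2, 3)            # plausible die counts per package
    for active in range(8, 16)    # active cores per die
    if dies * active == TARGET_CORES
]

# With 14 physical cores per die, 13 active leaves 1 spare per die.
print(fits)  # only 2 dies x 13 active cores divides 26 evenly
```

The three-die option only works with uneven active-core counts per die, which is why the two-die reading is the cleaner fit.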

They won't win the max-core-count "war", but they'll have a usable product that would work fine in a Mac Pro, or probably an iMac Pro with an update.


They have been caught napping by AMD in terms of design - who are currently firing on all cylinders, and 10nm is a disaster.

AMD's smaller chiplet core count helps, but it isn't as if they don't have issues at the moment either. The 64-core TR3 is missing in action until later, and TR3 slid pretty far past the EPYC launch. AMD doesn't have the wafer-start bandwidth to close the door on Intel; they can't do anywhere near the quantity that Intel can.

The 10nm is substantially different from what Intel was talking about several years ago. But server products coming in at 10nm+ is about what Intel laid out for servers years ago. The yield probably isn't as much of a problem as the wafer processing rate; AMD can't fill the supply gap that Intel is leaving, either.


Things are going to be ugly for intel for some time. They have the cash to (eventually) dig themselves out most likely, but all the money in the world simply can't buy development time.

But they can use that money to pay off system vendors to buy their product at cheaper rates. AMD really can't afford a price war, so they aren't likely to respond with a deep undercut on prices. AMD will take some share, but nowhere near most of it.

AMD doesn't have enough money to outbid Apple for wafer starts. AMD will get more next year, but if it can't buy enough manufacturing capacity, AMD is limited too.
 
Last edited:
  • Like
Reactions: Zdigital2015

Pro7913

Cancelled
Sep 28, 2019
345
102
It is possible. Intel had Kaby Lake-G with Vega M well before AMD's own APUs had Vega graphics. Then again, it probably won't mean much more than placeholders for something else.

I am very aware current Ryzen APUs are on 12nm. But the Ryzen APUs' problem is not what node they are on; they keep drawing higher-than-idle power, and that has always been a problem for AMD's laptop APUs. I don't expect 7nm to be much different, and Ice Lake mobile showed great gains in power efficiency. So I don't expect Apple to move to Ryzen APUs, and even if they switch, they will move to their own ARM solution.

The 10nm delay has been well known since 2016. But this time Intel actually has product out that is available to purchase. There is a big difference between only talking about it and actually having a product out; it means the 10nm situation is actually improving.

It really doesn't matter if 10nm performance isn't as good as their 14nm (actually, no process beats Intel's in terms of maximum clock speed). Intel needs 10nm for its density, so it can increase the core count without sending power consumption through the roof. Willow Cove is going to be a major improvement over Skylake in IPC, so it should be fine even with some clock-speed sacrifice.

They don't draw too much power with the 3000 series.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
They don't draw too much power with the 3000 series.

For desktops? Relative to higher-core-count Intel solutions, yes. But otherwise, not really.

Ryzen 3 3200GE: 35W, 4 cores (8.75W/core ... yes, there is a Vega 8 in there too, but it is also just a Vega 8 compared to what Apple is using)

Ryzen 5 3600: 65W, 6 cores (10.8W/core)

Ryzen 7 3700X: 65W, 8 cores (8.1W/core)
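The per-core figures are just rated package TDP divided by core count. A quick sketch of that arithmetic (the TDPs are AMD's rated numbers for those SKUs, not measured draw):

```python
# Rough TDP-per-core arithmetic for the Ryzen 3000 SKUs cited above.
# TDP is the rated package figure; it is shared with the iGPU where present.
skus = {
    "Ryzen 3 3200GE": (35, 4),  # 35W TDP, 4 cores, Vega 8 iGPU on package
    "Ryzen 5 3600":   (65, 6),  # 65W TDP, 6 cores, no iGPU
    "Ryzen 7 3700X":  (65, 8),  # 65W TDP, 8 cores, no iGPU
}

for name, (tdp, cores) in skus.items():
    print(f"{name}: {tdp / cores:.2f} W/core")
```

Note that on the 3200GE part of the 35W budget feeds the iGPU, so the true W/core for the x86 cores is somewhat lower than the naive division suggests.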

In the mobile space that isn't good. AMD's chiplet design is skewed toward the EPYC and high-end desktop space, where the margins are quite good and Intel isn't tuned as well. Zen 1-2 didn't particularly target mobile. I'm sure AMD will do some mobile products, but that is a bit outside the wheelhouse the design was tuned for.

Unless Apple does something quite substantive with the Mini's case, it is closer to the mobile space. AMD will dial back clocks for the mobile solutions, but unless Apple loves the GPU more than the CPU, those will probably be an issue. Even more so as you get down close to the 10W budget range, where Apple is eyeing its own A-series solutions against what AMD (or Intel) has.
 

Pro7913

Cancelled
Sep 28, 2019
345
102
For desktops? Relative to higher-core-count Intel solutions, yes. But otherwise, not really.

Ryzen 3 3200GE: 35W, 4 cores (8.75W/core ... yes, there is a Vega 8 in there too, but it is also just a Vega 8 compared to what Apple is using)

Ryzen 5 3600: 65W, 6 cores (10.8W/core)

Ryzen 7 3700X: 65W, 8 cores (8.1W/core)

In the mobile space that isn't good. AMD's chiplet design is skewed toward the EPYC and high-end desktop space, where the margins are quite good and Intel isn't tuned as well. Zen 1-2 didn't particularly target mobile. I'm sure AMD will do some mobile products, but that is a bit outside the wheelhouse the design was tuned for.

Unless Apple does something quite substantive with the Mini's case, it is closer to the mobile space. AMD will dial back clocks for the mobile solutions, but unless Apple loves the GPU more than the CPU, those will probably be an issue. Even more so as you get down close to the 10W budget range, where Apple is eyeing its own A-series solutions against what AMD (or Intel) has.

Be aware that Intel CPUs have different TDP and actual power draw in real life.

Ryzen 3rd gen has better power consumption than Intel CPUs for sure, because of 7nm technology.

At this point, 3rd-gen Ryzen mobile CPUs are based on 12nm, and yet in terms of performance they can compete with 10th-gen 10nm CPUs at the same TDP. Also, power consumption depends on the computer brand.

And despite what you believe, Apple added AMD APUs to macOS Catalina. Why would they?
 

Korican100

macrumors 65816
Oct 9, 2012
1,213
617
Wow, by that logic the trash-can Mac Pro was a super winner... oh yeah, except it wasn't. It was a total disaster, to the point that Apple had to do an apology tour for what a complete disaster it was...
I got the Mac Pro 2013 when it first hit and it served me well in Final Cut Pro. Almost 6 years later of great use and I’m ready to sell it to get the next iteration. And am getting $2,000.
How is that a complete disaster lol?
You could always run macOS in a kernel-based VM with full hardware acceleration. Sure, that would purr along nicely on a 64-core Threadripper.
As an avid fcpx user, hell no. I need support throughout the lifecycle
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Apple is already leaving Intel. There will probably be a few more Intel based Macs, but I don’t understand the fretting over future Intel architectures. Whatever Apple does, most future Intel architectures probably won’t find their way into Macs anymore.

It could be AMD, it could be ARM. Whatever happens Apple will no longer be linked to Intel’s shortcomings.

Same thing about “Doesn’t Apple know Intel is falling behind?” Uhhhh yeah. That’s why they’re dropping Intel. Duh.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
417
269
They dont draw too much of power with 3000 series.

It is not about power draw at peak use. It is about leaking power at idle. That's the main reason Ryzen laptops don't have great battery life compared to similarly configured Intel counterparts.
 

jinnyman

macrumors 6502a
Sep 2, 2011
762
671
Lincolnshire, IL
I got the Mac Pro 2013 when it first hit and it served me well in Final Cut Pro. Almost 6 years later of great use and I’m ready to sell it to get the next iteration. And am getting $2,000.
How is that a complete disaster lol?

As an avid fcpx user, hell no. I need support throughout the lifecycle
You know, for some, the G4 Cube was also a success. If you liked the 6,1, good for you. But the market didn't like it.
 
  • Like
Reactions: fendersrule

fendersrule

macrumors 6502
Oct 9, 2008
423
324
Currently, no "Zen 2" mobile processors exist, and none will until early next year. Every current Ryzen mobile processor is based on the Zen+ architecture.
 

ApfelKuchen

macrumors 601
Aug 28, 2012
4,335
3,012
Between the coasts
If Apple's roadmap is to move to in-house silicon, then it doesn't matter who the momentary leader in the commercial CPU horse race happens to be. It's all a symptom of being seduced by the spec sheet/benchmark. What really matters is how well a system has been optimized to incorporate its individual components.

I also see a certain irony in a call to embrace AMD CPUs when Mac power users have long been calling for Apple to drop AMD's GPUs. If you think you'll get a Mac with an AMD CPU and an NVIDIA GPU... uh huh, yeah, right.

Vendor/platform changes of this magnitude take years of planning and development work. By the time the switch is made, will AMD performance have plateaued, or will they continue to improve at a faster rate than Intel? Will the short-term effort bring long-term benefit?

Meantime, the users who can actually make use of the additional power are an edge case. Web browsing, email, building spreadsheets and Powerpoint/Keynote presentations... are those users currently suffering with their Intel-equipped Macs? If they have an older Mac and upgraded to the current crop of Intel Macs, they're still likely to be delighted with the improvement.

The fact is, the vast majority of PC and Mac users are sold on the Intel name. It's a brand reputation carefully built and nurtured from the very beginning of the microprocessor revolution. Where was AMD when Intel was producing the 8080, 8088 and the earlier 80x86 processors? Cloning those designs under license from Intel. Yes, the past is the past, and the present is different, but when it comes to marketing, I'd wager less than 10% of any PC-maker's customer base would say, "Hell yes, thanks for switching to AMD!" The rest would say, "AM who?"

Any manufacturer who dropped Intel in favor of any other maker (including Apple switching to an in-house design) has a major marketing effort ahead of it. The question in the boardroom would be, "Will it be worth the risk?" For Apple, simply switching x86 vendors would be too risky - some edge-case users made happy, mainstream users fearful of change and suspicious of Apple's motive. The bigger reward is in doing for Mac what they've already done for iOS - take it all in-house. It's hard to beat the combination of proprietary silicon + proprietary OS, both for task-tuned performance and for reduced cost.
 

throAU

macrumors G3
Feb 13, 2012
9,234
7,396
Perth, Western Australia
I love this logic, that can appear only on Apple forums.

It is more expensive and more work to go from Intel (x86) to AMD (x86) for the CPUs in their desktop and mobile computer lineup than to go from x86 to ARM.

Come on, guys. Stuff like this can appear only on this forum.


More expensive? Sure. However they can build the silicon they want. Like they did with the ipad, iphone, etc. instead of using off the shelf parts.

Apple right now are beholden to intel’s product catalogue. Or AMD if they switch. But then they will be stuck with whatever AMD pump out.

Apple do a lot of stuff that isn’t the cheapest way of doing things, purely in the interest of being agile.

Building your own CPUs enables this. They have the budget for it, they have the motivation.

It might be 5 years or more but they WILL put out a high end ARM based processor of their own sooner or later, and eventually they’ll stuff a heap of them in their pro lineup.


edit:
wait.. what? you reckon it will be expensive for them to switch to AMD?

lol. no. they do a motherboard/chipset revision every time intel puts out a new platform. this is no different. macOS already runs on Ryzen today, if you do a hackintosh.
If Apple's roadmap is to move to in-house silicon, then it doesn't matter who the momentary leader in the commercial CPU horse race happens to be. It's all a symptom of being seduced by the spec sheet/benchmark. What really matters is how well a system has been optimized to incorporate its individual components.

I also see a certain irony in a call to embrace AMD CPUs when Mac power users have long been calling for Apple to drop AMD's GPUs. If you think you'll get a Mac with an AMD CPU and an NVIDIA GPU... uh huh, yeah, right.

Vendor/platform changes of this magnitude take years of planning and development work. By the time the switch is made, will AMD performance have plateaued, or will they continue to improve at a faster rate than Intel? Will the short-term effort bring long-term benefit?

Meantime, the users who can actually make use of the additional power are an edge case. Web browsing, email, building spreadsheets and Powerpoint/Keynote presentations... are those users currently suffering with their Intel-equipped Macs? If they have an older Mac and upgraded to the current crop of Intel Macs, they're still likely to be delighted with the improvement.

The fact is, the vast majority of PC and Mac users are sold on the Intel name. It's a brand reputation carefully built and nurtured from the very beginning of the microprocessor revolution. Where was AMD when Intel was producing the 8080, 8088 and the earlier 80x86 processors? Cloning those designs under license from Intel. Yes, the past is the past, and the present is different, but when it comes to marketing, I'd wager less than 10% of any PC-maker's customer base would say, "Hell yes, thanks for switching to AMD!" The rest would say, "AM who?"

Any manufacturer who dropped Intel in favor of any other maker (including Apple switching to an in-house design) has a major marketing effort ahead of it. The question in the boardroom would be, "Will it be worth the risk?" For Apple, simply switching x86 vendors would be too risky - some edge-case users made happy, mainstream users fearful of change and suspicious of Apple's motive. The bigger reward is in doing for Mac what they've already done for iOS - take it all in-house. It's hard to beat the combination of proprietary silicon + proprietary OS, both for task-tuned performance and for reduced cost.


1. intel currently can not supply the market due to process problems
2. intel has had a massive run of security problems that has resulted in datacenters literally having to double their purchases (unwillingly) to cover performance degradation from the security fixes
3. AMD processors were largely unaffected (performance wise), back to and before bulldozer
4. Intel currently has nothing to compete


They had a good name. Their name right now is trash.
 

Slash-2CPU

macrumors 6502
Dec 14, 2016
404
268
Disagree....More like Pentium 4 vs Athlon (non-xp) again. Athlon killed it. Then XP came. Then 64.

Intel had the Pentium 4, which was a lousy ass chip.

Right. NetBurst was a complete dead end. It was supposed to be a deeply pipelined architecture with deadly accurate branch prediction that would scale to 8 or 10 GHz. It was also introduced with serial RDRAM, which was a few percent faster than parallel SDRAM while being much more expensive, much hotter, and patented by Rambus. RDRAM was such a flop that Intel had to quickly roll out new chipsets that used the older SDRAM. This is when Athlon and Duron started creeping into Intel's market share: a much cheaper platform that could use your old, existing RAM from a P-II or P-III system.

Athlon XP added SSE instructions and was about 10% faster than Athlon clock-for-clock. During this time, Intel did manage to make some very high-clocked P4s that were usually faster than Athlon XPs, but at a steep premium. There were also oddball things you could do with Athlon XP Mobile CPUs, such as overclocking (unlocked multiplier) or running dual-socket setups. For DIY system builders, an Athlon XP-M could be made to run faster than any comparably priced P4.

The real trouble for Intel started when they rolled out the Prescott P4s, which were not much faster than the previous parts while running much, much hotter. At about the same time, AMD moved the memory controller onto the CPU die with the Athlon 64 / Opteron chips. This led to a massive improvement in memory throughput, comparable to the difference between a hard drive and an SSD. The marginal CPU performance advantage, massive memory throughput advantage, and nominally lower power and heat, all at a lower price, was an absolute win for AMD.

AMD's Opteron of this time also scaled with dual-core and up to 8 sockets far beyond anything Xeon had to offer.

After that, the Core architecture landed. AMD's CPU performance advantage evaporated instantly. Core 2 was a faster, cooler CPU, but it still had a frontside bus and no onboard memory controller. AMD still had a slight advantage in some cases (memory performance) and was still a better bang for the buck.

Intel unveiled the Core i-series with integrated memory controller. This is when any advantage AMD had was completely eliminated.

AMD then launched the K10 architecture (Phenom), which never performed or scaled well. It also had a serious bug (the TLB bug) that, when patched, caused a 5-20% performance hit. It was AMD's NetBurst moment, and like NetBurst it dragged on for almost a decade. Intel got comfortable, fat, and lazy with their lead: performance received incremental bumps while prices went up like hell.

Intel announced their 10nm Ice Lake architecture would launch in early 2019 or even late 2018. As of now, they still haven't succeeded in shipping those parts in volume. Intel's last major architecture release was Haswell in 2013, and they've been stuck at 14nm since 2015. That gave AMD plenty of time to develop and improve the Zen architecture.

Almost 20 years later, the whole thing is repeating again with AMD pulling into the lead.
Intel is bringing Sapphire Rapids in 2021 with PCIe 5.0 and DDR5, and Granite Rapids on 7nm in 2022. Since the Mac Pro is basically a 2020 product, those are most likely the next round of updates for the Mac Pro.

If Apple was planning to switch to AMD, then the new Mac Pro would already have been based on Threadripper 3, which is launching around the same time as those Mac Pros.

Yeah, and Intel is releasing 10nm Ice Lake in the first quarter of 2019 too. Just wait. (Sarcasm)
 

throAU

macrumors G3
Feb 13, 2012
9,234
7,396
Perth, Western Australia
If you look at the significant "firsts" over the past decade or two, it is actually AMD who hit them rather than Intel:

* first 1 GHz CPU
* first 64-bit x86-compatible CPU
* first dual-core x86 CPU


Intel had good fabs and a lot of money to bribe people with to not buy AMD. That’s about it.

The current x64 instruction set is an AMD invention.
 
Last edited:
  • Like
Reactions: PC_tech


OkiRun

macrumors 65816
Oct 25, 2019
1,005
585
Japan
If Apple is ******** in their pants, it's from shock at how well received the 16" MacBook Pro has been. I went to the garden of my condominium today, and sitting there were four high school students, each with a brand-new 16" MacBook Pro. I asked them about it, and they all had units that were either fully or nearly maxed out. That's $5k to $6k each! They had "pro"-level machines. They all had FCPX up and running, and each was posting to their YouTube channel. What does it mean to be a pro in the Apple universe? Kids think themselves pros these days. $6k for a Mac Pro is nothing to this new generation of pros. Nearly everyone in this condo complex wears an Apple Watch. I don't think Apple is sweating Intel vs. AMD. Their engineers seem to be some of the best in the world. They cook a dinner, ring the triangle bell, and yell, "Come and get it." Lots of people do. Lots don't. Life.
 
  • Like
Reactions: Zdigital2015

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
More expensive? Sure. However they can build the silicon they want. Like they did with the ipad, iphone, etc. instead of using off the shelf parts.

Apple right now are beholden to intel’s product catalogue. Or AMD if they switch. But then they will be stuck with whatever AMD pump out.

Apple do a lot of stuff that isn’t the cheapest way of doing things, purely in the interest of being agile.

Building your own CPUs enables this. They have the budget for it, they have the motivation.

It might be 5 years or more but they WILL put out a high end ARM based processor of their own sooner or later, and eventually they’ll stuff a heap of them in their pro lineup.


edit:
wait.. what? you reckon it will be expensive for them to switch to AMD?

lol. no. they do a motherboard/chipset revision every time intel puts out a new platform. this is no different. macOS already runs on Ryzen today, if you do a hackintosh.
It's hilarious that you did not read the huge irony in that sentence...

Of course it will be cheaper to stay on x86 than to switch to ARM.

And yes, using AMD's portfolio they can design WHATEVER they want; the only thing they are stuck with is the architecture. But with chiplets, Apple can demand whatever they want.

That is the power of Semi-Custom design ;).

If Apple's roadmap is to move to in-house silicon, then it doesn't matter who the momentary leader in the commercial CPU horse race happens to be. It's all a symptom of being seduced by the spec sheet/benchmark. What really matters is how well a system has been optimized to incorporate its individual components.

I also see a certain irony in a call to embrace AMD CPUs when Mac power users have long been calling for Apple to drop AMDs GPUs. If you think you'll get a Mac with an AMD CPU and an NVIDIA GPU... Uh huh, yeah, right.

Vendor/platform changes of this magnitude take years of planning and development work. By the time the switch is made, will AMD performance have plateaued, or will they continue to improve at a faster rate than Intel? Will the short-term effort bring long-term benefit?
Semi-custom products based on AMD's chiplet architecture, designed and ordered by Apple, would also constitute custom, home-made products ;).

And Van Gogh, the AMD APU already in the Catalina beta, is a semi-custom product.

Make out of it what you wish ;).
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Be aware that Intel CPUs have different TDP and actual power draw in real life.

Intel's TDP is rated at base clock usage, not turbo/max. AMD's has variance too (not rated at max either).

Ryzen 3rd gen has better power consumption than Intel CPUs for sure, because of 7nm technology.

But if AMD "blows" that better x86 core consumption on higher iGPU usage, it will be a wash. In the chart I posted, the 4-core with an iGPU was in the same range as the 8-core (with no iGPU).

Intel's Gen 11 or 12 GPU isn't a slouch. The iGPU space is more competitive than the x86 core space at this point. AMD isn't likely to simply pocket the power reduction there (as opposed to using it for higher clocks and/or more function units).

And despite what you believe, Apple added AMD APUs to macOS Catalina. Why would they?

If it consisted solely of 2019 (or newer) APUs, it would be more credible. That older APUs are being thrown in here, ones Apple is very highly unlikely to use in a future product, should give folks pause as to what is going on.

It is a macOS Catalina beta. The "feature" could be folded right back out again before the official release of that version. Beta doesn't mean features are locked in stone, especially with Apple's flaky release chain (the overzealous integration build process they are supposed to be getting rid of for the next macOS version).

I pointed out in a previous post that it could also be leverage for cheaper prices. Expose the "Plan B" build in a beta, get the price drop, and don't actually use "Plan B" on most (if not all) models if they hit price targets.
 
  • Like
Reactions: thisisnotmyname

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
https://www.anandtech.com/show/1512...phire-rapids-cpus-with-six-ponte-vecchio-gpus

Well, the Aurora supercomputer, which is going to be finished in 2021, will have Sapphire Rapids Xeons, so I am sure Sapphire Rapids Xeons will be available in 2021, since you can't win a $500M contract on vaporware. :)

It is set to be delivered in 2021. For units that big there is usually a "shakedown" phase before it is fully operational for generally approved user workloads. There is a very good chance it will be installed enough to post LINPACK numbers in the November 2021 supercomputer-conference edition of the Top500 list, but not be ready for general usage at that point. That also doesn't mean Sapphire Rapids will be available to mere mortals: like the top web-service folks (Amazon, Google, etc.), top-end supercomputer vendors get early access while the Xeon SP parts are still officially in the certification process before volume release.

Some parts don't exist yet, but yeah, for $500M Intel probably had to do some NDA demos of prototypes that are substantive.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
....
Semi-custom means the product appears ONLY in products offered by the vendor that ordered it. That is why it is called semi-custom.

Your leap is that AMD's semi-custom group will only take orders from a single entity. They don't have to. If 2-3 customers all want X and they divide the costs between them, then they'll get X.

If it was Apple who ordered Van Gogh,

Which is the point: if it was only Apple that... For almost any single Mac product, it makes no sense to order up a substantively custom central CPU-GPU package. The volume really isn't there to easily pay for it.

and if we already have kexts for it in the macOS Catalina beta, that means one thing: Apple has working hardware and needed an OS to test it.

Like Apple hasn't been doing "Plan B" builds of macOS on in-house hackintoshes in previous years. (Like how the x86 build was maintained on a small set of PCs alongside PPC Macs before that transition.) I'm sure AMD has been invited to design bake-offs before (and lost even when they were 100% on top of their game, because they didn't have everything Apple wanted).



Denying that is at the very least ridiculous. Ordering the design of this APU and then not putting it in the computers would mean that roughly $200 million in physical design and manufacturing costs was flushed down the toilet by Apple. Good luck proving that's the natural way Apple works.

Apple doesn't need to order anything to test on an AMD system; they can get reference boards from AMD for that. Van Gogh is a corner case in the context of the older stuff that also bubbled out here, which is highly unlikely to make it into an official future Mac product. What this looks like is Apple taking their "Plan B" branch and folding it into a Catalina beta (old experiments and all). Perhaps they plan to clean that up over the next several Catalina releases, but burping old experiments out into a general release build is sloppy.

The hand-wavy foundation you are proposing for Van Gogh (an APU with a dGPU slapped into the package, or just a bigger configuration of the iGPU in the same Vega family)... all of that will run on Windows just fine. The variants AMD does for the PlayStation/Xbox are far more highly tuned to those operating systems. Apple simply asking for more grunt on the GPU is something other AMD customers have probably also asked for. AMD could easily put those folks into a consensus group, and they'd just buy. There would be no enormous (many tens of millions of dollars) sunk cost that Apple was saddled with.
 