Chuckle... sources on the monolithic die? Given that AMD doesn't have a 7nm design for I/O and memory for the other chips, where are they cooking it up for the APU version? If they are using the chiplet, how can it be monolithic? They could do a chiplet that had both x86 cores and GPU cores on it. However, if it is hooked to a 14nm I/O chip to get any I/O, then the package won't be monolithic. A different, smaller I/O chip perhaps (so cheaper and lower power)?
AMD Renoir is in the GFX909 family; it is Vega based. Because of that, an APU using Renoir is a monolithic die.


In another post you asked about what David Kanter said in 2011. No, Kanter was talking about it with Apple engineers between 2015 and 2016. And at the time they were unable to put more than 40 amps into those chips. The design is efficient at low power and low amperages.
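For rough context on what "40 amps" means at the package level, here is a back-of-the-envelope sketch only; the ~1.0 V core voltage is an assumed ballpark for illustration, not a figure from Kanter or Apple:

```python
# Rough relation between current delivery and package power: P = V * I.
# The 1.0 V core voltage is an assumed ballpark, not a published spec.
core_voltage = 1.0      # volts (assumption)
current_limit = 40.0    # amps, the figure quoted above

package_power = core_voltage * current_limit
print(f"~{package_power:.0f} W at the package")          # ~40 W

# By comparison, a 180 W desktop/server part at the same voltage
# would need on the order of 180 A delivered to the die.
print(f"~{180 / core_voltage:.0f} A for a 180 W part")
```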
 
No, Kanter was talking about it with Apple engineers between 2015 and 2016. And at the time they were unable to put more than 40 amps into those chips. The design is efficient at low power and low amperages.

Are you talking about amps here to make yourself sound knowledgeable or something? I ask that politely, because you don't expound more. You simply name-drop "amps" and then move on.
 
Are you talking about amps here to make yourself sound knowledgeable or something? I ask that politely, because you don't expound more. You simply name-drop "amps" and then move on.
I dumb down circuit design so that everybody can understand it. I leave the matter of Apple CPUs aside, because I simply... do not know how Apple is willing to bend on the 3 factors: power, performance, area, and how much they are willing to pay for silicon designs. And 7 nm process designs are not cheap.
 
First, there is a difference between architecture and implementation. The reference standard implementation of the architecture that anyone with some money can get from ARM (the company) is skewed toward low power. However, folks can also buy an architecture license and do their own "bottom up" implementations. Apple has one of these. That doesn't mean they are trying to avoid the general direction ARM is going in.

However, folks can also go a very separate way. (Which actually hasn't worked out so well for the vast majority of them. It has been a slow slog, for example getting a real, supported Linux server distro up and running all the way through the support validation/qualification stages.)

https://www.nextplatform.com/2018/05/16/getting-logical-about-cavium-thunderx2-versus-intel-skylake/
[ This article also mentions Ampere ]


Some peeks at power (note: this is system consumption).....
"... Our Gigabyte/ Cavium ThunderX2 Sabre development platform hit a peak of 823W at 100% load. We think that there are likely optimizations that can occur at the system’s firmware level, and by using GA power binned chips. At first, we thought that these numbers were way out of line so we discussed them with Cavium and that is when we were told that the ~800W range was correct for our system and pre-production chips. ..."
https://www.servethehome.com/cavium-thunderx2-review-benchmarks-real-arm-server-option/8/

If you go to the first page of that article you'll see that the top-end ThunderX2 has a TDP of 180W. For 64 cores you'd need two... so 360W just on CPU. That isn't low amps. It is also way off the ARM reference implementation (e.g., 4-way symmetric multi-threading (SMT); Apple has zero SMT).

Anandtech ran some benchmarks on a ThunderX2 system also. (It is decent for some stuff. As a single-user workstation..... in most cases, probably not. That isn't what it is built for.)
https://www.anandtech.com/show/12694/assessing-cavium-thunderx2-arm-server-reality/


Ampere's eMag system was benched over at https://www.phoronix.com/scan.php?page=article&item=ampere-emag-osprey&num=1
( TDP of that CPU is in the 120W range. )





Are you talking about stuff from this era??? (2011)
" ... “ARM microprocessors are designed for lower performance and unlikely to match x86 performance in the next few years,” Kanter said. ... "
https://www.macworld.com/article/1159856/macbook_arm.html
Which was and is true. At this point, we are more than several years past 2011. However, it is EXTREMELY illustrative of just how long this 'echo chamber' has been clamoring that Macs are going ARM just around the corner.




Not really. Apple is more than several years away from that point. But they aren't the only ARM implementors out there. The several implementations have mainly been designed for different workloads. They could be shifted, but it is highly debatable whether there is a reasonably sized market there to justify the expense. Just the server stuff has been problematic in terms of volume and reach with the available OS base options.



On which benchmarks? The notion that ARM servers were going to wipe the x86 servers from every task in the data center and in HPC environments in a couple of years... yeah, that's mostly hooey. For high-user-count, high-latency (due to concentrated workload) tasks they do work. But that isn't a single-user workstation context for the vast majority of workstation users.

Get a room, nerds! ;)
What you are discussing are Windows machines.

The next MP will be about getting accepted back into the market again, after a long hiatus, or it will be Apple's swan song on professional computing.

CPU architecture and choice are of no consequence, unless it messes with existing programs or peripherals.

You like to make fun of people 'whining' about getting no PCIe slots and SATA bays (2.5" is fine by me)?
Well, you are asking for the next tcMP then, hence I think you completely miss the point.
 
I dumb down circuit design so that everybody can understand it. I leave the matter of Apple CPUs aside, because I simply... do not know how Apple is willing to bend on the 3 factors: power, performance, area, and how much they are willing to pay for silicon designs. And 7 nm process designs are not cheap.

Uhhh... you don't need to dumb it down. This is a Mac forum and a tech forum.... you can be more techie about it than simply saying a number like "40" and then "amperage" after it. You can, like, flex your nerd muscle more. We can all learn instead of wondering about the vagueness.

And we get that it's not cheap. Can you be more specific? And don't copy-paste an article (we can all do that ourselves).
 
I just want a Mac Pro that can easily handle 4K or 8K video or whatever formats show up over the next five years or so. Which means it should be readily expanded, upgraded, and updated by the owner.
I absolutely do not care how big or small it is, or what it looks like.
The only Apple product whose size matters to me is my iPhone. FFS, make 'em smaller, not bigger!
And bring back the iPod shuffle while we're at it.
Please continue with your informative discussions about the tech details of the next Mac Pro. That's not snark, I really mean it. I may not understand the fine points but I enjoy learning things here.
 
No problem, just deduct the cost of the external box from the price of an MP without drive slots.
For a fast and quiet 4-bay enclosure, including cables, about 800 bucks or so should do.

You mean, just like they did with the trashcan? With it, I would have spent $2,500 on missing functionality (3 enclosures - 1 for the missing bays, 1 for the external 4-bay enclosure that didn't have a TB connector, 1 for the Blu-ray player - and a dock, so I could connect scanners, thumb drives, iPads, iPhones, etc.).

Today, Apple is about fashion and how much they can shear the sheeple. I suspect they will keep the prices the same.
 
Uhhh... you don't need to dumb it down. This is a Mac forum and a tech forum.... you can be more techie about it than simply saying a number like "40" and then "amperage" after it. You can, like, flex your nerd muscle more. We can all learn instead of wondering about the vagueness.

And we get that it's not cheap. Can you be more specific? And don't copy-paste an article (we can all do that ourselves).
In simple terms: $250,000,000 per design. That is just the design, implementation, and validation cost, without the manufacturing cost of each chip that lands in the computer. The 7 nm process, for now, is ridiculously expensive.

The iMac and Mac Mini share chips; the iMac Pro and, presumably, the Mac Pro would also share them. That alone makes it $500,000,000 spent on chip design costs. Then we have the MacBook, MacBook Air, and MacBook Pros: 3 different designs, which total $750,000,000. Of course, there is IP shared between the designs, which lowers the cost, BUT it does not halve it.

Now scale those numbers per year, per hardware refresh, and you get to stupid design costs spent each year just to have silicon designed for their computers. Yes, the design costs go down as time goes by (the Chinese company Subor paid around $30,000,000 to AMD for their custom-designed APU for the Chinese-market console, but we are talking about a process for which, at the beginning, design costs were at $75,000,000).

It's way easier to team up with a company that already does hardware at massive scale and ride that boat than to design those chips by yourself, if you want multiple designs. Apple has just two designs on 7 nm.

Those manufacturing costs are the main reason why AMD went with chiplet designs. The second is scalability. You make one 7 nm design for the CPU chiplet for ALL of your desktop CPUs. And you can pay pennies for the 14 nm I/O dies that this design requires.
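To make the design-cost scaling above explicit, here is a quick sketch of the arithmetic. The $250M-per-design figure and the shared-IP discount are the assumptions from this post, not published numbers, and the lineup split is hypothetical:

```python
# Back-of-the-envelope design-cost scaling using the assumptions above:
# ~$250M per 7 nm design, with a hypothetical discount for shared IP.
COST_PER_DESIGN = 250_000_000  # USD, assumed 7 nm design/validation cost

# Hypothetical Mac lineup split into distinct silicon designs:
designs = {
    "iMac / Mac mini": 1,
    "iMac Pro / Mac Pro": 1,
    "MacBook": 1,
    "MacBook Air": 1,
    "MacBook Pro": 1,
}

gross = COST_PER_DESIGN * sum(designs.values())
shared_ip_discount = 0.25  # assumed: shared IP trims cost, but doesn't halve it
net = gross * (1 - shared_ip_discount)

print(f"Gross: ${gross / 1e9:.2f}B, with shared-IP discount: ${net / 1e9:.2f}B per refresh")
```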
 
In simple terms: $250,000,000 per design. That is just the design, implementation, and validation cost, without the manufacturing cost of each chip that lands in the computer. The 7 nm process, for now, is ridiculously expensive.

The iMac and Mac Mini share chips; the iMac Pro and, presumably, the Mac Pro would also share them. That alone makes it $500,000,000 spent on chip design costs. Then we have the MacBook, MacBook Air, and MacBook Pros: 3 different designs, which total $750,000,000. Of course, there is IP shared between the designs, which lowers the cost, BUT it does not halve it.

Now scale those numbers per year, per hardware refresh, and you get to stupid design costs spent each year just to have silicon designed for their computers. Yes, the design costs go down as time goes by (the Chinese company Subor paid around $30,000,000 to AMD for their custom-designed APU for the Chinese-market console, but we are talking about a process for which, at the beginning, design costs were at $75,000,000).

It's way easier to team up with a company that already does hardware at massive scale and ride that boat than to design those chips by yourself, if you want multiple designs. Apple has just two designs on 7 nm.

Those manufacturing costs are the main reason why AMD went with chiplet designs. The second is scalability. You make one 7 nm design for the CPU chiplet for ALL of your desktop CPUs. And you can pay pennies for the 14 nm I/O dies that this design requires.

I don't think Apple is paying the design cost of the CPUs. That is on Intel. This is also why Apple is a year off whenever a new Intel CPU is released. The time gives Intel the ability to produce more, fix bugs in the production line and, perhaps, lower the price on the chips that will be put in Macs. I am just using common sense and dot-connecting to come up with that (i.e., I am not an insider or some "expert").

If you got those numbers from an article on the web, then you already didn't do what I asked you not to do, which is copy and paste stuff. Seems like you just copied and pasted numbers to come up with a favorable conclusion to your point, which seems to be an "AMD CPU in Macs" thing that you can't let go of.

PS--this is also why Intel has the tick-tock thing. And Xeons are also a year off from their mainstream counterparts, to continue that common-sense/connect-the-dots logic on the subject of CPU production.

PPS--Your post above also confirms, in my mind, your prevalent tendency to copy/paste numbers (40 amps, and now the $$$$ thing) in order to be specific and at the same time very vague in making your point, while using your copy/pasting vagueness to obscure how much knowledge you actually possess, while at the same time trying to convince readers that an AMD CPU in Macs must somehow make sense.
 
I don't think Apple is paying the design cost of the CPUs. That is on Intel. This is also why Apple is a year off whenever a new Intel CPU is released. The time gives Intel the ability to produce more, fix bugs in the production line and, perhaps, lower the price on the chips that will be put in Macs. I am just using common sense and dot-connecting to come up with that (i.e., I am not an insider or some "expert").

If you got those numbers from an article on the web, then you already didn't do what I asked you not to do, which is copy and paste stuff. Seems like you just copied and pasted numbers to come up with a favorable conclusion to your point, which seems to be an "AMD CPU in Macs" thing that you can't let go of.

PS--this is also why Intel has the tick-tock thing. And Xeons are also a year off from their mainstream counterparts, to continue that common-sense/connect-the-dots logic on the subject of CPU production.

PPS--Your post above also confirms, in my mind, your prevalent tendency to copy/paste numbers (40 amps, and now the $$$$ thing) in order to be specific and at the same time very vague in making your point, while using your copy/pasting vagueness to obscure how much knowledge you actually possess, while at the same time trying to convince readers that an AMD CPU in Macs must somehow make sense.

It is VERY obvious that you are not an "expert". Apple is a year off because nowadays they purchase CPUs at the end of their life-cycle, not the beginning. It isn't 2006 anymore.

It wouldn't surprise me in the least if the reason for such long gaps between computers is that they only build one run of them and don't start working on the next one until they are nearly out of supply.

Intel doesn't have a "tick-tock" thing anymore - hasn't had it for a while. You might want to read up on the whole Intel's disaster at 10 nanometers. Intel is more like "Tick-tock-tock-tock-tock" in 2019.


Try a TCO analysis and get back with us on how to quantify that Intel Xeons are better than AMD Threadripper & Epyc CPUs. So far, all you have is F.U.D.
 
Intel doesn't have a "tick-tock" thing anymore - hasn't had it for a while. You might want to read up on the whole Intel's disaster at 10 nanometers. Intel is more like "Tick-tock-tock-tock-tock" in 2019.

I just used the tick-tock thing as an example. We can read tech news, bro.
Try a TCO analysis and get back with us on how to quantify that Intel Xeons are better than AMD Threadripper & Epyc CPUs. So far, all you have is F.U.D.

I am not going to do that because IDGF (I Don't Give A YouKnowWhat)! Nor does Apple (I have a bet on this, so if Apple puts AMD CPUs in Macs, then I give $5 to charity and post the receipt here)!

PS--What? All I have is Fear, Uncertainty and Doubt?
 
I just used the tick-tock thing as an example. We can read tech news, bro.

I am not going to do that because IDGF (I Don't Give A YouKnowWhat)! Nor does Apple (I have a bet on this, so if Apple puts AMD CPUs in Macs, then I give $5 to charity and post the receipt here)!

PS--What? All I have is Fear, Uncertainty and Doubt?

You certainly don't have numbers or facts, because of IDGF. You have an opinion, blissfully untethered to any facts.

Every business does a TCO analysis on every decision outside of trivial ones. Picking CPUs isn't a trivial decision.
 
You certainly don't have numbers or facts, because of IDGF. You have an opinion, blissfully untethered to any facts.

No sheet, Sherlock! It goes without saying that they're mere opinions!

PS--The reason why (as if I have to mansplain this to you), Mrs. NoDoubtfire, that IDGF about your TCO assessment is that I am not Apple. Nor am I personally shopping for Xeon or Threadripper PCs. But more the former reason, because that is what the subject is about (AMD CPUs in Macs).
Every business does a TCO analysis on every decision outside of trivial ones. Picking CPUs isn't a trivial decision.

Is that so, Mrs. NoDoubtfire?
 
^^^^Don't bring me into this mess! And, for the record, I never had any disagreements with MVC. I am and have always been a satisfied customer. Don't know what the heck you're talking about.

Lou
 
Gosh, Intel is screwed these days. AMD is preparing 3rd-gen CPUs with a maximum of 16 cores and a 5.1 GHz base clock on the top CPU model. Threadripper will get up to 64 cores under $2000! Isn't this what we want?
 
Gosh, Intel is screwed these days. AMD is preparing 3rd-gen CPUs with a maximum of 16 cores and a 5.1 GHz base clock on the top CPU model. Threadripper will get up to 64 cores under $2000! Isn't this what we want?

I doubt Threadripper will go beyond 32 cores for a while. Running 64 cores on measly quad-channel memory would have too many scenarios where memory throughput is the limiting factor. A higher-clocked 32-core part will provide much better balance.
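A rough per-core bandwidth sketch of why that matters; the quad-channel DDR4-2933 figure is an assumption for illustration, not a confirmed spec of any future part:

```python
# Quad-channel DDR4 peak bandwidth vs. core count, back-of-the-envelope.
channels = 4
transfer_rate_mt_s = 2933   # DDR4-2933, assumed for illustration
bytes_per_transfer = 8      # 64-bit wide channel

peak_gb_s = channels * transfer_rate_mt_s * bytes_per_transfer / 1000
for cores in (16, 32, 64):
    print(f"{cores} cores: ~{peak_gb_s / cores:.1f} GB/s per core (peak)")
# 64 cores gets roughly half the per-core bandwidth of 32 cores on the
# same quad-channel setup, which is the balance concern raised above.
```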
 
The big problem Apple faces leaving Intel is the MacBook Pro, especially the 15" MacBook Pro. AMD doesn't have a workstation grade laptop chip that is either ~45 watts without graphics or ~75 watts with really capable graphics, and 6 (perhaps soon to be 8) cores. They have adequate chips in the 13" MacBook Pro range, but much less choice than Intel offers.

The MacBook Pros are big sellers, and they'd require a new core to move to ARM - especially the 15" is too powerful for "just a bunch of Tempest cores"

The MacBook/MacBook Air world is much easier to move to ARM, and they could bring the Mini and maybe the 21" iMac along. They'd need 6 and 8 Tempest core designs at the higher end (1.5 iPad Pros and double iPad Pros), but no new cores. 8 Tempest cores would be OK for the MacBook Pro 13", but it would be hard from a marketing perspective to split the MBP line - 13" ARM and 15" x64.

The 27" iMac and above all have great AMD choices - but nobody but Intel makes the right chip for that darned 15" MBP, which is a big seller.
 
I am not going to do that because IDGF (I Don't Give A YouKnowWhat)! Nor does Apple (I have a bet on this, so if Apple puts AMD CPUs in Macs, then I give $5 to charity and post the receipt here)!

PS--What? All I have is Fear, Uncertainty and Doubt?
Let me put it to you this way.

Every CEO of any company who picks Intel over AMD in the upcoming 2 years, whether in the desktop or the server market, will be extremely stupid and deserves to be sacked for wasting the company's money on outdated hardware.

Every CEO DOES GIVE YOU KNOW WHAT about total cost of ownership. Last round, AMD had value as their advantage over Intel. This round, they will have ALL OF IT: performance, efficiency, and value, to their advantage over Intel.

Being Apple, you want to be on the bleeding edge of technology, otherwise you rule yourself out of ANY equation. Thousands of professionals will have the choice: pay $5,000 for an iMac Pro, or pay half as much for a computer that is two times faster, using AMD mainstream CPUs. What will they pick? How much faster a computer can you buy using AMD hardware for the same money you pay for an iMac Pro?
Gosh, Intel is screwed these days. AMD is preparing 3rd-gen CPUs with a maximum of 16 cores and a 5.1 GHz base clock on the top CPU model. Threadripper will get up to 64 cores under $2000! Isn't this what we want?
Where did you get the info on a 16-core CPU getting to a 5.1 GHz base clock...?

At 125W TDP on 7 nm, the maximum possible two-core Turbo clock is 5.1 GHz, without binning. All-core Turbo states will be lower. The 65W 8C/16T CPU is supposed to Turbo to 4.0 GHz on all cores within 65 W.

The big problem Apple faces leaving Intel is the MacBook Pro, especially the 15" MacBook Pro. AMD doesn't have a workstation grade laptop chip that is either ~45 watts without graphics or ~75 watts with really capable graphics, and 6 (perhaps soon to be 8) cores. They have adequate chips in the 13" MacBook Pro range, but much less choice than Intel offers.
Have you heard anything about AMD Renoir? It is a 7 nm monolithic APU. It has 8 cores and most likely 16-20 CUs.
 
I don't think Apple is paying the design cost of the CPUs. That is on Intel. This is also why Apple is a year off whenever a new Intel CPU is released. The time gives Intel the ability to produce more, fix bugs in the production line and, perhaps, lower the price on the chips that will be put in Macs. I am just using common sense and dot-connecting to come up with that (i.e., I am not an insider or some "expert").

If you got those numbers from an article on the web, then you already didn't do what I asked you not to do, which is copy and paste stuff. Seems like you just copied and pasted numbers to come up with a favorable conclusion to your point, which seems to be an "AMD CPU in Macs" thing that you can't let go of.

PS--this is also why Intel has the tick-tock thing. And Xeons are also a year off from their mainstream counterparts, to continue that common-sense/connect-the-dots logic on the subject of CPU production.

PPS--Your post above also confirms, in my mind, your prevalent tendency to copy/paste numbers (40 amps, and now the $$$$ thing) in order to be specific and at the same time very vague in making your point, while using your copy/pasting vagueness to obscure how much knowledge you actually possess, while at the same time trying to convince readers that an AMD CPU in Macs must somehow make sense.
Apple designs chips themselves. We "know" that Intel will be booted out of Apple hardware in the upcoming years. Apple is already on the 7 nm process with its A-series chips. Each design costs $250 million on this process. That is why I posted that if Apple wants to design chips on their own for their computers, they will need A LOT OF MONEY to put into designs. It's way easier to team up with someone who already has 7 nm designs and manufactures chips at massive scale.

Tick-tock is dead. 14 nm was late, and 10 nm is already 2.5 years late and appears to be dead in the water (Intel back-ported 10 nm to something like a 12 nm process, because of its inability to make the "standard" 10 nm process work, due to rubbish yields).

You fail to understand the basic concept that is being discussed, and yet you want to find any proof that using AMD hardware is a bad idea, or that the person talking about it is not credible enough.

I will turn the tables. In a world where AMD has better hardware than Intel in every single way, try to prove that it is a good thing to stay with Intel for the upcoming years. I will gladly laugh my ass off reading your post about it.
 
Have you heard anything about AMD Renoir? It is a 7 nm monolithic APU. It has 8 cores and most likely 16-20 CUs.

Next APU should be Picasso, which is either a 12nm version of the current 14nm Raven Ridge APUs or a new 12nm design...

I kinda hope it is a new design, because maybe that means more CUs...

Renoir should be after that, so sometime in 2020...?

What I really want to see is a high-end Enthusiast APU, something to bring an 8C/16T CPU & a solid mid-range 40 CU GPU together on one package, maybe even with 8GB of HBM2...
 
Next APU should be Picasso, which is either a 12nm version of the current 14nm Raven Ridge APUs or a new 12nm design...

I kinda hope it is a new design, because maybe that means more CUs...

Renoir should be after that, so sometime in 2020...?

What I really want to see is a high-end Enthusiast APU, something to bring an 8C/16T CPU & a solid mid-range 40 CU GPU together on one package, maybe even with 8GB of HBM2...
Picasso is the 12 nm process version of the Raven Ridge APUs. Nothing more.

For the upcoming few years - don't count on a ridiculous number of CUs.

Renoir may have 8C/16T + a 20 CU Vega GPU. Dali may have 4C/8T + a 10 CU Navi GPU.
 