There are a lot of valid reasons why they would stay with Intel, but my gut feeling is that they will go AMD because:

- If they were going Intel, all the parts are already available; there was a mid-2015 and a late-2015 iMac, so it's not as if they can't refresh quickly when they want to.
- Kaby Lake isn't going to bring much of a gain over the current top-end BTO spec.
- There are practically zero rumors or leaks about an iMac replacement, which, if due to a move to AMD, goes hand in hand with the NDA code of silence on Ryzen that lifts on March 2.
- If the Ryzen parts live up to the hype, with performance equal to or better than Intel's at half the price, they're going to be a heck of a lot cheaper and pretty difficult for Apple to ignore.

Not necessarily true. I don't believe the Kaby Lake CPUs for the iMac are out just yet (or they only recently came out), and Apple is likely waiting for a new GPU update from AMD.

I *really* doubt we're going to see AMD CPUs anytime soon, especially not a first-generation chipset. A lot of people cite price as the reason, but AMD's chips have low margins and they probably can't give Apple the same bulk discount that Intel does.
 
There are a lot of valid reasons why they would stay with Intel, but my gut feeling is that they will go AMD because:

- There are no Kaby Lakes without a GPU.** The entry-level GT2 GPU takes as much die space as four CPU cores. The GPU in the 6700K is simply wasted in a 27" iMac. Thus AMD can offer Apple an 8-core chip at a comparable price, or a 4-core for less. Hence, the time is ripe for a switch to AMD.

- There are practically zero rumors or leaks about an iMac replacement, which, if due to a move to AMD, goes hand in hand with the NDA code of silence on Ryzen that lifts on March 2.

It is interesting that macOS betas show signs of Kaby Lake laptops, but not desktops. If desktops are moving to AMD, it makes sense that any new kexts and configuration data have yet to be moved to the main macOS fork.

Not necessarily true. I don't believe the Kaby Lake CPUs for the iMac are out just yet (or they only recently came out)

Released in January. Of course there will never be a simple Kaby Lake upgrade suitable for the 21.5" retina.


** While there are many SKUs, there appear to be only four Kaby Lake dies:

1) Dual core with GT2 graphics binned into 4.5W and 15W versions.
2) Dual core with GT3e graphics binned into 15W and 28W versions.
3) Quad core (with GT2) mobile.
4) Quad core (with GT2) desktop.

There are Kaby Lake Xeons, but those are just quad core dies with extra features enabled.

There will also be Kaby Lake-X processors, but those will be quad-core desktop dies with the GPU disabled, leaving more TDP for the CPU cores.
 
Not necessarily true. I don't believe the Kaby Lake CPUs for the iMac are out just yet (or they only recently came out), and Apple is likely waiting for a new GPU update from AMD.

I *really* doubt we're going to see AMD CPUs anytime soon, especially not a first-generation chipset. A lot of people cite price as the reason, but AMD's chips have low margins and they probably can't give Apple the same bulk discount that Intel does.
AMD has a much higher margin on Ryzen chips than Intel has on Broadwell-E. At least that holds against Intel's 6-core CPUs; the 8- and 10-core parts carry very high margins.

AMD can afford to sell the 1800X even at a $200 price point. Could Intel afford to sell 8-core Broadwell-E or Skylake-X chips at that price?

It is interesting that macOS betas show signs of Kaby Lake laptops, but not desktops. If desktops are moving to AMD, it makes sense that any new kexts and configuration data have yet to be moved to the main macOS fork.
One of the previous versions of macOS already had traces of Raven Ridge. What is Raven Ridge? It's the codename for AMD's mobile and desktop APUs.
 
AMD has a much higher margin on Ryzen chips than Intel has on Broadwell-E. At least that holds against Intel's 6-core CPUs; the 8- and 10-core parts carry very high margins.

AMD can afford to sell the 1800X even at a $200 price point. Could Intel afford to sell 8-core Broadwell-E or Skylake-X chips at that price?

One of the previous versions of macOS already had traces of Raven Ridge. What is Raven Ridge? It's the codename for AMD's mobile and desktop APUs.

If one thing should serve as a reminder, it's that Apple doesn't source components solely on price. Again, there are inherent risks in moving to AMD chips that just aren't there with Intel's. If it were a tried-and-true second-generation chip, or a truly revolutionary chip that leaped ahead in performance, it'd be worth the risk, but Ryzen is neither of those: it's a small bit better than Kaby Lake with a big price cut.

The other thing is that, to date, no AMD chipset has ever supported Thunderbolt, and the likelihood of Apple shipping an iMac without Thunderbolt is very close to 0%. AMD would have to license the tech from Intel to include it, and that would probably drive costs up quite a lot.
 
If one thing should serve as a reminder, it's that Apple doesn't source components solely on price. Again, there are inherent risks in moving to AMD chips that just aren't there with Intel's. If it were a tried-and-true second-generation chip, or a truly revolutionary chip that leaped ahead in performance, it'd be worth the risk, but Ryzen is neither of those: it's a small bit better than Kaby Lake with a big price cut.

The other thing is that, to date, no AMD chipset has ever supported Thunderbolt, and the likelihood of Apple shipping an iMac without Thunderbolt is very close to 0%. AMD would have to license the tech from Intel to include it, and that would probably drive costs up quite a lot.
There is nothing in the world preventing AMD from using a Thunderbolt controller. The Alpine Ridge controller costs between $6 and $8, but adds $20-25 to the end price of each motherboard.
 
- There are no Kaby Lakes without a GPU.** The entry-level GT2 GPU takes as much die space as four CPU cores. The GPU in the 6700K is simply wasted in a 27" iMac. Thus AMD can offer Apple an 8-core chip at a comparable price, or a 4-core for less. Hence, the time is ripe for a switch to AMD...

The GPU in the i7-6700K and i7-7700K is definitely not wasted, as that's how Intel implements Quick Sync. In a world where even cell phones can shoot 4K H.264 video, Quick Sync is a huge performance advantage for software that uses it. Unfortunately, AMD doesn't have this.

Xeons above 4 cores also don't have Quick Sync, but in a 12-core-or-more workstation the additional CPU cores partially compensate. It will be interesting to see how Apple handles this area when the Mac Pro is updated.
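Since the benefit depends on software actually asking for the hardware path, here's a minimal sketch of how an app might pick an ffmpeg H.264 encoder per vendor. The encoder names (h264_qsv, h264_nvenc, h264_amf, h264_videotoolbox) are real ffmpeg encoders, but whether each is available depends on the ffmpeg build and the host hardware; the helper itself is hypothetical:

```python
# Hypothetical helper: map a hardware vendor to ffmpeg's H.264 encoder.
#   h264_qsv          -> Intel Quick Sync
#   h264_nvenc        -> Nvidia NVENC
#   h264_amf          -> AMD VCE (via AMF)
#   h264_videotoolbox -> Apple's VideoToolbox framework on macOS
HW_ENCODERS = {
    "intel": "h264_qsv",
    "nvidia": "h264_nvenc",
    "amd": "h264_amf",
    "apple": "h264_videotoolbox",
}

def encode_cmd(src, dst, vendor=None):
    """Build an ffmpeg argument list, falling back to the software
    x264 encoder when no hardware vendor is given or recognized."""
    codec = HW_ENCODERS.get(vendor, "libx264")
    return ["ffmpeg", "-i", src, "-c:v", codec, dst]

print(encode_cmd("clip.mov", "out.mp4", vendor="intel"))
```

On a Quick Sync machine the first variant offloads the encode to the iGPU's fixed-function block, which is exactly the advantage being described above; without it you fall back to burning CPU cores in x264.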
 
For anyone asking about the performance and efficiency of Ryzen CPUs:
[attached benchmark charts: GTA-V.png, h5D9BZW.png]

Compare against the 6950X, which costs four times more than the CPU in question.
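To make the "costs four times more" point concrete, a back-of-the-envelope performance-per-dollar calculation looks like this. The scores and prices below are placeholders for illustration only, not numbers taken from the charts:

```python
# Illustrative numbers only: multi-core benchmark score vs. launch price.
chips = {
    "Ryzen 7 1800X": {"score": 1600, "price": 499},
    "Core i7-6900K": {"score": 1650, "price": 1089},
    "Core i7-6950X": {"score": 1900, "price": 1723},
}

def perf_per_dollar(chip):
    """Benchmark points bought per dollar; higher means better value."""
    return chip["score"] / chip["price"]

# Rank the chips from best to worst value.
for name, chip in sorted(chips.items(), key=lambda kv: -perf_per_dollar(kv[1])):
    print(f"{name}: {perf_per_dollar(chip):.2f} points/$")
```

Even if the Intel parts score somewhat higher in absolute terms, it's this value gap that makes the comparison lopsided.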
 
The GPU in the i7-6700K and i7-7700K is definitely not wasted, as that's how Intel implements Quick Sync. In a world where even cell phones can shoot 4K H.264 video, Quick Sync is a huge performance advantage for software that uses it. Unfortunately, AMD doesn't have this.

Xeons above 4 cores also don't have Quick Sync, but in a 12-core-or-more workstation the additional CPU cores partially compensate. It will be interesting to see how Apple handles this area when the Mac Pro is updated.
All modern dGPUs have a video-encode accelerator: NVENC in Nvidia GPUs and VCE in AMD GPUs. Polaris is said to support 60 fps encode of 4K HEVC.
 
There is nothing in the world preventing AMD from using a Thunderbolt controller. The Alpine Ridge controller costs between $6 and $8, but adds $20-25 to the end price of each motherboard.

It'd be interesting to see the performance when it's run through a separate controller instead of being embedded in the CPU. I've never seen Thunderbolt on an AMD chipset, though.
For anyone asking about the performance and efficiency of Ryzen CPUs:
[attached benchmark charts: GTA-V.png, h5D9BZW.png]

Compare against the 6950X, which costs four times more than the CPU in question.

These screenshots miss valid data points that concern Apple a LOT more than cost:

• How hot the CPUs run at these peak rates
• Benchmarks for processing workloads rather than gaming workloads (video editing, code compiling, etc.)

Again, this AMD chip is going to floor Intel on cost, but it's barely better in performance per watt; as someone mentioned earlier, we're far more likely to see Apple use this as a bargaining chip to get even cheaper Intel chips.
 
These screenshots miss valid data points that concern Apple a LOT more than cost:

• How hot the CPUs run at these peak rates
• Benchmarks for processing workloads rather than gaming workloads (video editing, code compiling, etc.)

Again, this AMD chip is going to floor Intel on cost, but it's barely better in performance per watt; as someone mentioned earlier, we're far more likely to see Apple use this as a bargaining chip to get even cheaper Intel chips.
[attached benchmark charts: 29-Media-Espresso-Video-Convertor.png, 28-POV-RAY-3.7.png]


Things to remember: an Iranian site ran these reviews on engineering samples of both the CPU and the motherboard, at stock settings: 3.4 GHz, with 2133 MHz CL17 memory.

It's not a fully accurate reflection of real-world performance, but it's close enough.
 
It'd be interesting to see the performance when it's run through a separate controller instead of being embedded in the CPU. I've never seen Thunderbolt on an AMD chipset, though.

The performance presumably would be similar to the 2016 MBPs, which use the Alpine Ridge controller. AFAIK, no current Intel CPU or PCH has integrated Thunderbolt.
 
We're just hours away from the lifting of the Ryzen review embargo, but right now all signs point to Ryzen absolutely destroying Intel's current offerings on both the price/performance curve and absolute multi-core performance, while coming within striking distance of (and, with mild overclocking, exceeding) the single-core performance of the i7-7700K.

I'll be honest: if the next desktop iMacs don't include at least the option of a 6-core (or maybe even 8-core!) CPU, the chance of Apple getting my money is very slim indeed.

Honestly, Apple has let its desktop lineup stagnate for way too long to just say "OK, here's Kaby Lake (iMac), Broadwell or Skylake-E (Mac Pro), and Polaris; see how much we care about pros!"

As for Thunderbolt, given Apple's hand in its creation and its role as the sole initial promoter of TB1 and TB2 (in spite of its slow adoption of TB3), Apple should be embarrassed if it hasn't secured the rights to use it with non-Intel CPUs.

To briefly restate my position from another thread, I think we're going to see Apple moving toward AMD CPUs and APUs featuring an HSA memory architecture. This would allow Apple to offer greatly increased CPU, memory, and GPU compute performance at a lower TDP, with the added benefit for Apple of being able to offer a full lineup of machines with soldered RAM (ugh).
 
• How hot the CPUs run at these peak rates
The temperature ("wasted energy" from switching losses, etc.) is directly proportional to the energy being consumed (oh, physics) and to how fast the heat is dissipated (which is less about the processor itself). There is a reason TDP is called Thermal Design Power.

Fun fact: AMD solders the IHS, unlike Kaby Lake, which uses mediocre TIM paste.

;)
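That proportionality can be written down with the usual lumped thermal model, T_die = T_ambient + P × R_θ. The numbers below are purely illustrative, and a real package has several resistances in series (die, TIM or solder, IHS, cooler), which is exactly why a soldered IHS helps:

```python
def die_temperature(power_w, ambient_c, r_theta_c_per_w):
    """Steady-state die temperature under a lumped thermal model:
    T_die = T_ambient + P * R_theta.
    r_theta_c_per_w is the total die-to-ambient thermal resistance;
    solder under the IHS lowers it compared to paste."""
    return ambient_c + power_w * r_theta_c_per_w

# Illustrative: a 95 W part, 25 degC ambient, 0.3 degC/W total stack.
print(die_temperature(95, 25, 0.3))
```

The same power draw through a higher-resistance stack (worse TIM, weaker cooler) simply lands at a higher die temperature, which is the point being made about thermals tracking power.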
 
The temperature ("wasted energy" from switching losses, etc.) is directly proportional to the energy being consumed (oh, physics) and to how fast the heat is dissipated (which is less about the processor itself). There is a reason TDP is called Thermal Design Power.

Fun fact: AMD solders the IHS, unlike Kaby Lake, which uses mediocre TIM paste.

;)

Correct, thermals are directly related to power used, but the efficiency and design of the chip are what make one chip run cooler or hotter than another at the same compute rate, and depending on the workload that efficiency will show itself gruellingly in the thermals.

We really don't know yet what this chip means for performance and longevity; only the official release (tomorrow) and time will tell.

Again, all I'm saying is that the AMD chip is going to have to absolutely smoke the Intel chip across the board for Apple to switch to a first-generation, unproven product from AMD, which has been lagging in the CPU market for a decade. Competition is great, though, and AMD will surely carve out a huge market share with this chip outside of Apple, and it should get Intel pushing the envelope again.
 
Again, all I'm saying is that the AMD chip is going to have to absolutely smoke the Intel chip across the board for Apple to switch to a first-generation, unproven product from AMD, which has been lagging in the CPU market for a decade. Competition is great, though, and AMD will surely carve out a huge market share with this chip outside of Apple, and it should get Intel pushing the envelope again.

It doesn't need to prove itself by staying in the market for generations; TBH, no silicon works that way. Once a design is taped out and first silicon comes back, victory or death can be determined pretty well. "Smoking chips across the board" is not a criterion either, or you couldn't explain why the GPUs come from the red camp when they have been "smoked across the board" for years.
 
It doesn't need to prove itself by staying in the market for generations; TBH, no silicon works that way. Once a design is taped out and first silicon comes back, victory or death can be determined pretty well. "Smoking chips across the board" is not a criterion either, or you couldn't explain why the GPUs come from the red camp when they have been "smoked across the board" for years.

I'm not saying the chips have to be leaps better for Apple to use them; I'm saying they have to be leaps better to justify the risk of switching. Remember how many people did (and still do) bash Apple for using AMD GPUs rather than Nvidia? The difference is that AMD's GPUs are actually better at OpenCL, which Apple utilises a lot in FCP.

Nobody here is giving a good reason for the switch except cost. I don't think the cost difference for Apple would be as great as it is for a consumer, and if this chip is as good as quoted it would give Apple even more leverage to negotiate prices down.
Remember, too, that switching CPUs even within the same arch is not necessarily a shoo-in. Every CPU has special optimisations for its specific chipset, and we have no idea how much of the Accelerate framework would be ready for AMD chips rather than just Intel's.
 
No, but perception is reality, and the fact is that Apple positions itself as a premium brand while the perception is that AMD usually ends up in bargain-priced computers.

I could make that same argument with Nvidia and AMD for the GPU, and they still chose to use AMD's GPU. So why not the CPU? Just hold the normal press conference, say how great it is (150% better/10x better/etc.), and then we find out it's subpar to the Intel equivalent, just like with the graphics. Business as usual.
 
These screenshots miss valid data points that concern Apple a LOT more than cost:

• How hot the CPUs run at these peak rates
• Benchmarks for processing workloads rather than gaming workloads (video editing, code compiling, etc.)

Again, this AMD chip is going to floor Intel on cost, but it's barely better in performance per watt; as someone mentioned earlier, we're far more likely to see Apple use this as a bargaining chip to get even cheaper Intel chips.

I cherry-picked the following screenshots from Hardware Unboxed's YouTube video.


It should be noted that he apparently had a lot of stability problems for reasons not exactly known. So while the following benchmarks are good, I want to take them with a grain of salt until he has reviews with hardware he knows can perform at its best. And even though these results are cherry-picked, as I mentioned, they tend to give a good representation of real-world use; Adobe Premiere is obviously real-world.

[three attached benchmark screenshots]


As far as temps, his review isn't representative of what I'm seeing in other reviews, which makes me question the hardware. Obviously we're comparing to the 6900K at this point.

[attached power-usage screenshot]


Excuse the scrubber at the bottom, but that shows the power usage of his overclocked (4.1 GHz) 1800X, which came in at 277 watts. That seems high, given that his stress test with an air cooler on an open test bench maxed out at 53°C.

Gaming results should be thrown completely out the window in a Mac-centric conversation. For us the GPU will almost always be the bottleneck in games. Even at lower resolutions our displays (referring to iMacs) are 60 Hz; it isn't until high-refresh gaming that the CPU becomes a notable bottleneck.

It also seems there is a problem with high-speed RAM (2600+ MHz), but again, for us lowly Mac owners that will likely never be an issue given the limitations Apple imposes.

Needless to say, I'm pretty excited by what AMD came up with. TDP aside, I would still rather have a 6900K, but not at twice the price for virtually no extra (sometimes less) performance. Love them or hate them, we should all at least be able to acknowledge their accomplishment here.
 
After going through some benchmarks, I fully expect Apple to start using 8-core processors either from AMD or Intel in its iMacs and Mac Pros. The time is now.

If they don't start doing that, then I'll take it as a sign that Apple doesn't really care about its customers. But hey this is Apple we're talking about.
 
I could make that same argument with Nvidia and AMD for the GPU, and they still chose to use AMD's GPU. So why not the CPU? Just hold the normal press conference, say how great it is (150% better/10x better/etc.), and then we find out it's subpar to the Intel equivalent, just like with the graphics. Business as usual.

First, I agree with you. Only recently (maybe not in fast-paced tech-world terms) has AMD acquired the stigma of a budget brand, and not because they were one; they just didn't have a high end to compete with. I remember buying Intel CPUs when the AMD counterpart was better in nearly every way (I was buying Intel because of the motherboards). And even today, pre-Ryzen AMD easily beats Intel at integrated graphics. That's one reason I want to stay with dedicated graphics: I hate knowing AMD does it better while I'm essentially getting the worst of the worst.

However, the reason I don't think we'll see AMD CPUs, at least for a while, is Intel's proprietary tech, namely Thunderbolt, plus things like Quick Sync. Also, with Kaby Lake Intel has native H.265/VP9 encoding and decoding, which is important.
 
The biggest reason we might not see an AMD CPU in any Mac is... the number of bugs on a brand-new uArch platform.

It will be an extremely hard six months for AMD to iron them all out...
 
However, the reason I don't think we'll see AMD CPUs, at least for a while, is Intel's proprietary tech, namely Thunderbolt, plus things like Quick Sync. Also, with Kaby Lake Intel has native H.265/VP9 encoding and decoding, which is important.
The chip that adds Thunderbolt to an AMD platform costs less than $10.

Encoding/decoding inside the CPU doesn't matter; modern GPUs have those technologies.
 
After going through some benchmarks, I fully expect Apple to start using 8-core processors either from AMD or Intel in its iMacs and Mac Pros. The time is now.

If they don't start doing that, then I'll take it as a sign that Apple doesn't really care about its customers. But hey this is Apple we're talking about.

You've been able to get a 6-, 8-, or 12-core Xeon in the Mac Pro since 2013.

I agree, though, that an 8-core iMac option would benefit a lot of creative and productivity types. I also think Apple would expect an 8-core iMac to hurt Mac Pro sales, which it probably would. But in my opinion, the people not buying Mac Pros are probably building PCs instead.
 
You've been able to get a 6-, 8-, or 12-core Xeon in the Mac Pro since 2013.

I agree, though, that an 8-core iMac option would benefit a lot of creative and productivity types. I also think Apple would expect an 8-core iMac to hurt Mac Pro sales, which it probably would. But in my opinion, the people not buying Mac Pros are probably building PCs instead.
Not if the Mac Pro CPUs started higher than 8 cores.
 
The chip that adds Thunderbolt to an AMD platform costs less than $10.

Encoding/decoding inside the CPU doesn't matter; modern GPUs have those technologies.

I was unaware there were ways to add Thunderbolt support (I've been using Macs too long, I suppose). I'll need to research that a bit more, because it would be great!

As for encoding, I apologize; I was speaking specifically about video work that requires the highest possible quality (CPU encoding).
Not if the Mac Pro CPUs started higher than 8 cores.

Assuming Apple sticks with Xeon, I think that would price them out of being even remotely competitive; we're talking $2500+ for a 12-core Xeon. Apple doesn't need Intel's help pricing themselves out of the market, they're doing a damn fine job on their own, thank you! lol
 