
dpny

macrumors 6502
Jan 5, 2013
272
109
Added emphasis. I think that you have to go back in time about a decade to see an AMD CPU that wasn't disappointing.

And don't bring up the fake FirePro cards that Apple puts in the MP6,1 - they aspire to be disappointing.

I really do hope Zen ends up being a good CPU family. Intel needs some competition. But I will wait and see.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Matt, TB3 implementations are still not abundant and no manufacturer will spend any resources designing new systems with TB3 based on old platforms. New designs (containing TB3) will always come as Skylake machines. I would agree though that X99 or C610 based mobos should surface while there are no Skylake Xeons or HEDTs. Maybe from the likes of GB/MSI/Asus... and Apple!!
 

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
Added emphasis. I think that you have to go back in time about a decade to see an AMD CPU that wasn't disappointing.

And don't bring up the fake FirePro cards that Apple puts in the MP6,1 - they aspire to be disappointing.
Agree.

Zen looks promising on paper, and in the short term (at least in its first year) it should lead the CPU performance race, since Intel's short-term plans are focused on power efficiency rather than increasing IPC. Assuming AMD hasn't overrated its Zen estimates (ahem, ahem), it shouldn't have a problem leading at least until Intel reacts and builds a new IPC-focused core.
 

jonisign

macrumors regular
Jul 7, 2007
153
16
Aren't these clock speeds very low? Will there not be, let's say, a 6-core with > 3.0 GHz? Because then I should really go for the 5,1 or hack route...

Many of those chips will have high all-core turbo clocks.

For instance, the 14-core E5-2680 v4 is 2.4 GHz and the 12-core E5-2680 v3 is 2.5 GHz. The turbo multipliers for the v3 version are +400 MHz all-core and +800 MHz single/dual-core. I imagine we'll see something similar for these, so realistically a 2680 v4 would operate at 2.8 GHz with all 14 cores running, and 3.2 GHz in work that runs in 2-4 threads.
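
A rough sketch of that estimate in Python, for anyone who wants to play with the numbers. The +0.4/+0.8 GHz offsets are the v3 bins quoted above carried over to v4 as an assumption, not published figures, and the chip used is just an example:

```python
# Estimate turbo clocks from base clock plus assumed turbo bins.
# The +0.4 GHz all-core and +0.8 GHz lightly-threaded offsets are the
# v3 figures quoted in the post, carried over to v4 as an assumption.
ASSUMED_ALL_CORE_BOOST_GHZ = 0.4   # assumed all-core turbo bin
ASSUMED_FEW_CORE_BOOST_GHZ = 0.8   # assumed 1-2 core turbo bin

def estimated_clock(base_ghz, active_cores):
    """Estimate the clock (GHz) for a given number of active cores."""
    boost = ASSUMED_FEW_CORE_BOOST_GHZ if active_cores <= 2 else ASSUMED_ALL_CORE_BOOST_GHZ
    return base_ghz + boost

# Hypothetical E5-2680 v4: 14 cores, 2.4 GHz base
print(f"{estimated_clock(2.4, 14):.1f} GHz")  # ~2.8 GHz with all 14 cores loaded
print(f"{estimated_clock(2.4, 2):.1f} GHz")   # ~3.2 GHz in lightly threaded work
```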

Also, it looks like there are several options with base clocks of 3 GHz+ that fit within the ~130W TDP envelope of the Mac Pro:

According to Wikipedia (135W chips):
Xeon E5-2637 v4: 4 core, 3.4GHz
Xeon E5-2643 v4: 6 core, 3.5GHz
Xeon E5-2667 v4: 8 core, 3.4GHz
Xeon E5-2689 v4: 10 core, 3.1GHz

While they have lower base clocks, the following chips will likely have all-core turbos up to ~3.1 GHz and dual-core turbos up to ~3.6 GHz:
Xeon E5-2690 v4: 14 core, 2.4GHz
Xeon E5-2697 v4: 18 core, 2.3GHz (145W)
Xeon E5-2698 v4: 20 core, 2.1GHz (confirmed turbo to 3.5GHz)
Xeon E5-2699 v4: 22 core, 2.2GHz (confirmed turbo to 3.6GHz) (145W)
 

dpny

macrumors 6502
Jan 5, 2013
272
109
Aren't these clock speeds very low? Will there not be, let's say, a 6-core with > 3.0 GHz? Because then I should really go for the 5,1 or hack route...

Newer CPUs have much better IPC than the six-year-old chips in the cMP. Any decent current CPU will run rings around the Xeons in the cMPs.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
We should look at Xeon E5 v4 16XX CPUs for the base models rather than at Xeon E5 v4 2XXX CPUs.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
We should look at Xeon E5 v4 16XX CPUs for the base models rather than at Xeon E5 v4 2XXX CPUs.
Yes, in Haswell the EP 2600 series runs at 10-20% lower clocks than the EP 1600 series.

It seems that at higher clocks, Haswell and Skylake are very close in test results. Some multi-core tests benefit from Skylake, as does Photoshop. So I wouldn't expect Broadwell-EP to be much different from Haswell performance-wise. And Haswell brought a ~7% speed gain with a 7% higher TDP, so in perf/watt Haswell offered nothing. Apple won't get much benefit from changing from E5-EP v2 to v4; the big difference has to come from elsewhere. Maybe Apple is interested in the new 105W category of E5-EP v4?
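
Just to spell out that perf/watt point with the rough figures above (these are the post's ~7% numbers, not measured data):

```python
# Back-of-the-envelope perf/watt check: ~7% faster at ~7% higher TDP
# leaves performance per watt essentially unchanged.
haswell_speedup = 1.07    # relative performance vs. the previous generation
haswell_tdp_ratio = 1.07  # relative TDP vs. the previous generation

perf_per_watt_change = haswell_speedup / haswell_tdp_ratio
print(f"perf/watt change: {perf_per_watt_change:.2f}x")  # 1.00x, i.e. no gain
```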

This test was done with 4 GHz processors:
i7 6700K vs i7 4790K Performance Summary
Unigine Heaven Pro 4.0: no change with discrete GPU, >25% faster with onboard graphics.
PCMark 8 Pro: 0.5-1% increase in performance.
Geekbench 3: 5% increase in multi-core performance.
Cinebench R15: 1% increase in CPU performance, 1.5% decrease in GPU performance.
POV-Ray: 7.7% increase in performance.
Linpack: 12.5% increase in performance. *Updated using Linpack 11.3
Lightroom CC 2015: 1.5-2% increase in most image handling tasks, but a huge 15% increase in image export performance. ~2-3% increase in HDR/panorama image creation performance.
Photoshop CC 2015: 8.5% increase in overall performance. Varies anywhere from 0.8% to 16.7% depending on the effect.
Premiere Pro CC 2015: 6% increase in H.264 encoding performance, marginal increase in MPEG2 encoding performance.

Source: https://www.pugetsystems.com/labs/articles/Haswell-vs-Skylake-S-i7-4790K-vs-i7-6700K-641/
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
If that is true, my prediction as to why Apple is using AMD products starts to become more believable. Apple wants to be independent... and this is the way. By using AMD's GPUs, Apple has a) funded its secret processor projects to become independent of Intel, b) gotten rid of the CUDA dependence, and c) adopted Mantle principles for Metal.

But instead of going 100% AMD, Apple will most likely use the best of both worlds, Intel and AMD... at least for now.

Apple wants to create unique products, but it is very difficult when all competitors are using the same components.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
If that is true, my prediction as to why Apple is using AMD products starts to become more believable. Apple wants to be independent... and this is the way. By using AMD's GPUs, Apple has a) funded its secret processor projects to become independent of Intel, b) gotten rid of the CUDA dependence, and c) adopted Mantle principles for Metal.

But instead of going 100% AMD, Apple will most likely use the best of both worlds, Intel and AMD... at least for now.

Apple wants to create unique products, but it is very difficult when all competitors are using the same components.
I really can't picture Apple "betting the desktop business" on AMD. The split of the RTG out of AMD proper is a classic "get ready to get the best prices after a bankruptcy sale" gambit.

Apple will be royally "in trouble" if AMD folds and the RTG doesn't get a buyer. No more ATI graphics, and years of burning the bridges with Nvidia.

The AMD cheerleaders aren't talking about how AMD has been running negative cash flow for an unsustainably long time.
 

dpny

macrumors 6502
Jan 5, 2013
272
109
Apple will be royally "in trouble" if AMD folds and the RTG doesn't get a buyer. No more ATI graphics, and years of burning the bridges with Nvidia.

I don't think there's been any burning of bridges. I've watched Apple switch back and forth between ATi/AMD and nVidia for fifteen years. If Apple decides to go all nVidia next year, nVidia won't say no.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
If that is the case and AMD goes down, the first offer for an RTG buyout will be made by Apple.

That information, I think, is based on industry rumors about the APU boards that AMD provided to Apple a few months ago (early December, if I remember correctly).
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
I don't think there's been any burning of bridges. I've watched Apple switch back and forth between ATi/AMD and nVidia for fifteen years. If Apple decides to go all nVidia next year, nVidia won't say no.

There are rumors that Nvidia threatened Apple with a patent lawsuit sometime between 2011 and 2013. Both Samsung and Qualcomm were sued by Nvidia in the beginning of 2014. Apple doesn't take kindly to this sort of thing, so it wouldn't surprise me if they stopped using Nvidia chips for this reason alone.
 

zephonic

macrumors 65816
Feb 7, 2011
1,314
709
greater L.A. area
There are rumors that Nvidia threatened Apple with a patent lawsuit sometime between 2011 and 2013. Both Samsung and Qualcomm were sued by Nvidia in the beginning of 2014. Apple doesn't take kindly to this sort of thing, so it wouldn't surprise me if they stopped using Nvidia chips for this reason alone.

That seems unlikely to me.

Apple and Samsung had the mother of all lawsuits, and it never stopped them from doing business together.

I think they approach these things pragmatically.
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
That seems unlikely to me.

Apple and Samsung had the mother of all lawsuits, and it never stopped them from doing business together.

I think they approach these things pragmatically.

Apple doesn't have much choice when it comes to iPhone chip manufacturers. They need so much volume that they have to take all the production capacity they can get.

When it comes to GPUs, they have an alternative that is more than willing to offer custom form factors, first access to chips, and great prices.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
I can see Apple going AMD, but only if Zen is indeed good, and we'll only know that in a year. 2017/18 sounds reasonable then.
Still, Apple would become too dependent on AMD, and as some like to point out whenever they can, AMD is in deep $h|t right now, so that's a bit risky. If they do have good products this year, they might make it through.
Or Apple will eventually buy them out, as has also been rumored.

It's hard at this point to come out with something really innovative and extraordinary in terms of performance; everything has been done and all the tricks are known to everyone. Whoever implements them best gets the crown. Upgrades are becoming less interesting as time goes by.
But I guess we all always want the latest, even if it brings nothing really new.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
So an AMD representative said in a Reddit Q&A that Polaris will come in mid-2016. There are rumors about Nvidia bringing new mobile chips around the same time. Broadwell EP-1600 is not something Apple will celebrate in their next nMP; the speed advantage is too small. So even if Intel releases it within a month, Apple won't update the nMP because of that.

To the ultimate question about life and everything: when will we see nMP v2? My educated guess is: at WWDC 2016. There's going to be a whole new Pro line including new displays and Pro software, new laptops and an updated Mac Pro. And because Apple wants to win developers and Pro users back, it's going to be a big event. Maybe about VR content creation. And a showcase of how to use the iPad Pro with Macs in new productive ways.

This March we could see updates for the MacBooks and the Mac mini. And perhaps the funeral of the MacBook Air and its rebirth as something similar, like a 14" MacBook S. These along with the new 10" iPad Pro and iPhone SE.

A Zen-based Apple VR-TV might come in late 2016 for the holiday season, after developers have had 6 months to create some content for it.

And just for fun, one detail from the Reddit Q&A:
- How did Mantle affect the development of DX12 besides catalyzing progress? Was there a significant amount of Mantle going into the development of DX12, or was it just the philosophy of low CPU/driver overheads?
"I like to say that Mantle was influential. Microsoft was one of several parties that had access to the API specification, documentation and tools throughout Mantle's 3-year development cycle. We're glad Microsoft saw we were doing the right thing for PC graphics and decided to spin up the DX12 project; DX12 has been pretty great for performance and features."

And why did I pick that question? Because (most likely) Apple "was one of several parties that had access to the API specification, documentation and tools throughout Mantle's 3-year development cycle." That's why Metal came around at the same time as Mantle and the DirectX 12 beta: because they're all AMD-origin technology.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Seen the latest on GoW? Does that surprise anyone still?

Still very little info on OS X 10.12 (few machines are running it yet); I wish someone would get their hands on it and see what news it might provide. On the nMP, that is.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Even if AMD can deliver only Polaris 10 in mid-2016 and the rest of the chips come later, Apple could use four of those babies in the best model of nMP v2.

So the low-end machine could be just a dual Tonga with 2GB, the better model a dual Polaris 10 with 4GB (TDP ~75W each), and the best model a quad Polaris 10 with 4GB (TDP ~60W each).
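
A quick tally of what those hypothetical configs would mean in aggregate GPU power, using the guessed per-card TDPs above (not announced Polaris specs):

```python
# Toy tally of aggregate GPU TDP for the hypothetical configs above.
# Per-card wattages are the post's guesses, not announced specs.
configs = {
    "better model (dual Polaris 10, 4GB)": (2, 75),  # (cards, assumed W per card)
    "best model (quad Polaris 10, 4GB)": (4, 60),
}

for name, (cards, watts_each) in configs.items():
    print(f"{name}: ~{cards * watts_each} W total GPU TDP")
# ~150 W for the dual config, ~240 W for the quad config
```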

WWDC 2016.
 

dpny

macrumors 6502
Jan 5, 2013
272
109
There is one good source on the internet that says that Apple got rid of Nvidia from their computers, and I'm quoting here, "for foreseeable future". ;)

http://semiaccurate.com/forums/showthread.php?p=252498#post252498

The last 3 posts on the page.

Without any provenance it's not helpful. Could be true, could be complete crap.

As someone else pointed out above, given the dire state of IP law, many of the lawsuits are defensive and pro forma. nVidia made less than US$500 million last year. If Apple comes calling, they will say yes.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
If that is true, my prediction as to why Apple is using AMD products starts to become more believable. Apple wants to be independent... and this is the way. By using AMD's GPUs, Apple has a) funded its secret processor projects to become independent of Intel, b) gotten rid of the CUDA dependence, and c) adopted Mantle principles for Metal.

But instead of going 100% AMD, Apple will most likely use the best of both worlds, Intel and AMD... at least for now.

Apple wants to create unique products, but it is very difficult when all competitors are using the same components.

Apple is not switching to AMD for CPUs. Metal and Mantle have nothing to do with each other besides similar-sounding names (and Metal basically has nothing to do with any CPU vendor, only GPUs). AMD doesn't make Apple any more independent.

It's not impossible that Apple would use custom silicon for its laptop lines. The iMacs are a bit of a stretch; it's harder for AMD to offer a competitive advantage there, and Intel is doing well there... The Mac Pro just makes no sense at all: the sales volumes aren't high enough to justify a custom chip, and Intel is doing very well competitively with the Xeon.

I think if Apple did AMD, you'd be likely to see it somewhere like the 12" MacBook. The Mac Pro makes no sense at all for AMD right now.
So the low-end machine could be just a dual Tonga with 2GB, the better model a dual Polaris 10 with 4GB (TDP ~75W each), and the best model a quad Polaris 10 with 4GB (TDP ~60W each).

As nice as quad Polaris could be, remember OS X has no Crossfire. Apple could make the case that they could be used as compute devices, but I think they'd probably want to stick with dual GPUs.

I wouldn't put it past AMD to deliver a higher-end GPU part early just for Apple, especially if Apple announces the new Mac Pros at WWDC but queues them for release in the fall (which would be sad for people waiting for new Mac Pros, but maybe better if they can get better Polaris parts).
And why did I pick that question? Because (most likely) Apple "was one of several parties that had access to the API specification, documentation and tools throughout Mantle's 3-year development cycle." That's why Metal came around at the same time as Mantle and the DirectX 12 beta: because they're all AMD-origin technology.

Metal predates Mantle by a while. Remember, Metal originated on iOS and was there for quite a while.

They're not related. Just two groups dealing with the same problem around the same time.

Metal, when it was introduced, also had some concepts that were totally incompatible with AMD GPUs. Metal is really an API intended for getting great performance out of integrated GPUs (which is a continued problem for it on the Mac). Mantle is an API for getting great performance out of high-end discrete GPUs. The target GPUs aren't even the same.
 