
Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
But FCPX doesn't need accuracy, so Apple's decision makes sense.
Most amateur work doesn't need ECC; if it were the case that somebody is using the Mac to build original renders, that's another story.

ECC has its value, but mostly for system stability rather than data protection under typical Mac Pro duty.

I still give it a 50/50 chance of another year of AMD GPUs in the Mac Pro; if Apple goes back to Nvidia, the best we will see is a rebranded Pascal GP104 GeForce 1080 Ti.
 

linuxcooldude

macrumors 68020
Mar 1, 2010
2,480
7,232
Competition is good for everybody. And frankly it isn't really different from what we had a few years back with Nvidia-flavoured mobos and AMD-flavoured mobos. It forced Intel to invest more in chipset design and research, which gave us the great chipsets we have today.

Mono-culture is a BAD thing. Diversity brings strength.

I think the best way nowadays is software development that takes advantage of graphics cards no matter the brand. Then we would have less need to favor one brand over another because the software doesn't support it.
 
  • Like
Reactions: koyoot

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
Most amateur work doesn't need ECC; if it were the case that somebody is using the Mac to build original renders, that's another story.

ECC has its value, but mostly for system stability rather than data protection under typical Mac Pro duty.

I still give it a 50/50 chance of another year of AMD GPUs in the Mac Pro; if Apple goes back to Nvidia, the best we will see is a rebranded Pascal GP104 GeForce 1080 Ti.
If you investigate, you'll find a lot of rendering being done by the pros without ECC on the GPUs.

https://us.rebusfarm.net/en/news/2219-gpu-rendering-support-at-rebusfarm

Finally, GPU rendering is supported at RebusFarm!

We received a lot of requests about GPU rendering and why we don't provide this service.

Following an intensive period of development we are happy to announce that we can now provide a GPU solution consisting of 7 GeForce GTX TITAN Black GPUs.

A single-bit error in one subpixel of one frame will not be noticeable.
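For a sense of scale (a rough sketch of my own, not anything from the RebusFarm post), this is all a single flipped bit in one 8-bit subpixel amounts to:

[CODE]
# Hypothetical illustration: flip one bit of an 8-bit subpixel value and see
# how far the colour moves. Only a flip in the top couple of bits is even a
# visible shift, and then only in one subpixel of one frame.
def flip_bit(value: int, bit: int) -> int:
    """Return the 8-bit subpixel value with one bit inverted."""
    return (value ^ (1 << bit)) & 0xFF

original = 183  # an arbitrary mid-range green channel value
for bit in range(8):
    corrupted = flip_bit(original, bit)
    delta = abs(corrupted - original)
    print(f"bit {bit}: {original} -> {corrupted}  (delta {delta}, {delta / 255:.1%} of full scale)")
[/CODE]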
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
Most amateur work doesn't need ECC; if it were the case that somebody is using the Mac to build original renders, that's another story.

ECC has its value, but mostly for system stability rather than data protection under typical Mac Pro duty.

Here is an interesting link on using VRAM ECC in a server environment. Researchers ran a simulation with ECC on and off and determined that the hit to performance and VRAM capacity wasn't worth it. If ECC VRAM isn't necessary in an academic HPC environment, then I doubt it's very important when rendering video or 3D models.
Before Apple jumps to ARM on Macs we will see AMD CPUs on Macs; that's much more likely. Servers are another story: server software tends to be multithread-optimized, so an OS X Server on ARM could arrive sooner, maybe this year, since Apple acknowledged restarting its server hardware development at least for internal services (iCloud, Siri, etc.). Maybe Apple ends up selling some rack-mounted blade based on ARM running OS X Server, or maybe they just build it for internal consumption.

Right now the Mac mini as a NAS is fairly good as long as you attach external Thunderbolt drives; a Mac mini-based server appliance would need HDD trays, at least two and up to five, to be competitive against popular NASes.

I am still doubtful an ARM-based Mac could happen within the next 5 years or so. At best the chip in the iPad Pro is equivalent to the low-power Broadwell chip in the MacBook. Intel has a product lineup that spans from 3 W to 160 W, and that shouldn't be taken lightly. That's not easy to replicate using ARM processors.

Remember that the transition from PowerPC to Intel was tolerable because it was easy to emulate PowerPC due to how much faster Intel chips were. If we do a side-grade the transition won't be nearly as nice.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
I am still doubtful an ARM-based Mac could happen within the next 5 years or so. At best the chip in the iPad Pro is equivalent to the low-power Broadwell chip in the MacBook. Intel has a product lineup that spans from 3 W to 160 W, and that shouldn't be taken lightly. That's not easy to replicate using ARM processors.

No, in Geekbench 3 the Apple A9X is as fast in the single-core test as the 13" rMBP with a Core i5, and in the multi-core test it matches the MacBook Air's Core i5. In the GL test the A9X is faster than the 15" rMBP with Iris Pro, by 50% in the Manhattan HD offscreen test.

But here's the situation: which one would you prefer?
1) Apple A9 + Polaris 11 4GB
2) Core i5 dual-core with Iris graphics

Thermals are the same, price is the same (for Apple), the CPU is about as fast (faster single-core, slower multi-core), and there's a huge difference in GPU power!

Imagine then the following:
A dual A9X in the MacBook Air. Twice as fast as the 13" rMBP in CPU. Four times the GL power.
Better thermals. Better battery life. Or as Apple will put it, thinner and lighter.

And what can we expect from the A10: will it be more powerful than the A9? I believe so.

The Apple A9 is ready for prime time! It just needs software to run...
 
Last edited:

tuxon86

macrumors 65816
May 22, 2012
1,321
477
I think the best way nowadays is software development that takes advantage of graphics cards no matter the brand. Then we would have less need to favor one brand over another because the software doesn't support it.

Actually no, that would just be another form of mono-culture. It would force GPU makers to restrict themselves to what the software makers use instead of pushing new ideas forward. Technology needs competition and diverging platforms to progress. Technological mono-culture has been tried before: the MSX series of computers and the 3DO video game console are prime examples, and both failed. In both cases a manufacturer could go beyond what the base spec required, but in the end software makers only developed for the common platform, meaning you wasted money on resources that were never used.
Here is an interesting link on using VRAM ECC in a server environment. Researchers ran a simulation with ECC on and off and determined that the hit to performance and VRAM capacity wasn't worth it. If ECC VRAM isn't necessary in an academic HPC environment, then I doubt it's very important when rendering video or 3D models.

I am still doubtful an ARM-based Mac could happen within the next 5 years or so. At best the chip in the iPad Pro is equivalent to the low-power Broadwell chip in the MacBook. Intel has a product lineup that spans from 3 W to 160 W, and that shouldn't be taken lightly. That's not easy to replicate using ARM processors.

Remember that the transition from PowerPC to Intel was tolerable because it was easy to emulate PowerPC due to how much faster Intel chips were. If we do a side-grade the transition won't be nearly as nice.

We run an application here that calculates every single water molecule that travels through a turbine. Since every molecule can impact the ones beside it, and that effect multiplies across its neighbors, a couple of wrong bits will impact the validity of the results.
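A toy sketch of why that matters (my own illustration, not the actual turbine code): start from identical data, flip one low-order bit in one cell, run a simple neighbour-coupling loop, and the error contaminates a whole region of the result rather than staying in the value that was hit.

[CODE]
import struct

def flip_bit(x: float, bit: int = 30) -> float:
    """Flip one mantissa bit in the IEEE-754 representation of a double."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return y

def relax(state, steps=50, coupling=0.1):
    """Toy update where each cell repeatedly mixes with its immediate neighbours."""
    s = list(state)
    for _ in range(steps):
        s = [s[i] + coupling * ((s[i - 1] if i > 0 else s[i]) +
                                (s[i + 1] if i < len(s) - 1 else s[i]) - 2 * s[i])
             for i in range(len(s))]
    return s

clean = [1.0] * 64
dirty = list(clean)
dirty[32] = flip_bit(dirty[32])   # one soft error in one cell, once

errors = [abs(a - b) for a, b in zip(relax(clean), relax(dirty))]
print("cells disturbed:", sum(e > 0 for e in errors), "of", len(errors))
print("largest deviation:", max(errors))
[/CODE]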
 
  • Like
Reactions: Flint Ironstag

Stacc

macrumors 6502a
Jun 22, 2005
888
353
No, in Geekbench 3 the Apple A9X is as fast in the single-core test as the 13" rMBP with a Core i5, and in the multi-core test it matches the MacBook Air's Core i5. In the GL test the A9X is faster than the 15" rMBP with Iris Pro, by 50% in the Manhattan HD offscreen test.

But here's the situation: which one would you prefer?
1) Apple A9 + Polaris 11 4GB
2) Core i5 dual-core with Iris graphics

Thermals are the same, price is the same (for Apple), the CPU is about as fast (faster single-core, slower multi-core), and there's a huge difference in GPU power!

Imagine then the following:
A dual A9X in the MacBook Air. Twice as fast as the 13" rMBP in CPU. Four times the GL power.
Better thermals. Better battery life. Or as Apple will put it, thinner and lighter.

And what can we expect from the A10: will it be more powerful than the A9? I believe so.

The Apple A9 is ready for prime time! It just needs software to run...

It's not that simple. The A9X is a relatively big chip with lots of space dedicated to the GPU: it's 150 mm² compared to something like the smallest Intel Skylake chip, which is 100 mm². This is to say that the A9X is likely more expensive to manufacture. The graphics score of the A9X is certainly impressive, but we don't have a wide range of benchmarks comparing the performance of the two. We also don't know what the performance per watt of the A9X is. In things like browser benchmarks, a Core i5 is still much faster than the A9X.

Your configurations don't make a lot of sense. Apple A9 + Polaris 11 would be pairing a ~5 W CPU with a ~45 W GPU. That's probably much too GPU-heavy compared to a 25 W Intel Iris chip.

If Apple did leverage their own processors in the Mac, they would likely try to decrease the size and increase the battery life. You can already see them doing that with the A9: the A9X goes in the big iPads with extra graphics performance, and the smaller, slightly more efficient chip with less graphics power goes in the iPhone. It would be fun to see this in the Mac, but it's still a side-grade, and it's still unlikely Apple can scale this up to desktop-class speeds. Could Apple make a chip that supports PCIe graphics? What about 64+ GB of memory? Thunderbolt? It's not that easy to create a platform like Intel has.

AMD is going for a strategy like this, where they can put a Zen CPU and a Polaris (or other future) GPU on the same package. If their CPUs become competitive on the efficiency front, then it should be a great way to tailor the CPU + GPU thermals and performance for exactly the product you want. This way you don't have to spin a different chip for each CPU+GPU combination. Imagine a Mac Pro the size of the Mac mini (although cooling that would be a nightmare with all that thermal density concentrated in the tiny CPU/GPU package).

Apple is limited to the same manufacturing processes as everyone else (except Intel). If Apple is stuck on 16 nm again this year, the A10 will probably only be slightly faster.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
It's not that simple. The A9X is a relatively big chip with lots of space dedicated to the GPU: it's 150 mm² compared to something like the smallest Intel Skylake chip, which is 100 mm². This is to say that the A9X is likely more expensive to manufacture. The graphics score of the A9X is certainly impressive, but we don't have a wide range of benchmarks comparing the performance of the two. We also don't know what the performance per watt of the A9X is. In things like browser benchmarks, a Core i5 is still much faster than the A9X.

Your configurations don't make a lot of sense. Apple A9 + Polaris 11 would be pairing a ~5 W CPU with a ~45 W GPU. That's probably much too GPU-heavy compared to a 25 W Intel Iris chip.

If Apple did leverage their own processors in the Mac, they would likely try to decrease the size and increase the battery life. You can already see them doing that with the A9: the A9X goes in the big iPads with extra graphics performance, and the smaller, slightly more efficient chip with less graphics power goes in the iPhone. It would be fun to see this in the Mac, but it's still a side-grade, and it's still unlikely Apple can scale this up to desktop-class speeds. Could Apple make a chip that supports PCIe graphics? What about 64+ GB of memory? Thunderbolt? It's not that easy to create a platform like Intel has.

AMD is going for a strategy like this, where they can put a Zen CPU and a Polaris (or other future) GPU on the same package. If their CPUs become competitive on the efficiency front, then it should be a great way to tailor the CPU + GPU thermals and performance for exactly the product you want. This way you don't have to spin a different chip for each CPU+GPU combination. Imagine a Mac Pro the size of the Mac mini (although cooling that would be a nightmare with all that thermal density concentrated in the tiny CPU/GPU package).

Apple is limited to the same manufacturing processes as everyone else (except Intel). If Apple is stuck on 16 nm again this year, the A10 will probably only be slightly faster.
Just saying that mobile Polaris 11 will be a ~25 W part. The Core i5 is 28 W, so the difference between A9 + P11 and the i5 is small.

Also remember that Intel charges a huge premium for its chips. A recent teardown report from IHS iSuppli estimated the cost to manufacture Apple's A9 at $22. That's one tenth of an Intel i5... CORRECTION: the i5-5257U costs $315 directly from Intel.
 
Last edited:

wallysb01

macrumors 68000
Jun 30, 2011
1,589
809
Just saying that mobile Polaris 11 will be a ~25 W part. The Core i5 is 28 W, so the difference between A9 + P11 and the i5 is small.

Also remember that Intel charges a huge premium for its chips. A recent teardown report from IHS iSuppli estimated the cost to manufacture Apple's A9 at $22. That's one tenth of an Intel i5.

You're comparing the cost to manufacture with the retail price from Intel. The question should be how much Intel is charging Apple. I'd guess it's more than $22, but does the increased profit margin Apple could get out of this switch justify the potentially lost sales of a machine gimped by ARM?
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
You're comparing the cost to manufacture with the retail price from Intel. The question should be how much Intel is charging Apple. I'd guess it's more than $22, but does the increased profit margin Apple could get out of this switch justify the potentially lost sales of a machine gimped by ARM?
I hear the CEO of Apple is quite a bean counter... whenever the software side is OK, he'll give the green light.
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
Just saying that mobile Polaris 11 will be a ~25 W part. The Core i5 is 28 W, so the difference between A9 + P11 and the i5 is small.

We don't actually know the power consumption of Polaris, and it will likely scale depending on the chosen clocks and the chassis it's in. The 15" MacBook Pro has used a ~45 W GPU since more or less the PowerBook days. If you go much below that, it probably makes more sense just to use integrated graphics, since the performance differential starts to shrink.
 

fuchsdh

macrumors 68020
Jun 19, 2014
2,028
1,831
ECC on GPUs isn't a big deal, unless it is. It's basically a more fringe issue than the lack of a second processor socket. It's something where I'd certainly rather have it, since if you're paying upfront for GPUs you're likely not going to upgrade. The lack of ECC puts the FirePro Dx00 cards into a weird twilight realm: they're not gaming cards and they're not workstation cards, and they're not priced according to either spec (you'd get 7970s for much less than $900, and you'd pay much more for the W-series equivalent).

Keeping around the D500 or a similar non-ECC entry-level card and then offering ECC-flavored Polaris options at a higher upgrade cost seems like a reasonable enough tradeoff they could pursue. If Apple is going to focus heavily on a small, powerful workstation, then the GPUs without ECC are the weakest link.
 
  • Like
Reactions: Flint Ironstag

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
We don't actually know the power consumption of Polaris, and it will likely scale depending on the chosen clocks and the chassis it's in. The 15" MacBook Pro has used a ~45 W GPU since more or less the PowerBook days. If you go much below that, it probably makes more sense just to use integrated graphics, since the performance differential starts to shrink.
There has been talk about Polaris being 2.5 times more power efficient. Sure, it means that for the 15" rMBP we should see double the performance at the same TDP. But it goes the other way too: Polaris will open a new segment for dGPUs, and in the 25 W class we'll see the performance of the former >45 W parts.
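To make the arithmetic explicit (a naive sketch that takes the rumored 2.5x perf-per-watt figure at face value; real GPUs don't scale this linearly):

[CODE]
# Toy perf-per-watt arithmetic, assuming the rumored 2.5x efficiency gain.
# Performance is modelled naively as efficiency * power, so treat the
# numbers as an upper bound rather than a prediction.
OLD_EFF = 1.0               # arbitrary baseline performance per watt
NEW_EFF = 2.5 * OLD_EFF     # rumored Polaris efficiency

old_45w = OLD_EFF * 45      # today's ~45 W mobile dGPU
new_45w = NEW_EFF * 45      # same TDP, new architecture
new_25w = NEW_EFF * 25      # a new 25 W class part

print(f"45 W new vs 45 W old: {new_45w / old_45w:.1f}x")   # ~2.5x at the same TDP
print(f"25 W new vs 45 W old: {new_25w / old_45w:.2f}x")   # ~1.4x: beats the old >45 W part
[/CODE]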
Anyway, because this is the Mac Pro thread, whatever I've written about a possible future ARM Mac, I think the nMP we (should) see at WWDC is an old-fashioned update with Intel EP CPUs, Polaris 11 and 10, DDR4, a faster SSD, and new I/O connectors à la USB Type-C.

If macOS comes with ARM support, we'll see new machines in this class next autumn at the earliest.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
That's from the 1997 updated OS logo, which was actually Mac OS with the double face. A bad time for Apple Computer.

Or, you know...
[image: the Mac OS logo with the double face]


OS X up until 10.7 was known as "Mac OS" as well; the X was just the version number. So 10.6, the good release, was "Mac OS". 10.7 was the first one that dropped "Mac OS" and just became "OS X."

The double face has also been with us the whole time. It's the Finder icon.

It's probably a good thing. The OS X rename came about because they wanted to pretend they had one OS everywhere: OS X on iPhone! OS X on Macs! OS X on iPads! macOS means they've at least realized the Mac is not an iPad.
If macOS comes with ARM support, we'll see new machines in this class next autumn at the earliest.

I really doubt it. There have been signs ARM is somewhat competitive in single-threaded performance only, but ARM isn't any faster than Intel chips. Why send everyone through a transition for something that's just as slow or slower?

Apple has pretty much flat out said they're not transitioning away from Intel for Macs because there just isn't any good reason.

They're already having trouble giving the Mac the attention it needs. I'm not sure how that leads anyone to believe that Apple is going to care enough and spend the considerable time and money to start custom designing and fabricating chips for the Mac.

If you start seeing the big game console makers switching to ARM, you'll know x86 is in trouble. But as it stands, both AMD and Intel can deliver very good performance per watt, and overall performance that is more than competitive with ARM.
 
Last edited:

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
POWER8 with NVLink systems coming up.
Another Polaris product; is there no imagination these days?

http://anandtech.com/show/10249/ope...inspur-supermicro-develop-power8based-servers

There's also mention of upcoming POWER9 systems with PCIe 4.0.
It would be cool if IBM brought POWER to workstations.

Seen the LG ads with their new MB/MBP lookalikes running FCP and Logic? What?! Windows 10 machines? They'll be correcting it. Man, how bad does this get, and will it ever stop?
 

linuxcooldude

macrumors 68020
Mar 1, 2010
2,480
7,232
Actually no, that would just be another form of mono-culture. It would force GPU makers to restrict themselves to what the software makers use instead of pushing new ideas forward. Technology needs competition and diverging platforms to progress. Technological mono-culture has been tried before: the MSX series of computers and the 3DO video game console are prime examples, and both failed. In both cases a manufacturer could go beyond what the base spec required, but in the end software makers only developed for the common platform, meaning you wasted money on resources that were never used.

I wouldn't think so. Hardware almost always comes before the third-party developers, who have to develop applications around that hardware, taking into account the operating system and drivers for it. Hardly the other way round. Sure, there are exceptions in some cases, such as hardware/software-integrated devices like Apple computers or gaming consoles. It took developers a very long time to adopt 64-bit CPUs, multicore, GPU and multi-GPU parallel processing, if at all. Obviously it does not happen overnight, though.

Then we have things like GameWorks/VRWorks, CUDA, and PhysX baked into third-party software applications, making it more of the mono-culture you animatedly talk about.
 
Last edited:

Thunderbird

macrumors 6502a
Dec 25, 2005
957
790
I too have held out hope that Apple would throw our types a single bone to alleviate our anguish, but I have just managed to extinguish that hope. It has been a long time coming, but I have told myself, "It's dead Jim!" in McCoy's most emotive voice.



"The Mac Pro is...



...it's dead, Jim."
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
You know what's scary at this point? I was looking at computers in my country, looking for workstations. Normal, Xeon-based, Nvidia-equipped workstations: a Xeon E5-1680 v3 (8 cores), an 800 GB Intel SSD + 8 TB of HDD, 64 GB of RAM, and a GTX 980 Ti. It is 4,000 PLN (around $1,000) less expensive than a Mac Pro with... a 6-core Xeon, 16 GB of RAM, a 512 GB SSD, and dual D500s.

Apple, shame on you.
 
  • Like
Reactions: rdav and pat500000

cube

Suspended
May 10, 2004
17,011
4,973
You know what's scary at this point? I was looking at computers in my country, looking for workstations. Normal, Xeon-based, Nvidia-equipped workstations: a Xeon E5-1680 v3 (8 cores), an 800 GB Intel SSD + 8 TB of HDD, 64 GB of RAM, and a GTX 980 Ti. It is 4,000 PLN (around $1,000) less expensive than a Mac Pro with... a 6-core Xeon, 16 GB of RAM, a 512 GB SSD, and dual D500s.

Apple, shame on you.
I think if you look at Supermicro barebones, you will be much more impressed.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I think if you look at Supermicro barebones, you will be much more impressed.
At this point I'm seriously starting to consider finding a small case, maybe Mac Pro styled, and building a custom water-cooled PC in it.
 