If it's what they want, then so be it.

CentOS just feels like a tested, complete system, whereas Ubuntu feels like it stops at 95%; it's just always got these little annoyances.

I've given up on Windows completely. I've spent a long time trying out different Linux distros. So far my favorites have been Mint (w/ Cinnamon) and Debian. I run Ubuntu on my file server, CentOS on my web and mail server.

Unfortunately I haven't found a DE I've been happy with. KDE seems like a dog's breakfast of half-baked ideas, and shows why dozens of people with different ideas about how a GUI should work shouldn't be on the same project. GNOME 3 looks like the Fisher-Price of DEs, with its application menu that's a poor rendition of Launchpad.

I'm not a fan of Arch's pacman or CentOS/RHEL's yum; I prefer apt, which leaves me with a choice from the Debian tree.

Unfortunately Debian is behind on Linux kernels at this point. Perhaps I'll revisit a Debian LTS once it hits a 4.x kernel.

In the meantime, I'll stick with OS X as it's by far more polished.
 
I've given up on Windows completely. I've spent a long time trying out different Linux distros.[...]

You don't have to like Linux, and I'm not going to advocate for it; usually it's the reverse. Use what makes you happy and productive. No one has ever turned down a product because of the tool used to achieve it.
 
If your time frame is weeks, don't expect an updated mac pro in that span. Updated CPUs from Intel and graphics cards from AMD may coincide for a June announcement of a mac pro at WWDC.[...]
You can disregard AMD as a reason for delay. Apple never (ever) uses brand-new, just-announced GPU designs. Even the D300/500/700 were based on GPUs that weren't shiny-new. If AMD releases new-generation Arctic Islands GPUs in June, you won't see them in Macs until a year later at least.

Apple isn't going to use any AMD GPU newer than Fiji this year. And they could already do it, as Fiji was introduced last June, so it qualifies as a GPU that could be used in Macs soon.

Anyway, back to my original post, I'm afraid I'm not going to buy Apple hardware this time. Apple doesn't have the product I need for OpenCL-based research. They had the best machines for research for some decades, but it's no longer the case, and I don't know if they will achieve it again someday.
 
I need a new machine for scientific research with GPGPU and multiple CPUs. The Mac Pro would be a great choice, but only if it were current. Investing in a 2-year-old machine isn't an option for me. I can wait some weeks, perhaps a month, but not more. Are there any rumors of an imminent update?

If not, I'll make the painful choice of going Linux. Very sad, but it's the only option, I'm afraid.

As someone who also does scientific research: I have a nMP, and it turned out to be a waste of time and money. I invested properly in a Linux machine instead, and it has been one of the best decisions I've made. The money you would spend on a nMP could build an incredible Linux machine.

I also do GPU-based research using OpenCL. My suggestion is this: buy a beefy Linux machine for the computation/actual work, then purchase a minimal Mac for development. The only reason I still use Macs for OpenCL development is Xcode; its tools for debugging and testing kernels are quite good. Once the code is well developed, you can deploy it onto the Linux machine.
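One habit that makes the develop-on-Mac, deploy-on-Linux split less painful is a quick numerical cross-check that both boxes produce the same kernel output, since different drivers can legitimately disagree in the low-order bits. A minimal sketch in Python (the tolerance values here are hypothetical; pick ones suited to your kernels):

```python
# Cross-platform sanity check: compare kernel output from the Linux
# deploy box against a reference run from the development Mac.
# rel_tol/abs_tol are illustrative defaults, not values from any spec.

def results_match(reference, candidate, rel_tol=1e-5, abs_tol=1e-8):
    """True if every element of candidate is within a mixed
    relative/absolute tolerance of the reference output."""
    if len(reference) != len(candidate):
        return False
    return all(
        abs(r - c) <= max(abs_tol, rel_tol * max(abs(r), abs(c)))
        for r, c in zip(reference, candidate)
    )
```

Dump the buffers to disk on each machine after a run and feed them through a check like this before trusting the ported build.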
 
You can disregard AMD as a reason for delay. Apple never (ever) uses brand-new, just-announced GPU designs.[...]

There are other potential reasons Apple chose to use AMD's Tahiti and not Hawaii besides release date. Number one is that Tahiti has better compute performance for a given thermal limit. Yes, Hawaii had better absolute performance but it also generated more heat and wasn't as optimized for compute.

Fiji is a non-starter when it comes to professional GPUs. While it has better performance per watt than Tahiti and Hawaii, it is limited to 4 GB of VRAM and is more of a gaming-oriented card.

Polaris is essentially a modern Tahiti. A compute focused GPU at a new smaller node that will have a moderate die size and moderate power consumption. It will support 16 GB to 32 GB of VRAM. This would be the chip that Apple is waiting on.

Finally, just because AMD introduces a new product doesn't mean Apple won't have access to it. Apple basically had exclusive use of Tonga XT for a year in the Retina iMac. Remember, Apple introduced the Mac Pro with Ivy Bridge chips before Intel had announced them. Vendors are very willing to give Apple early access to their hardware so that Apple can get it in their machines.
 
Remember, Apple introduced the Mac Pro with Ivy Bridge chips before Intel had announced them. Vendors are very willing to give Apple early access to their hardware so that Apple can get it in their machines.
That's a nice urban legend.

All of the big vendors had engineering samples of Ivy Bridge-EP in the spring and summer of 2013. (My lab had a dual 6-core system already in use when the MP6,1 prototype was shown at MacWorld SF 2013.) Intel had shown them at its IDF, so there was no secret. Heck, at IDF 2012 Intel showed Haswell chips.

The official announcement date for the E5-x6xx v2 was September 2013 - many months before the MP6,1 started to ship.
 
I remember reading something about Apple getting Xeons a few weeks early back when they made the switch from PowerPC. But that could have mostly been a marketing story for Apple, to help make the transition look more positive. Apple and Intel were in love, at least on stage.
 
I remember reading something about Apple getting Xeons a few weeks early back when they made the switch from PowerPC.[...]
https://forums.macrumors.com/thread...-wwdc-approaches.1589250/page-8#post-17338096

"To Apple fans the spin is that Apple got something exclusive and early. To people familiar with the process - Apple took stuff that nobody else wanted."
 
That's a nice urban legend.[...]

I meant to say that Apple announced the Mac Pro before Intel formally released Ivy Bridge-E. New hardware is rarely a secret, and vendors make efforts to get their chips out there. I also wasn't implying that only Apple got access to these chips.
 
I meant to say that Apple announced the Mac Pro before Intel formally released Ivy Bridge-E.[...]
OK. Note that Apple did not announce that it had Ivy Bridge chips. Check one of the comments at 10:59 on http://www.anandtech.com/show/7056/wwdc-2013-keynote-live-blog .
 
[...]The only reason I still use macs for OpenCL development is Xcode and their tools for debugging kernels and testing them are quite good. Once the code is well developed you can deploy it onto the Linux machine.
Really? Can you measure OpenCL kernel occupancy with Xcode? How? (Please tell, because I never found out how.) One of the reasons I feel I really need Linux is that the AMD OpenCL profiler runs on Linux and Windows, but not on OS X. And developing OpenCL kernels without a low-level profiler is like trying to write a TIFF image loader without libtiff and without the TIFF specification.
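For what it's worth, even without a profiler you can estimate theoretical occupancy by hand from the kernel's register and local-memory usage. Here's a back-of-the-envelope sketch in Python; the hardware limits are assumed GCN-era values (10 wavefronts per SIMD, 256 VGPRs per thread budget, 64 KB LDS per CU), not numbers queried from a real device, so treat it as an illustration of the arithmetic rather than a tool:

```python
# Rough theoretical-occupancy estimate for a GCN-class AMD GPU.
# All hardware constants below are illustrative GCN-era assumptions.

MAX_WAVES_PER_SIMD = 10   # cap on resident wavefronts per SIMD
VGPR_BUDGET = 256         # vector registers per lane, shared by waves
LDS_PER_CU = 65536        # bytes of local memory per compute unit
SIMDS_PER_CU = 4
WAVE_SIZE = 64            # work-items per wavefront

def occupancy(vgprs_per_thread, lds_per_workgroup, threads_per_workgroup):
    """Fraction of wavefront slots a kernel can keep resident."""
    # Register pressure: fewer VGPRs per thread -> more waves fit.
    waves_by_vgpr = min(MAX_WAVES_PER_SIMD, VGPR_BUDGET // vgprs_per_thread)

    # Local-memory pressure: LDS is shared by all groups on a CU.
    waves_per_group = -(-threads_per_workgroup // WAVE_SIZE)  # ceil div
    if lds_per_workgroup:
        groups_by_lds = LDS_PER_CU // lds_per_workgroup
        waves_by_lds = groups_by_lds * waves_per_group / SIMDS_PER_CU
    else:
        waves_by_lds = MAX_WAVES_PER_SIMD

    waves = min(waves_by_vgpr, waves_by_lds, MAX_WAVES_PER_SIMD)
    return waves / MAX_WAVES_PER_SIMD
```

For example, a kernel using 64 VGPRs per thread tops out at 4 of the 10 wave slots regardless of LDS use, which is exactly the kind of number the AMD profiler reports directly.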
 