The servers are used in a research environment where typically they'll do something for a couple of months before doing a different project for a couple of months, before doing a different project for a couple of months....

Nothing in the lab runs Apple OSX. It's either Windows (most likely Windows Server 2008 R2 or later) or a major Linux distro (RHEL 6.4, CentOS 6.4, or a recent Ubuntu LTS release). We use the OSes that the Fortune 50 use.

Apple OSX is a client OS, so it's not on the map for most stuff.

Very similar here! In our university, we have tens of thousands of cores on our various research clusters across campus (I think someone told me it was currently over 70,000 cores total). They are all networked together with Condor, or you can just access a particular cluster, depending on the research project. There are some Windows machines, but the vast majority of the research clusters are running UNIX or Linux.

I've done a few really large computations over the years, and those are very fun. It is so exciting to take advantage of the large computational power that is becoming more and more widely available.
 
I've been using 256GB for the past year or so and can't imagine going back to 64 for a primary workstation. 128 is usable but not ideal.

Oh come on! 16 GiB is usually enough to check email and do some light surfing. ;)

ps: My development laptop has 32 GiB, two SSDs and a 1 TB SSHD....
 
Mac memory sizes

The problem is that once upon a time Apple actually had an interest in scientific computing. Now Apple doesn't and its primary market doesn't routinely require those amounts of ram. I'm sure if Apple was focusing on scientific computing requirements that we would have seen a very different nMacpro.
 
The problem is that once upon a time Apple actually had an interest in scientific computing. Now Apple doesn't and its primary market doesn't routinely require those amounts of ram. I'm sure if Apple was focusing on scientific computing requirements that we would have seen a very different nMacpro.

You're correct, but too narrow in your statement.

Yahoo! for "big data", "Hadoop", "MongoDB", "data mining" and similar terms. Mainstream business and finance apps are processing huge amounts of data, and systems with huge amounts of RAM and real GPGPUs are an essential part of that. It's not just science that Apple is ignoring - it's the real world.

The new Mini Pro is targeted at people running FCP-X, and those who need to feel "cool". It really is a huge upgrade to the MiniMac, but compared to what power users can get on other platforms it's almost a joke.

What was Apple thinking when they designed a "professional" system with 4 DIMM slots on a processor that supports 12, when competitors are pairing two of those processors for a total of 24 DIMM slots? That's not innovating, that's "styling".
 
The new Mini Pro is targeted at people running FCP-X, and those who need to feel "cool".

And pretty much anyone that works in desktop publishing, marketing, 3D animation, music recording and production, graphic design, photo editing, video editing, etc etc

Basically Apple's primary pro userbase.

It's not just science that Apple is ignoring - it's the real world.

Oh and I have listed imaginary businesses?

Mainstream business and finance apps are processing huge amounts of data, and systems with huge amounts of RAM and real GPGPUs are an essential part of that.

Sorry, but when were business and finance ever an interest of Apple? Never.

Changing from single GPU standard to dual GPU standard does not cause a demographic shift. Changing from 8 RAM slots to 4 or from dual CPU to single CPU does not either. They are targeting more or less the same people they always did with the Mac Pro.

----------

I've been using 256GB for the past year or so and can't imagine going back to 64 for a primary workstation. 128 is usable but not ideal.

Workstation doing what kind of work? 32 is more than enough for most stuff. Even 16 is acceptable.
 
I didn't realize that desktop publishing required dual GPUs.

Neither does music production, unless you want to use a lot of screens. But I agree most of these folks won't be using the second GPU, so they will be paying an extra $200 or so for the second D300. Not a terrible deal though.
 
Workstation doing what kind of work? 32 is more than enough for most stuff. Even 16 is acceptable.

VFX work. Maya, Houdini, Realflow, Nuke, After Effects etc. Also dealing with 4k, 5k, and 6k footage. Pretty standard stuff.
 
Neither does music production, unless you want to use a lot of screens. But I agree most of these folks won't be using the second GPU, so they will be paying an extra $200 or so for the second D300. Not a terrible deal though.

Funny, I don't have an infinite budget, so throwing away $200 per seat isn't a good deal.

Especially when I need a second hard drive. Make a SKU that has one GPU and 3 more PCIe SSD slots.
 
VFX work. Maya, Houdini, Realflow, Nuke, After Effects etc. Also dealing with 4k, 5k, and 6k footage. Pretty standard stuff.

Yes, but again, not an area Apple is interested in. 3D animation on the Mac has only ever been done for the most basic stuff. The software became available quite late in the game, and many apps are still missing. Complex 3D and VFX work is done on Windows PCs, not Macs.

----------

Funny, I don't have an infinite budget, so throwing away $200 per seat isn't a good deal.

So you don't buy it. Nothing wrong with that. No computer is tailored to a specific person. We all pay for things we are never going to use, and we always did with the old Pro as well. I had only two PCIe slots occupied, and those were the GPU and an eSATA card. I never bought 4 internal HDs and I never used more than 4 RAM slots. But I still paid for the whole thing. I wonder how much cheaper the Mac Pro tailored for "precious me" would be.
 
So you don't buy it. Nothing wrong with that. No computer is tailored to a specific person. We all pay for things we are never going to use, and we always did with the old Pro as well. I had only two PCIe slots occupied, and those were the GPU and an eSATA card. I never bought 4 internal HDs and I never used more than 4 RAM slots. But I still paid for the whole thing. I wonder how much cheaper the Mac Pro tailored for "precious me" would be.

Don't you see the difference between an empty slot that you don't fill, and a populated slot that you can't use?

Those empty disk slots and empty PCIe slots don't cost much. A slot with a purported "workstation class GPU" - well, it does cost even if you can't use it.
 
Don't you see the difference between an empty slot that you don't fill, and a populated slot that you can't use?

Those empty disk slots and empty PCIe slots don't cost much. A slot with a purported "workstation class GPU" - well, it does cost even if you can't use it.

I'm not sure which costs more, to be honest. On the one hand you have a bigger computer due to all the slots and internal expansion: a computer that weighs 40 pounds and is 6 times larger than one that weighs 10 pounds. The shipping and stocking costs alone might be more than $200, without even counting the extra bays and slots. And I don't even know if a D300 costs as much as $200; I just based that on the current 7870 costing around $200 retail.

In any case we would be comparing dollar to dollar, which was not my point.

I was and still am against dual GPU standard as well, I just don't see it as a dealbreaker since at least they offer a cheap option.
 
That's exactly the problem - Intel explicitly says that it supports 1 Gb, 2 Gb and 4 Gb parts.

The current 16GB DIMMs use 2Gb parts.

" .. Registered • ECC • DDR3-1866 • 2048Meg x 72 • .... "
http://www.crucial.com/store/mpartspecs.aspx?mtbpoid=BA2FD285A5CA7304

Doubling the chip density while keeping the same 72-chip module layout:

72 × 4096 Mb ≈ 295 Gb per module; divide by 8 → ~36 GB raw, of which 32 GB is usable data (the ninth byte lane is ECC).

4 × 32 GB = 128 GB

And the problem is what? Mostly that the 4 Gb parts are currently (and for the short-term future) too expensive for most buyers. That will change as those new fab lines mature.
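The arithmetic above can be double-checked with a quick sketch. The 72-chip count and the 8-of-9 data-to-ECC byte-lane ratio are assumptions taken from the Crucial spec quoted earlier, not a general rule for all modules:

```python
# Back-of-envelope capacity check for a registered ECC DIMM.
# Assumptions (from the "2048Meg x 72" Crucial spec above):
#   - 72 DRAM chips per module
#   - 8 of the 9 byte lanes carry data; the 9th is ECC
def module_capacity_gb(chip_density_gbit, chips=72, ecc_overhead=9 / 8):
    total_gbit = chip_density_gbit * chips   # raw bits, incl. ECC lane
    return total_gbit / 8 / ecc_overhead     # usable gigabytes of data

print(module_capacity_gb(2))      # 2 Gb chips -> 16.0 GB module
print(module_capacity_gb(4))      # 4 Gb chips -> 32.0 GB module
print(4 * module_capacity_gb(4))  # four slots -> 128.0 GB total
```

So doubling the chip density from 2 Gb to 4 Gb is exactly what takes the four-slot machine from 64 GB to 128 GB.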
 
You're correct, but too narrow in your statement.

Yahoo! for "big data", "Hadoop", "MongoDB", "data mining" and similar terms. Mainstream business and finance apps are processing huge amounts of data, and systems with huge amounts of RAM and real GPGPUs are an essential part of that. It's not just science that Apple is ignoring - it's the real world.

The new Mini Pro is targeted at people running FCP-X, and those who need to feel "cool". It really is a huge upgrade to the MiniMac, but compared to what power users can get on other platforms it's almost a joke.

What was Apple thinking when they designed a "professional" system with 4 DIMM slots on a processor that supports 12, when competitors are pairing two of those processors for a total of 24 DIMM slots? That's not innovating, that's "styling".

Yes, the Mac Pro main design purpose was to run as a MongoDB server....


:confused:

I have worked for various banks and financial institutions. Data mining large data sets on personal workstations is not common.
 
I have followed this thread with interest, but no one has mentioned this link regarding Intel platform DDR3 memory validation. The interesting part to me is that the 32GB RDIMM was tested for both Westmere-EP and Sandy Bridge-EP, but dropped from Ivy Bridge-EP validation. It seems to me both Intel and the memory manufacturers have decided to move on to LRDIMMs for 32GB and larger capacities.
 
I....The interesting part to me is that the 32GB RDIMM was tested for both Westmere-EP and Sandy Bridge-EP, but dropped from Ivy Bridge-EP validation. It seems to me both Intel and memory manufacturers have decided to move on to LRDIMMs for 32GB and larger capacities.

It is a cheaper validation to do. LRDIMM configs can use less dense chips, since they 'hide' the ranks behind an abstraction. The cheaper approach is going to make it to market sooner.

However, those lists aren't intended to be exhaustive. From p. 2 of the Xeon E5 v1 document:

"Listed below is a small sample DDR3 RDIMMs tested on Xeon E5 2600..... "


LRDIMMs are probably going to be more common short term, but as DDR3 has to compete with DDR4 shipping in quantity, the 4 Gb chips are going to get more affordable. Right now, there is little good reason for memory vendors to drop the price, especially at higher clock rates.
 
Yes, the Mac Pro main design purpose was to run as a MongoDB server....

:confused:

I have worked for various banks and financial institutions. Data mining large data sets on personal workstations is not common.

Developers testing the mining code do like to have it all on their desk....

Of course nobody will connect petabytes of data directly to a new Mini Pro.

The 64 GiB RAM limit, though, will cause some people to bypass the Tube and get a much more capable rectangular box. And that's probably a good thing for them, since it is certain that none of the back end systems will be running Apple OSX.
 
Developers testing the mining code do like to have it all on their desk....

Of course nobody will connect petabytes of data directly to a new Mini Pro.

The 64 GiB RAM limit, though, will cause some people to bypass the Tube and get a much more capable rectangular box. And that's probably a good thing for them, since it is certain that none of the back end systems will be running Apple OSX.
In that case I would expect the local development environment to be a cut-down data set. Normally we have shared test environments when the local workstation cannot cut it. Unfortunately there are serious costs to consider here and developers don't always get what they wish for.
 
Now available to purchase at OWC.

The 32GB modules are 1333MHz. I wonder how this will affect performance.

OWC 128GB
Wow, that's a huge price difference. $849 vs. $2129...

As for performance, it probably won't matter much during tasks that can actually utilize the RAM. Other stuff, who knows.

Now, we just need to wait for mward333 to get his nMP... :)
 
Wow, that's a huge price difference. $849 vs. $2129...

As for performance, it probably won't matter much during tasks that can actually utilize the RAM. Other stuff, who knows.

Now, we just need to wait for mward333 to get his nMP... :)

Even if I bought a 2013 new Mac Pro, I wouldn't put 1333 MHz memory into it. I am only interested in how 1866 MHz memory behaves in it.

For our grant, our project is starting soon, and I finally decided to break down and buy a Dell rack-mounted server. I can get 2 of the 12-core Intel chips (instead of just 1, as in the new Mac Pro), and I have 12 slots of memory (instead of just 4, in the new Mac Pro). Moreover, the pricing is much less expensive. So, I can't quite believe it, but I finally opted to NOT get a new Mac Pro for the grant. It felt weird for a few days, but I finally feel at peace with this decision. I can get twice the number of computing cores, and TONS more memory, for the very same price. Of course I'll run Linux on it, since I really only need UNIX types of tools for the grant itself.

Nonetheless, I'm still interested in how this thread evolves, and I look forward to some manufacturers eventually (hopefully) putting out some RAM modules that will work in the new Mac Pro, at the 1866 MHz speed. Perhaps I'll make such a purchase later in the year. For this grant, however, I couldn't afford to wait any longer!
 
Hello everyone! This is Transcend Information's official MacRumors account. We just wanted to announce the availability of our new JetMemory series for Mac computers. We've made a quick video detailing our new DDR3 1600 upgrade kits for the nMP as well. Check it out at http://www.transcend-info.com/Apple/JetMemory
 