Out of curiosity, what do people do with 24 processor cores? I remember a documentary about South Park where they said the entire show is animated and rendered on regular iMacs. 4K video editing can put a lot of stress on a computer, but it looks like the new Pro was designed specifically for that.

South Park isn't a difficult render. We max out two Mac Pros (24 cores / 48 threads) in our render farm all the time rendering HD TV graphics and animations. 3D renders and animations can get pretty grueling after all the passes, e.g. shadow, depth of field, any and all buffers, etc., and adding global illumination and ambient occlusion with subsurface scattering, plus particles and/or physics.
 
Looks like junk, but pretty junk in an R2-D2 kind of way. This may be the Mac I want: a good CPU, a good GPU, and I can hook it up to my Promise Pegasus.


If it can be upgraded, I will buy the base model as a refurb and put in a better CPU, like I did in the thread below:


https://forums.macrumors.com/threads/1122551/
 

Let's see, you can build a machine today with 2 CPUs, 8 RAM slots and your choice of GPUs with CUDA or you could wait half a year for a machine with 1 CPU, 4 RAM slots and no CUDA. Decisions, decisions.
 
I'd rather have 2 CPUs + 1 GPU. In my field, GPU computing isn't standard yet; we still rely heavily on the CPU. The old Mac Pro offers this possibility, but the new one doesn't.
 
These are most likely the procs:
http://wccftech.com/intels-leaked-r...epen-processors-12-cores-30mb-cache-130w-tdp/
They will be much faster than Westmere's 2×6 cores at 3.06 GHz. Believe it. You don't need dual processors, and one 12-core/24-thread chip will be more efficient anyway without the QPI link. That said, external everything and no price announced has me worried; the CPU and GPU, not so much. Would be nice if you could CTO a standard Radeon or GeForce in it, though. Like, why the hell does 10.8.4 have Titan drivers? Wait and see, I guess.

----------

Heh. I've been rockin' FW800 since 2003. Can't stand USB. I'll miss it when I eventually upgrade.

Seriously, who has been on OS X professionally for even a few years and not used FW800?
 
So basically no new Mac Pro then. Sounds more like a Mac Mini turbo. I need many computing cores and was almost expecting to see the possibility of adding a Xeon Phi card to the new Pro. I'd rather have dual CPUs than dual GPUs if I have to choose. With the previous form factor we could have both. I will most likely replace my old Pro with an iMac and a rack server running Linux.

If you're working on custom applications that require lots of parallel computation, I think the general trend there is towards GPGPU work, which is where Apple put most of their emphasis with this new design.

If 12 cores is woefully inadequate, then I don't know how much of a difference 24 cores would make for you. If you need something with way more than 12 x86 cores then it sounds like a workload that may be better suited for a cluster setup.
 
Yeah, hmmm. Bit disappointed in the single-CPU config - for some people two CPUs are more important than two GPUs. Would have preferred that option, as someone else pointed out.

That said... a current Mac Pro with twin X5690s - that's 24 cores. The new E5s are up to 40% faster and go up to 16 cores. What ends up in the new Mac Pro could actually be fairly close to the horsepower of a current Mac Pro... and a much smaller machine.

Will make a decision based on cost when it's out :)

Nox

Two X5690s is 12 cores, 24 threads. SB/IB are not 40% faster across the board than Westmere. Maybe in some benchmarks that can use AVX... but nowhere near that in real-world performance. Additionally, a dual Westmere system can be clocked quite high (X5690s), while 12 cores in one 130-watt chip will probably end up clocked in the upper 2 GHz range. There will be a lot of turbo available for lightly threaded applications, but not when all cores are maxed out.
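
Just to put rough numbers on that (the 12-core part here is my assumption based on the leaked Ivy Bridge-EP specs, roughly a 2.7 GHz base at 130 W; this back-of-envelope ignores IPC, AVX, and turbo entirely):

```latex
% Aggregate base clocks only; the 12-core @ 2.7 GHz figure is an assumption
% from the leaked specs, not anything Apple has announced.
\begin{aligned}
\text{dual X5690 (two 6-core chips @ 3.46 GHz):} &\quad 2 \times 6 \times 3.46 \approx 41.5\ \text{GHz aggregate} \\
\text{one 12-core chip @ 2.7 GHz:}               &\quad 12 \times 2.7 = 32.4\ \text{GHz aggregate} \\
\text{per-clock uplift needed just to match:}    &\quad 41.5 / 32.4 \approx 1.28\ (\approx 28\%)
\end{aligned}
```

So even before arguing about the "40% faster" figure, a single 12-core chip would start roughly a quarter behind a dual-X5690 box on raw clock throughput with all cores loaded.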
 
If 12 cores is woefully inadequate, then I don't know how much of a difference 24 cores would make for you. If you need something with way more than 12 x86 cores then it sounds like a workload that may be better suited for a cluster setup.

I don't need way more than 12 cores, but I fail to see why Apple is so obsessed with physical size that they compromise the power of the Pro. Physical dimensions are important for a laptop, not for a workstation. Having 12 cores instead of 24 would mean, for example, two weeks of simulations instead of one. I could of course run the simulations on the GPU, but typically each computing core in a GPU is a bit too weak for my purposes. If the price of the Pro is reasonable I might buy one even if I also buy a rack server. I don't really need a powerful GPU, though dual 4K monitors would be nice.
 
Two X5690s is 12 cores, 24 threads. SB/IB are not 40% faster across the board than Westmere. Maybe in some benchmarks that can use AVX... but nowhere near that in real-world performance. Additionally, a dual Westmere system can be clocked quite high (X5690s), while 12 cores in one 130-watt chip will probably end up clocked in the upper 2 GHz range. There will be a lot of turbo available for lightly threaded applications, but not when all cores are maxed out.

cores/threads - yeah ok, corrected :p did I post late or something :p

And yes, I did say up to 40% faster - reality is going to be a lot closer, as you mentioned. I'm planning on sticking with my current Mac Pro; I think it's going to be the last in the line of expandable Pros :( Well, at least ones not using external boxes. But ultimately, I'll still decide based on cost. :)

nox
 
I don't need way more than 12 cores, but I fail to see why Apple is so obsessed with physical size that they compromise the power of the Pro.

It is in part not physical size but being cognizant that, over time, the evolutionary path of the computer industry is toward smaller. What is the point of swimming upstream against that when others are spending many billions of dollars per year to push in the smaller direction?

The maximum core count in the old Mac Pro is 12. The new box: 12.
That count isn't backsliding.

Plateauing user workload requirements (e.g., "I don't need more than 12 cores") are another reason. Couple that with "I don't need more than 2 TB local" and the Mac Pro fits quite nicely. (It might need a PCIe SSD on the back of both GPUs, but nothing in the design prohibits that.)

For an increasingly broad spectrum of users, workload requirements are not going up as fast as the technology is getting faster, denser, and more integrated. In the leftover subset, one of the fastest-growing groups is users with a storage explosion problem, where data is actually growing faster than storage densities are improving. For that group, just about any reasonable fixed number of drive sleds isn't going to work long term; the number of drives needed just keeps increasing. Remove both of those growing groups of users and what do you have left? Is it growing?

What the new Mac Pro slightly misses out on is just the technology-integration track. There are 10 unused SATA lanes in the box. The catch is that they are now such a small part of the total system cost that it's possible to sweep them aside. So there is a trade-off being made.

And that may be temporary too. In the future Intel may have chipsets that punt SATA completely in exchange for room (and bandwidth) for Thunderbolt. If so, a smaller system is almost necessary, because Thunderbolt controllers have to be near the physical port. Putting TB in the main chipset means moving the main chipset closer to the device's edges.

However, this again matches what Apple has done by pulling the edge of the device closer to the CPU/chipset.
 
If it can be upgraded, I will buy the base model as a refurb and put in a better CPU.
You know, I'm already thinking refurb myself; I've waited this long.

Wonder if I can get one with a dent in it?

----------

Let's see, you can build a machine today with 2 CPUs, 8 RAM slots and your choice of GPUs with CUDA or you could wait half a year for a machine with 1 CPU, 4 RAM slots and no CUDA. Decisions, decisions.
Yeah, but we'd have to pay a lot less, I'll bet, and that would dilute the Apple experience we've all come to love.

BOXX guy called me this morning, I said let's wait and see.
 
Let's see, you can build a machine today with 2 CPUs, 8 RAM slots and your choice of GPUs with CUDA or you could wait half a year for a machine with 1 CPU, 4 RAM slots and no CUDA. Decisions, decisions.

Again with the CUDA!

We have no idea what Apple has been doing with software developers.

The one bit of information we do know (as a fact) is that the developer of DaVinci Resolve said that version 10 with OpenCL flat out SCREAMS on the new Mac Pro.

I am inclined to take a developer who has a major stake in their product sales, and who has used the new Mac Pro, at their word.

Here you go:
(May 8, 2013)
Improved GPU support in Adobe Premiere Pro CC

Premiere Pro CC offers a significantly improved Mercury Playback Engine, giving more editors than ever before the ability to enjoy the best possible performance. As a recap, the Mercury Playback Engine is three things combined: a 64-bit architecture, massively multi-threaded CPU optimization, and GPU optimization, all of which combine to allow dense, effects-rich, multi-format sequences to play back smoothly. It has always been (and remains) perfectly feasible to run Mercury without the addition of a GPU (‘software rendering mode’) and for many kinds of projects this provides ample horsepower, but adding a GPU makes a noticeable difference, particularly as sequences become more complex.



Premiere Pro CC introduces support for both CUDA and OpenCL GPU architectures on both the Mac and Windows platforms, which results in a dramatically enhanced list of certified GPUs, the full list of which follows this post. (Please note that the list currently displayed on this page is out of date, and will be corrected soon.) Also, if you own a GPU that we haven’t officially tested, but which meets the minimum requirement of having 1GB of VRAM and appropriate drivers installed, you will be able to enable that GPU in Playback Settings. An alert warns you that your configuration isn’t officially certified, but you’ll still be able to turn it on to use it. All this means that more people than ever will be able to enjoy full, GPU-enhanced Mercury Playback Engine performance.



Finally, for customers using configurations containing multiple GPUs, Premiere Pro CC can use all of them during export (but not during playback) so customers who require the fastest possible encode times will be able to leverage all the GPUs they own.
http://blogs.adobe.com/premierepro/2013/05/improved-gpu-support-in-adobe-premiere-pro-cc.html
 
Again with the CUDA!

It's kind of a big deal.

We have no idea what Apple has been doing with software developers.

It actually doesn't matter that much what Apple has been doing with other software developers. There are known libraries and known applications that require CUDA, not OpenCL. They work with CUDA, and not OpenCL. Making them work with OpenCL would require an entire re-write.

The one bit of information we do know (as a fact) is that the developer of DaVinci Resolve said that version 10 with OpenCL flat out SCREAMS on the new Mac Pro.

Fantastic; that works for Resolve. What about all the other apps (some custom-written, some commercially available) that require CUDA? Answer: they either get left behind or rewritten. Failing that, the user of said software just doesn't buy a new Mac Pro.
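
For anyone who hasn't looked at GPGPU code, here's a minimal, hypothetical sketch (not taken from any of the apps mentioned here) of why "an entire re-write" is not an exaggeration. Everything below is CUDA-specific: the __global__ qualifier, the blockIdx/threadIdx indexing, the <<<...>>> launch syntax, and the cudaMalloc/cudaMemcpy memory API. An OpenCL port would replace each of these with different constructs (__kernel, get_global_id(), clCreateBuffer, clEnqueueNDRangeKernel, and so on), typically with the kernel living in a separate source string.

```cuda
// Hypothetical SAXPY example, for illustration only.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// CUDA kernel: the __global__ qualifier and the blockIdx/blockDim/threadIdx
// built-ins have no meaning outside CUDA.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    // Device memory management through the CUDA runtime API.
    float *dx = nullptr, *dy = nullptr;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // CUDA-only triple-angle-bracket launch syntax: grid size, block size.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expect 4.0

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```

Multiply that by every kernel and every device-memory call in a large codebase, add vendor-specific libraries like cuFFT and cuBLAS that have no drop-in OpenCL replacements, and the scale of the port becomes clear.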
 
Seriously, who has been on OS X professionally for even a few years and not used FW800?

Me. eSATA is considerably more ubiquitous for drive interfaces, and not all professional work involves plugging in a camera.

Again with the CUDA!

As someone else said, it's kind of a big deal.

We have no idea what Apple has been doing with software developers.

Not all of us just *use* software. Some of us make it. Some of us make it with CUDA, which is the dominant language for HPC.

The one bit of information we do know (as a fact) is that the developer of DaVinci Resolve said that version 10 with OpenCL flat out SCREAMS on the new Mac Pro.

Will they be rewriting my simulation code for me?

I am inclined to take a developer who has a major stake in their product sales, and who has used the new Mac Pro, at their word.

You've now established several times that you can't seem to imagine use cases beyond your own. I don't pontificate on software for creative professionals for a reason; perhaps you should consider not expounding on why CUDA vs. OpenCL isn't a big deal.

It actually doesn't matter that much what Apple has been doing with other software developers. There are known libraries and known applications that require CUDA, not OpenCL. They work with CUDA, and not OpenCL. Making them work with OpenCL would require an entire re-write.

This.
 
SATA was never meant to be external.

And yet here we are, with external SATA ports on a huge number of motherboards.

That's all I'm saying - there are a limited number of very high-speed transfer demands that need something like FireWire 800, and if you use a more standard drive interface (whether or not it was intended to be external, that cat is well and truly out of the bag) and don't touch a camera, it's entirely possible to be a "professional" and never notice your FW800 port.
 
It's kind of a big deal.

I guess there are certain risks associated with hardware-vendor-specific SDKs.

This mirrors what happened in the 3D graphics industry in the 1990s.

Many years ago, games were written to specific proprietary SDKs like Glide (3dfx Voodoo), S3D (S3 ViRGE), and Rendition's SDK. Eventually, OpenGL and Direct3D took over completely. Good thing too, because none of those original hardware vendors are still around.

Five years from now, I wonder if CUDA will still be a thing?
 
Yeah, but we'd have to pay a lot less, I'll bet, and that would dilute the Apple experience we've all come to love.

BOXX guy called me this morning, I said let's wait and see.

I wouldn't go that far. Apple's workstation prices were competitive.




Well, I suppose that's good for the Premiere users. That doesn't begin to address all the other software out there.
 
It's kind of a big deal.

It is a temporary deal.

It actually doesn't matter that much what Apple has been doing with other software developers. There are known libraries and known applications that require CUDA, not OpenCL. They work with CUDA, and not OpenCL. Making them work with OpenCL would require an entire re-write.

Roll back a couple of years... and substitute Adobe Flash for CUDA and HTML5/JavaScript for OpenCL. What a horrible debacle iOS has been since Apple blocked Flash.

What matters is what customers want. The completely ironic thing with CUDA is folks who hold it up as an enabler of, or an example of, choice when it has exactly the opposite objective and effect. CUDA was always a tarpit designed by Nvidia.

More than a few customers do want choice. CUDA doesn't enable that. Yes, it has greater inertia, but that isn't going to last forever if customer choice is the dominant market force.

Fantastic; that works for Resolve. What about all the other apps (some custom written, some commercially available) that require CUDA? Answer: they either get left behind or re-written. Failing that, the user of said software just doesn't buy a new Mac Pro.

Software vendors looking for multi-platform deployment are going to remove CUDA from their applications as fast as possible. The number of CUDA-only apps is going down (Adobe is dumping it; Blackmagic is dumping it). When Intel has fully deployed its second-generation Iris Pro solutions, it is going to go down even faster.

Struggling vendors are going to stick with it. Narrow-niche vendors with a very small number of customers are going to stick with it.
 
I wouldn't go that far. Apple's workstation prices were competitive.
I think in some instances their prices were quite good; in others, not so much. But I'm not sure how this new machine will stack up, and it will need at least an external storage enclosure, which, if it's well made, won't be small or cheap.

It reminds me of power bricks, which I have always hated.

The current BOXX offerings seemed pretty affordable. Nothing as elaborate as the New Mac Pro, but certainly usable.

It all comes down to bang for the buck. After a year, anything on my desktop is covered with Post-it notes, and the aesthetics I'm interested in are on my screen.
 
And audio interfaces historically do NOT play well when there are "adapters" between them and the computer.

There's no adapter. Thunderbolt is PCIe on a cable. A FireWire interface through Thunderbolt is as good as a FireWire card plugged into the current Mac Pro.
 