OK, but in light of recent successes like the iPhone, MacBook Pro, etc., you should not forget that sometimes Apple stuck its neck out, only to get a stiff neck because of a harsh wind blowing. Think of the
  • G4 Cube
  • Newton
  • Pippin
  • Puck Mouse
  • Xserve/Xserve Raid
All were products where Apple tried something different and the market just said: Meh! Don't like it, because of [put your favorite reason here].

You are aware that the Xserve line ran from 2002 until 2011, and that Apple had been selling servers ever since 1993? You must be one of those who didn't like it, and that's why it's on your list. The Xserve had a nine-year run, and that would not have happened unless there was a market for it. Apple discontinued it for its own internal reasons, which I doubt you are privy to.

Additionally, I would not say a company is sticking its neck out by making a mouse. It's just a mouse.
 
I'm pretty much resigned to buying one as all my Macs are at least 2 years old and some are far older.

Indeed. I took the other route - I bought a new 2012 hex core to replace my 2008 3,1 MP. The theory is that if the nMP is a dog or too darn expensive, I'll live with the hex core. If it is wonderful and not too expensive (realizing that $$$ are relative), I'm sure to find lots of willing buyers for the old form factor with 30 months of AppleCare remaining.

Now all we have to do is wait ...
 
Think of the
  • G4 Cube
  • Newton
  • Pippin
  • Puck Mouse
  • Xserve/Xserve Raid

You missed the Apple III and the Lisa. As I said, they take risks - I didn't say they never failed.

You can't count the XServe there - it was successful for years before the switch to Intel, better support for Windows networks on Macs and increasing respectability of Linux made it irrelevant.

Who knows how the G4 Cube would have done if it hadn't had cracking and overheating problems? The Newton was pretty influential, and possibly axed before its time because of problems elsewhere in Apple. The "hockey puck" mouse... no, you're right, that was just unmitigated form-over-function stupidity.

Apple prices their Pro applications extremely competitively. FCPX, LogicX, Aperture … they are all very cheap and thus target the prosumer market as well as the pro market. Clearly they want to put their software into as many hands as possible.

Apple make their money selling hardware, not software. Offering "pro" software at low prices encourages people to buy Macs.

This is a relatively recent development: FCP, Aperture, OS X Server used to be far more expensive. The "pro" market is shrinking - Apple is betting big on the "prosumer" market being more profitable.

But then comes the new Mac Pro, and the individual prices of its components point to a higher system price level than before.

The published prices for Intel Xeon processors and AMD FirePro graphics tell you absolutely zip, nothing, nada about the possible price of the new Mac Pro. Apple are a huge buyer of components. This is Intel's flagship for Thunderbolt 2 and AMD's standard-bearer for OpenCL vs. CUDA, and both Intel and AMD would quite like Apple to keep using their lesser chips in their mainstream computers, too. Apple will have got a very, very good price for this stuff. The price of the MacPro is more about how much markup Apple wants to make.

Target the system at the $1,500-2,000 price range and it would perhaps be another top seller.

...but how many of those sales would represent the lost sale of an iMac, Mac Mini, Mac Book Pro or "proper" Mac Pro?

Certainly, the chances of an "xMac" appearing at, or around the same time as, the new Mac Pro are zero. If you're in the market for a headless Mac with a decent GPU, Apple want you to buy a Mac Pro.
 
Given the specs, it's going to be anything but cheap. I really like what I see with the Mac Pro, but I'm 99% sure it's way out of my price range. It's not like I need the power or all that storage that's onboard.

What, all of that 512GB SSD boot drive? Maybe a TB? Nothing is onboard. Everything costs extra.

----------

You can't count the XServe there - it was successful for years before the switch to Intel, better support for Windows networks on Macs and increasing respectability of Linux made it irrelevant.
It was successful after the switch, too. The same people who bought G4 and G5 bought Xeon. It was the server OS that took a nosedive, and as such no one was interested in dual quad-core file servers with crappy tape backup software options. The only reason to have it was the massive gains of working with AFP. Everything else was given up on or not allowed to flourish. Linux was always respected (at least Red Hat with a support contract, going back 12 years), but Unix and Windows still litter most enterprises.
 
I'm confused about all of the heat here about the new MP.

It isn't even out yet.
I love my current Large Mac Pro, but I'm open to new ideas.
Not so long ago, I thought Apple was dumping the whole Mac Pro concept.

I'm happy that they are continuing the line.
The new MiniMacPro could be great.
Not the vision I had, but I'm open to seeing what it can do.

I'm waiting to see the actual, purchasable product.
I'm waiting for the specs on the ones we can buy.
I'm waiting for the prices.
I'm waiting to see how it can be adapted to various situations (storage, PCIe, GPU, CPU, etc.).
I'm waiting for actual users to report about their experiences.

Might be a great new thing.... might not be....

Time will tell.

I wouldn't say that we are fighting so much as we are evaluating. From my perspective, the potential customer spectrum for the iCan is smaller than the current Mac Pro's.

For me, this is what may have me joining the professional exodus back to Windows.

It would cost me at least $1200 to replace the missing functionality (2TB HD enclosures and additional USB hubs), the top-of-the-line iCan has 25% fewer cores than what I can buy today (Dell & HP both have 16-core boxes), and I would be paying for a 2nd GPU that won't actually be used by the software in my production workflow.

These things matter to me - I am not an :apple: fanboi, I use what works best for me, and this certainly isn't it.

I believe the iCan will end up just like the Cube: it will win a lot of design awards, but it will be too expensive, with limited functionality, and the line will die off about two years after its release.
 
I believe the iCan will end up just like the Cube: it will win a lot of design awards, but it will be too expensive, with limited functionality, and the line will die off about two years after its release.

thing with the cube is that it was an underpowered version of the flagship.. this new macpro IS the flagship and has tons of power..

if you want to compare to the cube (which you should be comparing to anyway since it was, in essence, a prototype of the new mac), it might be better to do

newton -> ipad
cube -> new mac
 
thing with the cube is that it was an underpowered version of the flagship.. this new macpro IS the flagship and has tons of power..

if you want to compare to the cube (which you should be comparing to anyway since it was, in essence, a prototype of the new mac), it might be better to do

newton -> ipad
cube -> new mac

As someone who was an active Mac user when the Cube was out...

The Cube did not die because it was underpowered, lacked expandability, or lacked a market...

The Cube died because it was too expensive. For the same price, you could buy a more powerful Power Mac G4.
 
thing with the cube is that it was an underpowered version of the flagship.. this new macpro IS the flagship and has tons of power..

if you want to compare to the cube (which you should be comparing to anyway since it was, in essence, a prototype of the new mac), it might be better to do

newton -> ipad
cube -> new mac

The top-of-the-line iCan has 25% fewer cores than a Dell or HP Xeon workstation. GPUs are irrelevant to my workflow. That is what I was talking about when I pointed out the "potential customer spectrum". The iCan will be great if your software will use those GPUs; if it doesn't (and that is my case), then a good chunk of the system is useless.
 
The top-of-the-line iCan has 25% fewer cores than a Dell or HP Xeon workstation. GPUs are irrelevant to my workflow. That is what I was talking about when I pointed out the "potential customer spectrum". The iCan will be great if your software will use those GPUs; if it doesn't (and that is my case), then a good chunk of the system is useless.

what software?

(i don't know jack about dell or hp workstations so i can't really comment on that.. other than it sounds like you're saying mac has, say, 3.4ghz processors and those computers have 4.5ghz.. is that what you're saying? )
 
Here's another reason why:
 

[Attached image: Mac-Pro_2013_Mac-Pro_2013.jpg, 322.6 KB]
what software?

(i don't know jack about dell or hp workstations so i can't really comment on that.. other than it sounds like you're saying mac has, say, 3.4ghz processors and those computers have 4.5ghz.. is that what you're saying? )

No, he's saying that you can presently buy a two-CPU workstation from Dell or HP, each CPU with 8 cores. That means 16 cores in total: a third more than the nMP's 12 (or, put the other way, the nMP has 25% fewer).

As previously mentioned, plenty of software is CPU-bound. While GPUs are great at some tasks, they aren't superior to CPUs at others.
 
No, he's saying that you can presently buy a two-CPU workstation from Dell or HP, each CPU with 8 cores. That means 16 cores in total: a third more than the nMP's 12 (or, put the other way, the nMP has 25% fewer).


As previously mentioned, plenty of software is CPU-bound. While GPUs are great at some tasks, they aren't superior to CPUs at others.



meh.. ;)



[Attached image: booleanCPU.jpg]
 

In other words, CPUs and GPUs have significantly different architectures that make them better suited to different tasks. A GPU can handle large amounts of data in many streams, performing relatively simple operations on them, but is ill-suited to heavy or complex processing on a single or few streams of data. A CPU is much faster on a per-core basis (in terms of instructions per second) and can perform complex operations on a single or few streams of data more easily, but cannot efficiently handle many streams simultaneously.

As a result, GPUs are not suited to handle tasks that do not significantly benefit from or cannot be parallelized, including many common consumer applications such as word processors. Furthermore, GPUs use a fundamentally different architecture; one would have to program an application specifically for a GPU for it to work, and significantly different techniques are required to program GPUs. These different techniques include new programming languages, modifications to existing languages, and new programming paradigms that are better suited to expressing a computation as a parallel operation to be performed by many stream processors. For more information on the techniques needed to program GPUs, see the Wikipedia articles on stream processing and parallel computing.

http://superuser.com/questions/308771/why-are-we-still-using-cpus-instead-of-gpus
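The distinction in the quoted explanation can be boiled down to a tiny sketch in plain Python (no real GPU code involved; the function names are made up for illustration). The first function is independent, element-wise work of the kind a GPU spreads across thousands of stream processors; the second is a serial recurrence where step N needs the result of step N-1, so only a fast single core helps.

```python
# Toy illustration of data-parallel vs. serially-dependent work.

def scale_all(xs, k):
    # Every element is independent of every other element, so a GPU
    # could compute all of them at once across its stream processors.
    return [x * k for x in xs]

def running_total(xs):
    # Each step depends on the previous result, so no amount of
    # parallel hardware helps; per-core speed is what matters.
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

print(scale_all([1, 2, 3, 4], 10))   # [10, 20, 30, 40]
print(running_total([1, 2, 3, 4]))   # [1, 3, 6, 10]
```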
 
In other words, CPUs and GPUs have significantly different architectures that make them better suited to different tasks. A GPU can handle large amounts of data in many streams, performing relatively simple operations on them, but is ill-suited to heavy or complex processing on a single or few streams of data. A CPU is much faster on a per-core basis (in terms of instructions per second) and can perform complex operations on a single or few streams of data more easily, but cannot efficiently handle many streams simultaneously.

As a result, GPUs are not suited to handle tasks that do not significantly benefit from or cannot be parallelized, including many common consumer applications such as word processors. Furthermore, GPUs use a fundamentally different architecture; one would have to program an application specifically for a GPU for it to work, and significantly different techniques are required to program GPUs. These different techniques include new programming languages, modifications to existing languages, and new programming paradigms that are better suited to expressing a computation as a parallel operation to be performed by many stream processors. For more information on the techniques needed to program GPUs, see the Wikipedia articles on stream processing and parallel computing.

http://superuser.com/questions/308771/why-are-we-still-using-cpus-instead-of-gpus

well, you're more or less saying what i was stabbing at.. that is-> applications/processes which can be multithreaded will also be the ones that can make use of the gpus.. whereas apps that need to do linear calculations (such as the activity monitor screenshot of a rhino boolean) generally aren't going to be able to utilize multiple cores or gpu..

but from what i see so far (at least with rendering apps), it's that they can benefit from multiple cpus.. BuT, they benefit even more when programmed to use a gpu..

which is why i made the joke (yes, i was playing dumb) about the 3.4 vs 4.5 clocks.. because, to me at least, that's where i'd actually notice/feel/etc more power.. but if i had 16 cores instead of 12 -- so what.. all that means (again, for me in many/most situations) is that i'll have 15 cores sitting idle instead of 11..
(ie- no 25% speed increase.. in fact, more cores will probably slow down my work because generally, the more cores on a cpu, the slower the individual cores are)
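The trade-off described in this post (extra cores vs. faster cores) is essentially Amdahl's law: speedup = 1 / ((1 - p) + p/n), where p is the fraction of the work that can run in parallel and n is the core count. A back-of-the-envelope sketch; the 10% parallel fraction below is an assumed, illustrative number, not a measurement of any real app:

```python
# Amdahl's law: the serial part of a workload never gets faster,
# no matter how many cores you add.

def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A workload that is only 10% parallelizable (hypothetical figure):
print(round(speedup(0.10, 12), 3))  # 1.101
print(round(speedup(0.10, 16), 3))  # 1.103 -- four extra cores buy almost nothing

# A 4.5 GHz core vs. a 3.4 GHz core speeds up the serial 90% too:
print(round(4.5 / 3.4, 2))          # 1.32
```

Which is the poster's point: for mostly-serial work, clock speed is felt directly, while 16 cores vs. 12 is nearly invisible.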
 
well, you're more or less saying what i was stabbing at.. that is-> applications/processes which can be multithreaded will also be the ones that can make use of the gpus.. whereas apps that need to do linear calculations (such as the activity monitor screenshot of a rhino boolean) generally aren't going to be able to utilize multiple cores or gpu..

but from what i see so far (at least with rendering apps), it's that they can benefit from multiple cpus.. BuT, they benefit even more when programmed to use a gpu..

which is why i made the joke (yes, i was playing dumb) about the 3.4 vs 4.5 clocks.. because, to me at least, that's where i'd actually notice/feel/etc more power.. but if i had 16 cores instead of 12 -- so what.. all that means (again, for me in many/most situations) is that i'll have 15 cores sitting idle instead of 11..
(ie- no 25% speed increase.. in fact, more cores will probably slow down my work because generally, the more cores on a cpu, the slower the individual cores are)

For your workflow, GPU is better. As long as you understand that for others, the loss of the second CPU makes the nMP less desirable due to their workflow and applications, then everything is peachy ;)
 
Here's another reason why:



And that is another real reason.

I have to factor in the cost of all the additional external devices to replace what is currently inside my Mac Pro.

I have six USB devices (iPhone, iPad, iPod, keyboard, mouse, and scanner; not counting thumb drives, though I also need a port for one on the FRONT of the computer) and six drives (4×2TB 3.5-inch HDs and 2×240GB SSDs), so that is another $1,200 that must be factored in. That is why I was saying in another thread that to be cost effective, the max price had to be around $3,000, and even at this price point I am down 4 cores.
 

well, you're more or less saying what i was stabbing at.. that is-> applications/processes which can be multithreaded will also be the ones that can make use of the gpus..

In some cases, yes, others, no. I can give you a job that will use all the cores you throw at it, but needs 1TB of RAM....got that on a GPU?
 
The top-of-the-line iCan has 25% fewer cores than a Dell or HP Xeon workstation. GPUs are irrelevant to my workflow. That is what I was talking about when I pointed out the "potential customer spectrum". The iCan will be great if your software will use those GPUs; if it doesn't (and that is my case), then a good chunk of the system is useless.
Exactly.
I don't remember anything in the PR for Logic X about OpenCL or GPGPU. And my admittedly aged DSP cards are being given a premature burial. Quite a lot like the corpse collector scene in Monty Python's Holy Grail.
"I'm not dead yet!".
 
Here's another reason why:

The back of my Mac Pro running two Cinema Displays, speakers and a Matrox MXO2 pretty much resembles the mess this person has artfully created for the new Mac Pro.

To suggest that the current Mac Pro is an elegant, neat and tidy solution isn't the reality for me.
 
...but that's partly because your "big box" Mac Pro represents a significant investment in the infrastructure to hold all that internal expansion. All those backplanes, connectors, fans, the power supply and all that structural aluminium add to the cost of the system, especially since nobody ever accused the Mac Pro of under-engineering.

To me, how big of an "investment" something is depends on the price (time and money). I don't think anyone's arguing that the nMP will be cheaper than the standard Quad2.66 1,1 model.

Like I said, and you imply, if you want/need replaceable video cards, the nMP becomes disposable. If it's really cheap then that's great, but anyone who sells a computer to replace it with another will probably tell you they'd rather have upgraded their old one to suit their needs.

Edit: Also, there's this:

[Attached image: c06enPS.jpg]
 
As a result, GPUs are not suited to handle tasks that do not significantly benefit from or cannot be parallelized, including many common consumer applications such as word processors.

While this is true, 12 cores vs. 16 cores isn't going to make any difference for a word processor. Applications that can't be parallelized never go beyond one core anyway, so what difference do 12 or 16 cores make?

Exactly.
I don't remember anything in the PR for Logic X about OpenCL or GPGPU. And my admittedly aged DSP cards are being given a premature burial. Quite a lot like the corpse collector scene in Monty Python's Holy Grail.
"I'm not dead yet!".

Any modern processor architecture (even back to the li'l ol' iPhone 3GS) has DSP acceleration built in. That was even a big selling point of the G4 vs. the G3. As those architectures get better, a DSP card becomes less necessary because it's already included in the CPU.
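Picking up the point above about applications that can't be parallelized: a serial dependency chain is the classic case, since step N needs the output of step N-1 and so only ever occupies one core, whether the machine has 12 or 16. A hypothetical illustration (the function and its edit format are invented for this sketch, loosely modeled on applying keystrokes to a document):

```python
# Toy sketch of a serially-dependent workload: each edit operates on
# the text produced by the previous edit, so the edits cannot be
# handed out to different cores.

def apply_edits(text, edits):
    for op, arg in edits:
        if op == "append":
            text = text + arg          # needs the current text
        elif op == "delete_last":
            text = text[:-arg]         # needs the current text
    return text

print(apply_edits("", [("append", "hello"),
                       ("append", " world"),
                       ("delete_last", 6)]))  # hello
```

No matter how many cores are available, only one is ever busy here; the rest sit idle, exactly as described in the posts above.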
 