....
It's true that prior to the 2012 Mac Pro, Apple was not only price competitive at the high end (the fat), but I found them to be less expensive than their greedy competitors, who were and are on fat-filled diets.
....
Nope. Neither you nor I know what the exact computational potential of the low end box will be, but we can use history as a guide (and the base has been watered down). This is the area where Apple has not been price competitive traditionally, .... I acknowledge that Apple sold more units at the low end at higher prices than its competitors.

They sold more at the low end with a box not optimized to sell at the lower end of that spectrum. This one is. It walks away from the dual-processor market, so it is much more a single-CPU-focused offering. That doesn't necessarily mean a loss in computational horsepower, just fewer x86 cores.

I don't think Apple has any keen interest in selling $5-12K boxes to a stagnant group of customers. The other system vendors like the high-priced boxes because they can goose the margins. Typically those increased margins are used to offset "loss leader" boxes they have elsewhere in their lineup. Apple has no loss-leader boxes. That is one reason why their prices were more reasonable as you go toward the top end of the scale.

This new box should move the top end of the scale to a new place. Again, Apple has little to no need to goose standard configuration prices increasingly higher because it is not engaged in a "rob Peter to pay Paul" exercise.

As far as history goes, Apple has historically put a technology-based performance gap between products and their pricing. I don't think Apple is likely to shrink the GPU in the entry Mac Pro so much that it falls below where the higher iMac BTO options are. So a W7000 equivalent is probably as low as they go. But yeah, that is an educated guess.

One of the problems Apple had with the single-CPU-package box is that it needed to be priced higher than the iMac and lower than the dual. If they toss the upper end, they will have more freedom to put value-add into the entry single without wiping out dual sales. Dual sales are off the table.

If all that's left is the need to be higher than the iMac, there is little reason not to stuff more value in (e.g., multiple TFLOPs of computational horsepower) so that the $2,499 is buying a lot more. Paying $2,499 for empty space where you can later put something is pretty expensive empty space. I think it is going to be much easier for Apple to sell an actual thing ("here is the 2nd GPU, this is what it can do") than "empty space".


Isn't the real reason a true upgrade from the 2010 Mac Pro has taken so long that Apple wasn't satisfied with the tiny share of the overall market that it was eating?

Time will tell whether this was a "cut loose the dead weight so we can maneuver better" move to better serve the single-CPU but moderately high core count market, or whether the discrete GPU card market spawns a resurgence in boxes with slots.

Personally, I would not bet against some form of GPGPU being folded into CPU packages in the Xeon E5 lineup in the future. It will become commonplace for there to be at least one GPU embedded in the system. Same thing in the entry/mid-level server market (one CPU package).

The "attack of the killer micros" killed off practically all of the specialized supercomputer CPUs from 2 decades ago. The same forces will kill off the the dual / quad micros over time using same technology forces. For example the Xeon E7 line up is starting to be endangered by the E5 4600 line up. E7 will shrink to smaller number of boxes that are death sprial kind of pricing ( higher price ,, fewer customers, higher price ,,, ... )

Over time computers have generally gotten smaller. Not only Apple's ... all system vendors'. You'd be hard-pressed to name a single successful system vendor who did better long term by "going bigger" over time rather than smaller. In fact, many of the industry inflection points have had "big" vendors dragging their feet on going small because they were catering to the legacy, sunk-cost market that was stuck on the larger form factor.
 
So what, if anything, have you learned from the responses to your question? I too might do the same, just out of an interest in parallel programming.

Not much frankly.

My belief is that the surprise at release with the new Mac Pro is on the downside. This is a box that throws everything possible out (multiple CPUs, PCI cards, 8 RAM slots, internal 3.5" bays) in favor of small size and silence. That SCREAMS consumer item. Meanwhile at WWDC they did nothing but try and convince us what a high-powered workstation it is. They build one thing, and the head of marketing tries to convince us it's something else (this makes complete sense, since the WWDC showing was only for the purpose of keeping the video professionals from bolting).

For example, imagine if at WWDC they had shown off the entry-level machine and just talked about how small and quiet it is. Video professionals would have written it off and walked away. So, clearly they designed a consumer box and used marketing (and high-end GPUs) to convince the video guys to stay.

Now for those of us this computer appears to be targeted at - the Prosumers - I think it will be a brilliant solution. Presently, to get a highly parallel box you need to pay $6,000+, once you pay the Apple tax and the Nvidia tax. Here all you are paying is the Apple tax.

I think the bottom end price is $2k (1-2 TFLOP), and for $3500 you can get a really high end machine (3-4 TFLOP), with $5k-$6k (6-7 TFLOP) putting you in what they showed at WWDC.
 
Not much frankly.

My belief is that the surprise at release with the new Mac Pro is on the downside. This is a box that throws everything possible out (multiple CPUs, PCI cards, 8 RAM slots, internal 3.5" bays) in favor of small size and silence. That SCREAMS consumer item. Meanwhile at WWDC they did nothing but try and convince us what a high-powered workstation it is. They build one thing, and the head of marketing tries to convince us it's something else (this makes complete sense, since the WWDC showing was only for the purpose of keeping the video professionals from bolting).

For example, imagine if at WWDC they had shown off the entry-level machine and just talked about how small and quiet it is. Video professionals would have written it off and walked away. So, clearly they designed a consumer box and used marketing (and high-end GPUs) to convince the video guys to stay.

Now for those of us this computer appears to be targeted at - the Prosumers - I think it will be a brilliant solution. Presently, to get a highly parallel box you need to pay $6,000+, once you pay the Apple tax and the Nvidia tax. Here all you are paying is the Apple tax.

I think the bottom end price is $2k (1-2 TFLOP), and for $3500 you can get a really high end machine (3-4 TFLOP), with $5k-$6k (6-7 TFLOP) putting you in what they showed at WWDC.

I think you are right. Maybe what they showed is a high-spec model, but with lower-spec options available (for a total package < $2K) for you and me.

Please post what you end up doing as that might be my 2014 project :cool:
 
Let's hope Apple cares. Do please remember that the geek delivering the information at WWDC said it will have two 6GB FirePro cards as a default configuration.

Of course the same geek also said it has 6 firewire ports too tho, so... :p
Yes, but he did say "You know what I mean", afterwards. Not sure what he meant by that.
 
I think you are right. Maybe what they showed is a high-spec model, but with lower-spec options available (for a total package < $2K) for you and me.

Please post what you end up doing as that might be my 2014 project :cool:

I think $2k ($1,999) will be the bottom end. Bare-bones GPUs, a 4-6 core CPU, 128GB flash and 4GB-8GB RAM. My question is not whether to buy, but when.

The Apple lineup suffers (IMO) due to a lack of decent three-monitor computing. Three monitors are optimal for creators like myself: one central and two symmetrically around it. The only way to do that today is with a Retina or a Mac Pro. I don't count the older 17" or 15" with two TB monitors - I've done that, and the disparity between the monitors and the built-in screen is too much.

With the Retina you can apparently close the lid and run three external TB monitors. That is a stupid solution: they are too big and expensive, and if you unplug the laptop for the battery's sake (as you should) then the whole thing goes to sleep.

I've finally decided that, in my opinion at least, using Apple laptops as desktops is stupid. Unfortunately the present Mini only supports two monitors. So we're stuck. A low-end Mac Pro would nicely allow people to have flexibility with external monitors in a desktop configuration, which is what the lineup really needs.
 
... .
I think the bottom end price is $2k (1-2 TFLOP), and for $3500 you can get a really high end machine (3-4 TFLOP), with $5k-$6k (6-7 TFLOP) putting you in what they showed at WWDC.

For your prediction to fully materialize at the top end, Apple had better be getting extraordinary deals on the (6-7 TFLOP) FirePro GPUs, because a pair of them now would cost you or me, at retail, from $3,000 to $6,000+. Frankly I think that your estimates, although they are the price points Apple should be targeting, are far removed from what Apple actually does. I'm not aware of any Apple (40%) Tax Reduction Act. I think that also jeopardizes your other predictions, but not as profoundly. However, if Apple were to use the Radeon line it would be more credible.
 
For your prediction to fully materialize at the top end, Apple had better be getting extraordinary deals on the (6-7 TFLOP) FirePro GPUs, because a pair of them now would cost you or me, at retail, from $3,000 to $6,000+.

My understanding, which makes perfect business sense, is that the FirePros are the same consumer gamer chips with ECC and better drivers. Exactly the strategy businesses use: "certify" the software and charge a fortune to the enterprise customers (I do this in my day job). Actually having different silicon would cost them billions (new fabs), which is not cost-effective.

Apple has AMD under their thumb, as they do all their suppliers. They are getting the chips at a specified low price. And the drivers are all certified - by Apple! It's Apple's software; they got the source and expertise from AMD and rolled their own. I think calling them FirePro is mostly marketing on Apple's part; they have the FirePro support (ECC) and whatnot, but it's the same old silicon at the core.

Let me ask you this: do you seriously think Apple is going to sell this at (pick the low end) $6,000 for GPUs plus, say, $2,000 for the rest of it? You think ANYBODY would buy an $8k trash can with no internal expandability? The customers who have that money want PCI slots, which Apple stripped out to make a consumer machine. It would be DOA, which is the last thing they want.

You guys have to get real about this. It amazes me that people can see a retail FirePro chip and think that Apple MUST therefore be charging FirePro retail on a NON-FIREPRO RETAIL card. Ain't gonna happen, because if it did they wouldn't sell ANY; basic marketing.

This machine is targeted at Prosumers. Apple wants to bring them into the line to make it more of a viable line, rather than just one for the graphics nerds.
 
My understanding, which makes perfect business sense, is that the FirePros are the same consumer gamer chips with ECC and better drivers. Exactly the strategy businesses use: "certify" the software and charge a fortune to the enterprise customers (I do this in my day job). Actually having different silicon would cost them billions (new fabs), which is not cost-effective.

Apple has AMD under their thumb, as they do all their suppliers. They are getting the chips at a specified low price. And the drivers are all certified - by Apple! It's Apple's software; they got the source and expertise from AMD and rolled their own. I think calling them FirePro is mostly marketing on Apple's part; they have the FirePro support (ECC) and whatnot, but it's the same old silicon at the core.

Let me ask you this: do you seriously think Apple is going to sell this at (pick the low end) $6,000 for GPUs plus, say, $2,000 for the rest of it? You think ANYBODY would buy an $8k trash can with no internal expandability? The customers who have that money want PCI slots, which Apple stripped out to make a consumer machine. It would be DOA, which is the last thing they want.

You guys have to get real about this. It amazes me that people can see a retail FirePro chip and think that Apple MUST therefore be charging FirePro retail on a NON-FIREPRO RETAIL card. Ain't gonna happen, because if it did they wouldn't sell ANY; basic marketing.

This machine is targeted at Prosumers. Apple wants to bring them into the line to make it more of a viable line, rather than just one for the graphics nerds.

Obviously, you have more faith than I do that Apple is not that greedy, because I think that $8K figure is too low for the top-of-the-line system. But Apple and time will tell.
 
Obviously, you have more faith than I do that Apple is not that greedy, because I think that $8K figure is too low for the top-of-the-line system. But Apple and time will tell.

They did scale back in one area, to just a single x86 processor. I assume they intend for the FirePros to be leveraged. There's a market beyond $8k, but I don't know whether Apple will cover it.
 
Yeah, IMO if the machine they showed at WWDC is $6k they will have priced themselves completely out of the industry and there won't be a 2015 model following it.

That system is worth about $3.5K max and they're gonna need to shoot even below that if they intend to gain market share. By the time the MP6,1 releases we will be able to build the identical system (minus 4GB per GPU) for right around the $2K mark.

The processor in the Mac Pro costs slightly over $2k with OEM pricing in trays of 1000. Retail pricing, when it eventually shows up in retail channels, will be higher. What they are currently showing is a minimum of $4-$5k in OEM-priced PARTS. You most certainly could not build anything anywhere near identical for $2k.

As for the price of the GPUs, the actual chip that makes up the W9000 in no way, shape, or form costs that much. The W9000 is expensive, but the chips that make it up are not. Driver development and validation are why pro graphics cards are expensive, not the actual silicon. The likely scenario is that Apple is helping to develop and validate the drivers for their custom version of the W9000.

And dual GPUs are standard; dual W9000s are not. They will have at least two different models, if not three, with the highest-end model being the one they have been showing.
 
They've got the Mini guys scared. There is a thread in their forum section about fear that the new Mac Pro is going to kill the mini. As in take its market.

This wasn't something they ever worried about before June 11th
 
They've got the Mini guys scared. There is a thread in their forum section about fear that the new Mac Pro is going to kill the mini. As in take its market.

This wasn't something they ever worried about before June 11th

Maybe they'll make a huge Mini with 6 PCIe 3.0 slots, 6 HDD/SSD bays, dual 12-core Xeons and 16 RAM slots. I could get into that.
 
They did scale back in one area, to just a single x86 processor. I assume they intend for the FirePros to be leveraged. There's a market beyond $8k, but I don't know whether Apple will cover it.

I think you have less understanding of marketing than I do. I'll spell it out: they can price these at $4k-$8k and sell none, or they can price them at $2k-$6k (probably for the high end, I think) and sell a lot. In which scenario do they make more money?

But I could be wrong, they could have other plans for this and maybe there is a good reason to price it in the stratosphere across the board. If there is I don't see it.
 
I think you have less understanding of marketing than I do. I'll spell it out: they can price these at $4k-$8k and sell none, or they can price them at $2k-$6k (probably for the high end, I think) and sell a lot. In which scenario do they make more money?

But I could be wrong, they could have other plans for this and maybe there is a good reason to price it in the stratosphere across the board. If there is I don't see it.

Thanks for the personal attack following a misinterpretation of my post:p. There are a lot of workstations below what Apple charges currently. There are some that exceed $10k. They have to determine a range. If it started at $4k, it likely wouldn't be sustainable. I never made a claim on where it would start. I just said that there are workstation configuration options that Apple has decided not to address at this point. Those include dual socketed versions. We'll see if they made the right choice there. Their current problem seems like an issue of poor alignment. The $2500 machine is a real stretch for that, and yet that is the one they would depend upon to maintain whatever level of volume is required for a viable line.
 
Thanks for the personal attack following a misinterpretation of my post:p. There are a lot of workstations below what Apple charges currently. There are some that exceed $10k. They have to determine a range. If it started at $4k, it likely wouldn't be sustainable. I never made a claim on where it would start. I just said that there are workstation configuration options that Apple has decided not to address at this point. Those include dual socketed versions. We'll see if they made the right choice there. Their current problem seems like an issue of poor alignment. The $2500 machine is a real stretch for that, and yet that is the one they would depend upon to maintain whatever level of volume is required for a viable line.

Apologies, I accidentally quoted the wrong post! Urg ...
 
Apple is known for putting out hardware that is ahead of the curve. Five years, I'd say. So we have two tidbits. One, the Geekbench score shows that a single proc is not much faster than last gen. Therefore Intel has hit a wall with CPUs until they move on to the next generation, like true 3D chips or graphene.

Two, because of this, high-end computing is moving to GPUs due to raw performance and FLOPS per watt. From supercomputers to more pedestrian workstations, more and more computing is going on GPUs.

The new Mac Pro is a clear statement that this is where Apple (correctly) sees future performance going. Phil even mentioned OpenCL in the keynote.

So yes, I think dropping multi CPUs in favor of GPUs was the correct decision.
 
Apple is known for putting out hardware that is ahead of the curve. Five years, I'd say. So we have two tidbits. One, the Geekbench score shows that a single proc is not much faster than last gen.

Probably not true. Consolidating more stuff into the CPU package is going to cut down on other latencies. For example, over the next couple of years the Xeon E5 will probably get much larger eDRAM L4 caches. That is going to make it easier to suck larger fragments of these benchmarks' data inside the package. That will increase speed.

Another factor is that the year-over-year increases aren't large anymore; another 2-9% isn't much to show for a generation.

Therefore Intel has hit a wall with CPUs until they move on to the next generation, like true 3D chips or graphene.

The notion of a CPU is going to change. It will be far closer to an SoC, even for the Xeon E5 class.

When Seymour Cray was doing supercomputer design, each major generation of box he developed got smaller (some grew a bit in overall container size, but the "CPU" core section of each one got smaller). Yes, now to bust onto the Top500 list you need half an acre of floor space, but no one really wants that. That is driven more by the limits of current technology than by what people would want to do if given any tech.


Graphene and the other alternatives still haven't proven they are economical at scale. After being conditioned by 30 years of Moore's Law, nobody is going to buy "it is faster, but 1.5x more expensive". GaAs CPUs went nowhere.


Two, because of this, high-end computing is moving to GPUs due to raw performance and FLOPS per watt.

That is partially because the cores are simpler and don't contain a lot of baggage. However, you can quite easily put a GPU inside the same package as the CPU. The major blocker to that has been increasing the memory bandwidth enough to keep up with that many simpler cores while still providing the more general-purpose ones enough bandwidth to keep them fed as well.

That is largely independent of moving to a radically different process technology. It is more of an architectural shift in where the transistor budget goes.
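For what it's worth, a roofline-style back-of-envelope calculation shows why the bandwidth, not the process node, is the limiter. The peak and bandwidth figures below are rough, assumed numbers (a ~2 TFLOP block of simple cores, GDDR5 on a discrete card versus quad-channel DDR3 shared inside a CPU package); they are illustrative, not specs.

```python
# Roofline-style sketch: attainable throughput is capped by
# min(peak compute, memory bandwidth * arithmetic intensity).
# All figures are rough assumptions for illustration only.

def attainable_gflops(peak_gflops, bandwidth_gb_s, flops_per_byte):
    return min(peak_gflops, bandwidth_gb_s * flops_per_byte)

peak = 2000.0        # ~2 TFLOP of simple GPU-style cores (assumed)
gddr5_bw = 260.0     # discrete card with its own GDDR5, GB/s (assumed)
ddr3_bw = 50.0       # quad-channel DDR3 shared with the CPU cores, GB/s (assumed)

for intensity in (1, 4, 16, 64):   # FLOPs performed per byte moved
    on_card = attainable_gflops(peak, gddr5_bw, intensity)
    in_pkg = attainable_gflops(peak, ddr3_bw, intensity)
    print(f"{intensity:3d} FLOP/byte: GDDR5 {on_card:6.0f} GFLOPS | "
          f"shared DDR3 {in_pkg:6.0f} GFLOPS")
```

Unless the kernels do a lot of math per byte, the in-package GPU block starves long before the discrete card does, which is exactly the bandwidth problem described above.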


The new Mac Pro is a clear statement that this is where Apple (correctly) sees future performance going. Phil even mentioned OpenCL in the keynote.

I think the Mac Pro is far more a long-term design statement that trying to maximize the separation between the CPU and GPU is the wrong direction going forward. They are probably correct. Short term they may take a lump or two, but by the time the "firmly dedicated to the legacy market" workstation vendors wake up, Apple's design will be on version 3 (or 4) of the new trend and those folks will be on version 1 of theirs.

So yes, I think dropping multi CPUs in favor of GPUs was the correct decision.

There is a difference between good decisions and good outcomes. If the software doesn't show up fast enough it won't be a good outcome.


Also, while that one is aligned with trends, Apple may have nuked the 10 SATA lanes in the 2013 Mac Pro prematurely. That's more Apple pushing things into a corner hoping that effective solutions appear. Supercomputer systems come with substantial storage. Sure, those storage devices aren't placed very close to the computational units, but lots of computation often means lots of data either in, out, or both. Not that it needs to tackle any amount of data (or even moderately large ones), but it shouldn't be on the relatively small side either.
 
A new Mac Pro should do between 7 and 8 TFLOPs with two W9000s. But so would a standard desktop PC with two HD 7970s.

True but......
[Image: mac-pro-new-vs-old.jpg]


You can probably fit 6 nMP in the space of a "standard" desktop PC.
(It is 1/8th the volume of the oMP but you have other considerations)
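For anyone who wants to sanity-check that ratio, here is a minimal Python sketch using the approximate published dimensions (new Mac Pro: roughly a 9.9 in tall, 6.6 in diameter cylinder; old Mac Pro tower: roughly 20.1 x 8.1 x 18.7 in). These are rough figures, not official volume numbers.

```python
import math

# Approximate published dimensions (inches); treat the nMP as a cylinder
# and the oMP tower as a rectangular box. Rough figures only.
nmp_height, nmp_diameter = 9.9, 6.6
omp_h, omp_w, omp_d = 20.1, 8.1, 18.7

nmp_volume = math.pi * (nmp_diameter / 2) ** 2 * nmp_height   # ~339 in^3
omp_volume = omp_h * omp_w * omp_d                            # ~3045 in^3

print(f"new Mac Pro volume: {nmp_volume:.0f} in^3")
print(f"old Mac Pro volume: {omp_volume:.0f} in^3")
print(f"ratio (old/new)   : {omp_volume / nmp_volume:.1f}x")  # ~9x
```

That works out to roughly a 9x difference, so "1/8th the volume" is in the right ballpark.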

Shame Apple got rid of xGrid.
Should have made it "transparent" with OpenCL IE no extra code to add machines to any application.
 
Apple is known for putting out hardware that is ahead of the curve. Five years, I'd say. So we have two tidbits. One, the Geekbench score shows that a single proc is not much faster than last gen. Therefore Intel has hit a wall with CPUs until they move on to the next generation, like true 3D chips or graphene.

Two, because of this, high-end computing is moving to GPUs due to raw performance and FLOPS per watt. From supercomputers to more pedestrian workstations, more and more computing is going on GPUs.

The new Mac Pro is a clear statement that this is where Apple (correctly) sees future performance going. Phil even mentioned OpenCL in the keynote.

So yes, I think dropping multi CPUs in favor of GPUs was the correct decision.

I agree with a lot of things here, e.g. the move to GPUs, and as you've said Apple is ahead of the curve.... but a LOT of software isn't with them yet! They will either adjust to align to Apple's strategy (which could be a HUGE and expensive re-think for a lot of vendors), or perhaps they could choose for themselves what processing technology works best for their purpose - which may mean sticking with CPU.

When you said a single proc is not much faster than last gen, are you talking in terms of GHz? I don't understand, because a single Intel E5 CPU overall performs better in Geekbench than a pair of last-gen CPUs put together! I am particularly frothing over putting a pair of 10-core 3.4GHz E5 CPUs into action.

Now if only the mac pro had come with those!

**Just noticed deconstruct60 had replied way faster than me haha
 
You can probably fit 6 nMP in the space of a "standard" desktop PC.
(It is 1/8th the volume of the oMP but you have other considerations)

And cost about 6 times as much.




Shame Apple got rid of xGrid.

xGrid was not competitive with the numerous other solutions out there. xGrid is largely proprietary. The other ones aren't. That's one of the major reasons it "lost" over the long term.

You can still create a grid of these Mac Pros. The only thing that is lacking is a Jony Ive-orchestrated GUI sitting on top of the system. That has exceedingly little impact on computational performance.

The main competitive advantages xGrid had at the end were that it was "free" (bundled, i.e., costs deferred onto the whole Mac user base), had a nice GUI, and hooked into OS X authentication relatively easily.
None of those have to do with performance.



Should have made it "transparent" with OpenCL IE no extra code to add machines to any application.

That is a holy grail that won't work. Where NUMA (non-uniform memory access) gets extreme, there is an impact on applications. That extreme non-uniformity will be anything but transparent. Mild stuff like OpenCL over PCIe 3.0, or the even milder two-package QPI Xeon setups, is in a whole different ballpark.
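A rough way to see why the non-uniformity can't just be hidden is to compare how long the same working set takes to move over the different links involved. The bandwidth figures in this sketch are rough, order-of-magnitude assumptions (on-card GDDR5, PCIe 3.0 x16, QPI, 10GbE), purely to illustrate the point.

```python
# Rough illustration of why "transparent" distribution breaks down:
# the same 1 GB working set costs wildly different amounts of time to
# move depending on where it has to come from. Bandwidths are assumed
# order-of-magnitude figures, not measurements.
working_set_gb = 1.0

links_gb_per_s = {
    "local GDDR5 on the GPU card":      250.0,   # on-card memory
    "PCIe 3.0 x16 to a GPU in the box":  16.0,   # host <-> device copy
    "QPI to the other CPU socket":       25.0,   # two-package Xeon
    "10GbE to another machine":           1.25,  # grid / cluster link
}

for link, bw in links_gb_per_s.items():
    ms = working_set_gb / bw * 1000.0
    print(f"{link:36s} ~{ms:8.1f} ms per GB moved")
```

Roughly three orders of magnitude separate the local case from the network case, which is why data placement can't simply be abstracted away from the application.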
 
You guys are nuts. A FirePro W9000 is a 365 mm^2 die. There would be 82 on a 200mm wafer. That would put the cost of each die (at 50% yield) at roughly $50. Using a 50% GPM, maybe $100 to Apple. Just because a high-end dual W9000 card is $3,000 doesn't mean the GPUs in the new Mac Pro will add $3,000 to the price.
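As a sanity check on that arithmetic, here is a minimal sketch of the same back-of-envelope calculation. The wafer cost is a hypothetical figure plugged in to make the ~$50-per-good-die number come out; actual foundry pricing isn't public.

```python
import math

# Back-of-envelope die cost estimate using the numbers from the post above.
# The wafer cost is a hypothetical assumption, not a known foundry price.
die_area_mm2 = 365.0          # FirePro W9000 (Tahiti-class) die
wafer_diam_mm = 200.0         # as stated in the post
yield_fraction = 0.50
wafer_cost_usd = 2000.0       # hypothetical, chosen to illustrate ~$50/die

wafer_area = math.pi * (wafer_diam_mm / 2) ** 2   # ~31,416 mm^2
gross_dies = int(wafer_area / die_area_mm2)       # ~86 before edge loss
good_dies = gross_dies * yield_fraction           # ~43 usable dies

cost_per_good_die = wafer_cost_usd / good_dies    # ~$45-50
price_to_buyer = cost_per_good_die / 0.5          # at a 50% gross margin

print(f"gross dies per wafer : {gross_dies}")
print(f"good dies (50% yield): {good_dies:.0f}")
print(f"cost per good die    : ${cost_per_good_die:.0f}")
print(f"price at 50% GPM     : ${price_to_buyer:.0f}")
```

Under those assumptions the silicon itself lands around $50 per good die and roughly $100 at a 50% gross margin, which is the point being made: the expensive part of a pro card is not the chip.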
 
. but a LOT of software isn't with them yet! They will either adjust to align to Apple's strategy (which could be a HUGE and expensive re-think for a lot of vendors)

It is not Apple's strategy. It is largely the industry's strategy. AMD is merging CPUs and GPUs. So is Intel. Nvidia has Maximus. If AMD's workstation graphics group weren't thoroughly slacking, they'd have a Maximus-like solution too.

A higher number of dual-GPU cards and a huge increase in OpenCL computational resources were coming to the next Mac Pro regardless of whether the embedded setting was "on" or "off" in the next Mac Pro's design. More than a few folks were going to fill any discrete-slot Mac Pro with multiple cards if more power was allocated to the PCIe thermal zone.

So "we're shocked that more folks have TFLOP of GPGPU power available" would be clueless software vendors.

There are a lot of vendors who either think they can't adapt to it or were punting to a later day (despite the fact that in 2013 millions of Macs would be sold with two OpenCL GPUs embedded inside them regardless of what happened with the Mac Pro).

Apple's strategic move is different in that they are making the deployment of duals more uniform, more quickly. That would only mean a faster return on investment for those who were correctly tracking trends, and a bigger disadvantage for some of those who were firmly tracking the rear-view mirror.

or perhaps they could choose for themselves what processing technology works best for their purpose - which may mean sticking with CPU.

Or best for their profit margin; just push the legacy code base forward.
Some can't. Some don't want to.
 
It is not Apple's strategy. It is largely the industry's strategy.

Agreed! More specifically, I could have said: align with Apple's move alongside the overall industry change, etc.

A higher number of dual-GPU cards and a huge increase in OpenCL computational resources were coming to the next Mac Pro regardless of whether the embedded setting was "on" or "off" in the next Mac Pro's design. More than a few folks were going to fill any discrete-slot Mac Pro with multiple cards if more power was allocated to the PCIe thermal zone.

So "we're shocked that more folks have TFLOP of GPGPU power available" would be clueless software vendors.

There are a lot of vendors who either think they can't adapt to it or were punting to a later day (despite the fact that in 2013 millions of Macs would be sold with two OpenCL GPUs embedded inside them regardless of what happened with the Mac Pro).

Apple's strategic move is different in that they are making the deployment of duals more uniform, more quickly. That would only mean a faster return on investment for those who were correctly tracking trends, and a bigger disadvantage for some of those who were firmly tracking the rear-view mirror.

Or best for their profit margin; just push the legacy code base forward.
Some can't. Some don't want to.

The thing that sucks is I have work to do TODAY which still requires CPU performance. I wish there could have been a transitional phase where 2x CPU machines were offered with multiple GPU BTO upgrades. I would have bought a machine with 2 x CPUs and QUAD GPUs! But obviously very few would fork out for such an astronomically costly machine.

You are right in that the direction we are heading is obvious, and it may be a disadvantage that some vendors will be left behind. However, a bigger disadvantage for me would be to stay locked into hardware that doesn't offer the maximum performance currently available for software I've spent 15 years learning. Especially software for which there are virtually no competitors.

It's not like switching to non-Apple hardware would mean trading GPU performance for CPU gains. As far as reasonable bang for the buck goes, it could be surpassed in both the CPU and GPU areas.

The new Mac Pro clearly has the capability of being a performance leader in the GPGPU area (with costly external expansion), but in video production and animation there are currently not that many ways to take advantage of it.

I do like the design of the new mac pro and think that it is quite innovative, but unfortunately for Apple I'm going to go where the power is until everyone catches up!
 
You guys are nuts. A FirePro W9000 is a 365 mm^2 die. There would be 82 on a 200mm wafer. That would put the cost of each die (at 50% yield) at roughly $50. Using a 50% GPM, maybe $100 to Apple. Just because a high-end dual W9000 card is $3,000 doesn't mean the GPUs in the new Mac Pro will add $3,000 to the price.

Are you implying that Apple will use an ultra-deep discount to deliver the new Mac Pro at an ultra-competitive price? I guess they could. But I could also see them using a nearly-at-cost discount to make the systems ultra-profitable and just as expensive as ever. I mean, it's Apple we're talking about.
 