Contractors don't work for free. If Apple doesn't cut them a check I doubt they are going to work now and maybe get paid later.
No kidding. Nor did I imply otherwise. But they are doing it at a very low cost as a means of attracting Apple's business (board manufacture), as well as offering lower costs on assembly (i.e. essentially just over the break-even point on R&D and assembly, making their income on PCB manufacturing, which is more lucrative).

The fact that HHP can rely on Intel for reference designs helps them keep their costs low as well (no need to reinvent the wheel; they primarily just need to customize the layout per Apple's ideas/specifications for reduced internal wiring and add an FW chip, assuming that will continue to be a part of Apple's lineup in the near future).

The same factor that reassigns Apple software folks from working on Mac Pro oriented software to other projects can just as easily redirect external contractors to other projects that Apple execs deem to have higher priority.
I haven't seen any indication they're hiring contractors for software development.

Makes more sense with the hardware, as there's not much of their own IP involved, if any.

The primary point is that they are not going to panic if sorting this out takes 4, 8, or even 12 quarters. Apple isn't worried about short term results. They are going to execute against long term (relative to the rest of the industry) plans.
They certainly have the capital to wait things out if the markets get really rough.

Where I have issue with this long term idea, however, is the erosion of the market base. That is, if they let this market stagnate, or worse, end it even temporarily, they're going to have an uphill battle trying to get those customers back when they do have the product buyers are after ready to go (current or already former users would have switched to another platform and system vendor to fill their needs, and would likely be very hesitant to get back in bed with Apple after finding themselves, from their POV, abandoned once already).

Apple doesn't have to buy a company just so there is a "Mac only" option [software].
They seem to be having a hard time convincing 3rd party developers that haven't already done so to commit to an OSX version, though. That leaves the OSX professional software market a bit thin vs. other platforms.

They need a "hook" to bring in more customers (generate growth) if they want to stay in this market, and right now, this would be the best way to go about it in general IMHO, as the entire workstation market has been shrinking for awhile now.

It would not be surprising if the Ivy Bridge E5 1600 line still had 4 and 6 core options, with perhaps a top end 8 core one. If so, one trend could be toward SoC-like consolidation even on this subset of Xeons.
With increasing core counts, I do expect the workstation market to center on SP based boards in the not too distant future. I've even mentioned this in at least a couple of threads previously.

Definitely cost effective.
 
No kidding. Nor did I imply otherwise. But they are doing it at a very low cost as a means of attracting Apple's business (board manufacture),

I don't think low cost is really a major factor. It is convenient to boost the margins a bit but not really necessary. The primary reason to hire them as contractors is that they don't really want the talent in-house.



I haven't seen any indication they're hiring contractors for software development.

You are getting caught up on which company's name is on the badge. That's relatively immaterial. If Apple management retasks the associated software folks (who happen to be Apple employees) to other projects, why would they fund the hardware half of the system? They either engage the whole system or they don't. There is a mix of resources that can be (and probably were) deallocated: personnel, material, and money. In the contractor case, the "money" is what nukes them.


Where I have issue with this long term idea, however, is the erosion of the market base.

With the Mac Pro they have a slow moving market base. Most users only update every 4-6 years. For the folks 1-4 years into their Mac Pro they probably aren't moving anyway.

I think you're also losing the context of where the discussion was. The "long term" shift up through Haswell came down to three alternatives:

Early 2013 Sandy Bridge, skip Ivy, and catch Haswell (presumably in 2014). [I think there is an assumption that Haswell Xeon E5 is coming ~Q3 2014. That is probably quite aggressive.]

Early Q3 2013 Ivy Bridge and then catch Haswell.

Early 2013 SB, early 2014 (or very late 2013) Ivy, early 2015 Haswell.

"long term" gap in these contexts is less than 1-2 quarters (not even a year) behind the vendors who do "releases" along Intel's schedule.

The gap between the 2010 Mac Pro and the 2013 Mac Pro was a mistake. If they repeat it then that would be a problem.

Since the 2008's (and earlier) are coming up on 5 years old, it would be better to go with Sandy Bridge early 2013 and slide Ivy back. If suitable Ivy E5 solutions aren't pragmatically going to arrive until late Q3, they will be digging themselves a huge hole if they use that as a starting point. The recovery will be significantly longer and subject to much larger risks.

That is, if they let this market stagnate, or worse, end it even temporarily, they're going to have an uphill battle trying to get those customers back when they do have the product buyers are after ready to go (current or already former users would have switched to another platform and system vendor to fill their needs, and would likely be very hesitant to get back in bed with Apple after finding themselves, from their POV, abandoned once already).

It is an uphill battle, but it isn't impossible. The "hackintosh" folks are relatively easy to get back on their next upgrade cycle. Similarly, there are a number of folks in this forum pointing people back to refurbs for solutions. Those are even easier to recover.

They seem to be having a hard time convincing 3rd party developers to commit to an OSX version though that haven't already done so. Thus leaving the OSX professional software market a bit thin vs. other platforms.

Eh? AutoCAD title ports to OS X are up over the last 6 years. There is movement to OS X; it is being driven by OS X's increasing share of the PC market. Most of the vendors whose software costs as much or more than an entry Mac Pro (and/or has GPU requirements that only a bleeding edge Mac Pro can meet) probably aren't coming over the near term. That is something Apple doesn't really control. Frankly, creating an environment for smaller, more nimble alternative vendors to rise up and wipe those slow moving software vendors out is a better move for that class of software.


They need a "hook" to bring in more customers (generate growth) if they want to stay in this market, and right now, this would be the best way to go about it in general IMHO, as the entire workstation market has been shrinking for awhile now.

some of the "enterprise" shrink is being driven by global economic drama.

1. Getting back to a regular release cycle would be easy to do. IMHO, if decoupling from Intel's schedule would increase regularity, then that would be a better move. Predictable targets for ports and predictable devices to buy are a "hook".

2. the "hook" being just one ( or two ) Apple software titles is a huge mistake. One of the sources of workstation decline is myopic focus on a small subset applications. Unless that subgroup is growing a hyper rates ... it just isn't going to lead to long term growth.

With increasing core counts, I do expect the workstation market to center on SP based boards in the not too distant future. I've even mentioned this in at least a couple of threads previously.

Depends on software. Software that scales linearly without some nonlinear increase in costs will get better bang for the buck out of 20 cores than 10.
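As a rough worked example (my numbers, purely illustrative): Amdahl's law gives speedup = 1 / ((1 - p) + p/n) for a parallel fraction p on n cores. With p = 0.95, 10 cores gets you about 1 / (0.05 + 0.095) ≈ 6.9x, and 20 cores about 1 / (0.05 + 0.0475) ≈ 10.3x - so doubling the cores buys roughly 1.5x, and only code much closer to p = 1 sees the near-linear payoff.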

This "core count" war is going to stop or at least slow down either after Haswell or Broadwell.

As long as workstations also need higher I/O bandwidth, the dual package will also be in the game. It isn't just about cores. Memory and I/O scale with packages. A single package doesn't just mean capping cores, but also I/O and memory. The non-core scaling limitations only increase as more functionality is added to the "CPU" package.

The shift to single package would be much more indicative that hardware had greatly outstripped the increase in user workloads. That isn't really going to "save" workstations, because many of those users are going to slide all the way down to mainstream CPU+GPU oriented solutions. There are even more cores in a GPU if you primarily just have embarrassingly parallel floating point problems.
 
I don't think low cost is really a major factor. It is convenient to boost the margins a bit but not really necessary. The primary reason to hire them as contractors is that they don't really want the talent in-house.
From my perspective, the primary reason for shifting talent from in-house to contractors is cost (getting permanent employees off of the payroll, as their tasks have been reduced to less than is required for a full time employee <cost savings is more than just wages>).

There isn't that much time needed for the MP, but cycling people to and from the various devices to service the MP line hurts their more profitable device market.

So hiring on full time employees that may only be needed for 6 months or less isn't attractive vs. using contractors that do this sort of design work regularly.

You are getting caught up on which company's name is on the badge. That's relatively immaterial.
Actually, I don't care about the name on the badge. Rebranding happens all of the time generally speaking.

If Apple management retasks the associated software folks (who happen to be Apple employees) to other projects, why would they fund the hardware half of the system? They either engage the whole system or they don't. There is a mix of resources that can be (and probably were) deallocated: personnel, material, and money. In the contractor case, the "money" is what nukes them.
I think there's a misunderstanding here as to what I was referring to.

My comment was aimed at the effect of pulling in software and key developers (from a software company buy-out) for a product that's ready to go, or nearly so (add a few UI tweaks), as a means of offering additional value to the platform. By doing so, they can attract new users in the professional market as they've done before.

The other key point to this is that it also allows them to have complete control over the development, rather than running the risk that partners or external contractors could make a mess of it, or leak/steal their IP.

With the Mac Pro they have a slow moving market base. Most users only update every 4-6 years. For the folks 1-4 years into their Mac Pro they probably aren't moving anyway.
Which is part of the problem in regard to the poor growth figures, as they've lost the former enthusiasts in this segment, who were priced out of the market.

To keep the product around, they need growth. One way to do this is via pricing, but I don't expect this to be the case without a reduction in the level of CPU (a shift from the enthusiast socket to something from the mainstream line). The second is to add value through features that give a better overall user experience and TCO, which, from what I've seen of Apple's history, is more likely to be how they'd approach it, assuming they genuinely plan to stay in this particular market long term.

Early 2013 Sandy Bridge, skip Ivy, and catch Haswell (presumably in 2014). [I think there is an assumption that Haswell Xeon E5 is coming ~Q3 2014. That is probably quite aggressive.]
Given the delays in releasing SBE5 and the lack of competition from AMD in this segment, I don't expect Haswell to be out that soon either (expect more like H2 2015).

"long term" gap in these contexts is less than 1-2 quarters (not even a year) behind the vendors who do "releases" along Intel's schedule.
Keep in mind, however, that there will be a board change during this process (SB comes on late, and that board is to last through IB; then a new board is needed for Haswell).

If Apple continues to try and sell IB systems when the PC counterparts are selling the newer Haswell based systems, then they will either lose sales, or those buyers will wait for the newer hardware, causing a negative effect for that particular quarter or two. If the previous quarters are good enough to make up for this, then great.

I'm just not convinced that this particular market segment is large enough for Apple that such an instance would be possible however. Thus my belief that Apple would need to add value to this particular market to maintain a sufficient sales volume (retention at a minimum, but better yet, if it's good enough, it should attract new customers as well).

The gap between the 2010 Mac Pro and the 2013 Mac Pro was a mistake. If they repeat it then that would be a problem.
Definitely.

Unfortunately, however, I'm not convinced that they see it that way (intentional, as they've already decided on a fundamental shift in what they're going to offer in this segment).

I genuinely expect Apple will take a crack at SP-only variants before other vendors. And if their goal is to pick up additional users, this could realistically be based on a mainstream part, thus starving professional users of the PCIe lanes they tend to require for overall system performance.

Eh? AutoCAD title ports to OS X are up over last 6 years. There is movement to OS X this is being driven by OS X's increase share of the PC market. Most of the vendors whose software costs as much or more than a entry Mac Pro (and/or have GPU requirements that only a bleeding edge Mac Pro can meet ) probably aren't coming over the near term. That's is something Apple doesn't really control. Frankly, creating an environment for smaller, more nimble alternative vendors to rise up and wipe those slow moving software vendors out is a better move for that class of software.
I'm talking about true professional software that would require a MP to run it properly, and specifically, expanding into areas they have little to no presence in right now.

Take Electronic Design Automation (EDA). LabVIEW is the only thing that comes to mind that runs natively under OSX (industry standard) in this area. Even National Instruments' own MultiSIM package doesn't offer an OSX variant at all.

Not impossible to run on a MP, but it won't be directly under OSX (either natively under Windows or VM'ed).

And you can forget Synopsys, Altium, or any of the other suites most would be using.

some of the "enterprise" shrink is being driven by global economic drama.

1. Getting back to a regular release cycle would be easy to do. IMHO, if decoupling from Intel's schedule would increase regularity, then that would be a better move. Predictable targets for ports and predictable devices to buy are a "hook".

2. the "hook" being just one ( or two ) Apple software titles is a huge mistake. One of the sources of workstation decline is myopic focus on a small subset applications. Unless that subgroup is growing a hyper rates ... it just isn't going to lead to long term growth.
It's in Intel's best interest to get back on a stable cycle IMHO, and hopefully, this is just a hiccup on their end. Unfortunately, I'm not sure this is the case due to the increasing issues that occur with shrinking dies, and secondly, due to AMD's lack of competition (less motivation).

Now I get your point on Apple not sticking with Intel's cycle at all, but to me, it goes back to what I mentioned above.

As for software, I'm talking about creating a presence either directly (acquisition), or by enticing existing software developers in areas that aren't currently available to Mac users. The latter would require Apple to be more willing to work with them than their history demonstrates (a better development environment, and more importantly IMHO, a much better attitude <rather than "take it or leave it">).

Depends on software. Software that scales linearly without some nonlinear increase in costs will get better bang for the buck out of 20 cores than 10.
My SP comment was aimed at Apple, not every vendor. DP systems have, and will continue to have a place for the foreseeable future.

Where I do expect the SP versions to take over, however, is for both single-user workstations and GPGPU processing, so long as it's more cost effective than doing this with DP based systems (let's assume that lane counts will continue to scale linearly with CPU count; so if 40 lanes on an SP board, 80 lanes on a DP board).

This "core count" war is going to stop or at least slow down either after Haswell or Broadwell.
I actually agree with this, particularly given the general state of software (very little can leverage all the available cores).
 
After today's changes, I'd say no. The Mac division just took over the iOS division, and Jony Ive, who's been a quiet critic of the direction of OS X's interface, is now in charge of interface.

Writing seems to be on the wall that iOS's power in Apple is being re-balanced.
 
Low R and D

the parts are already made... How hard is it to make a case that holds the stuff?? It can't be as hard as making an uber compact iMac, MBP, iPad or iPhone - there is actually a little breathing room in a desktop computer. I just don't get it!
 
After today's changes, I'd say no. The Mac division just took over the iOS division, and Jony Ive, who's been a quiet critic of the direction of OS X's interface, is now in charge of interface.

There is very little to back this up. It isn't a Mac thing.

1. The retail thing was sketchy from the get go. Go back and look at comments here on MacRumors when the guy was hired, from folks in the UK who had experience at the stores run by this guy. Then came some screw ups (cut hours... oops, didn't really mean that) and he is gone.

2. Forstall, I think, was drinking too much Kool-Aid about being the next CEO-in-waiting. Again, he let two of his major projects, Maps and Siri, flounder (and according to the recent updates... worse still, stuck his nose into other folks' stuff). From the outside this looks like he was playing "how would Steve act" along with "what would Steve do".

Decide to nuke those two and most of the rest falls into place.

Maps and Siri really are, at their heart, internet services, not OS services. They really didn't belong in the OS team in the first place. They belong on a team that is focused on web hosting and internet services, just like the core groups that Google, Yahoo, and Microsoft have.

If Apple is going to compete head to head with other groups that have a dedicated SoC team... Apple should have one too. Merging other IP associated with Flash controllers and wireless into a substantially more 'crowded' SoC is going to be challenging.

For most of the history of OS X and iOS the two variants were run under one team. [ Only after the iPod guy got bumped out did iOS fork. ]

Ive hasn't been a critic of the iOS UI. It is certain iOS apps that are a problem. This one is a bit of a leap, but in terms of giving Ive a spin at managing something outside of industrial design, it isn't bad.

Writing seems to be on the wall that iOS's power in Apple is being re-balanced.

No. As I said in another response, the iOS and OS X relationship is like that between Mac OS and Windows over a decade ago. However, when Jobs said
" We have to let go of this notion that for Apple to win, Microsoft has to lose "

that is essentially also now

" We have to let go of this notion that for OS X to win, iOS has to lose. And vice versa"

This notion that OS X is 'winning' here is flawed. However, Apple doesn't have to starve OS X of resources to grow iOS. iOS products generate enough support that they can fund themselves. Likewise, Mac products are strong and deserve funding along their own path. [That is, the Mac line up; not every single product within the line up.]

OS X is still going to track iOS just as much, if not more, than it has tracked Windows. It is going to borrow what seem to be good ideas that users like a lot and work hard at integration.
 
I think there's a misunderstanding here as to what I was referring to.

My comment was aimed at the effect of pulling in software and key developers (from a software company buy-out) for a product that's ready to go, or nearly so (add a few UI tweaks), as a means of offering additional value to the platform. By doing so, they can attract new users in the professional market as they've done before.

Again, that might have worked in the early days of the new Jobs era, but with the economics now it won't. The total number of Macs sold is creeping up on 10 times as many per year. Back then, the percentage increase from adding a relatively small niche had the appearance of looking significant. Now it would likely get lost in the noise.

The only reason it would look significant would be because Apple has shrunk the number of Mac Pro workstations sold to a smaller point. Look, at this point just getting the Mac Pro to competitive footing should demonstrate a huge spike in growth (even without any acquisition gimmicks).



The only acquisition boost that will have any significant Mac market impact is one that affects the whole Mac market. Cherry picking a single model isn't going to move the needle much at all.


Which is part of the problem in regard to the poor growth figures, as they've lost the former enthusiasts in this segment, who were priced out of the market.

If "enthusiasts" was the major lynchpin element of the Mac Pro market then it is gone. It is a machine that has value because it generates value; not because it makes folks feel value. If Apple is going to stick to this relative price point ( pretty likely since doing it for around decade now across their product line) it isn't going to track enthusiasts. The enthusiasts are increasing going to break out their trusty screwdriver and order commodity parts from places like newegg. Largely because they can buy cheaper thrills.

To keep the product around, they need growth.

Yes. Profits and revenues in-and-of themselves are not the point. Those are just necessary component elements of growth.


Given the delays in releasing SBE5 and the lack of competition from AMD in this segment, I don't expect Haswell to be out that soon either (expect more like H2 2015).

It will be before that unless something goes haywire. I don't think there is going to be rapid industry displacement of Ivy Bridge, though. The rumors so far are that Haswell E5 will be a transition to DDR4 along with the transactional memory stuff. You are likely right that it will be around H2 2015 when folks stop being skittish about the new memory semantics and the cost of the next generation memory technology. So you'll see vendors selling Ivy right alongside Haswell models for a longer than usual time.

If there is a slow overlapping transition, then that is one of the contexts where Apple can catch up fast. If there is an edict from the execs that "this is what we are doing" then it is done. Customers either concur or look for someone else as a vendor. Apple isn't going to try to be all things to all people. It is one of the advantages of having a non-bloated product line up.

If Apple continues to try and sell IB systems when the PC counterparts are selling the newer Haswell based systems, then they will either lose sales,

Every major vendor is still selling Westmere right alongside Sandy Bridge now. The previous gen doesn't disappear at these price points as fast as you suggest.

There is a "lunatic fringe" group Apple might loose but IHMO those are more more heavily populated with the "trusty screwdriver" crowd anyway.

The other major assumption here is that the vast bulk of the buying for a new arch comes in the first quarter or so that the new arch is on sale. I don't buy that either. Large acquisitions tend to cycle on when the buyer has money ready rather than being impulse buys from cash on hand. It is quite likely that Intel will keep moving the E5 launch date around in the calendar year (not purely a 12-13 month cycle). As it drifts from quarter to quarter, even if there are spikes, you spread out the "bubble" over time.



I'm just not convinced that this particular market segment is large enough for Apple that such an instance would be possible however. Thus my belief that Apple would need to add value to this particular market to maintain a sufficient sales volume (retention at a minimum, but better yet, if it's good enough, it should attract new customers as well).

New customers are essential. The rest of the Mac market is drawing new customers. If the Mac Pro won't, they'll release another Mac that will with the resources formerly allocated to the Mac Pro. The Mac Pro has a "get out of jail free" card for the moment, but that isn't permanent.





Unfortunately, however, I'm not convinced that they see it that way (intentional, as they've already decided on a fundamental shift in what they're going to offer in this segment).

Perhaps overly optimistic, but my view is that someone decided one direction (for the most part, that it was an obsolete segment) and then someone else made an argument that they just don't know what the market potential is here because the product is so far out of alignment with the market (off cycle).

They'll give it a good try to see if this segment has potential, but if it doesn't they will exit.



I'm talking about true professional software that would require a MP to run it properly, and specifically, expanding into areas they have little to no presence in right now.

Apple isn't going there. A healthy Mac Pro segment might be 1-2% of the Mac market. The Mac market is about 7%. 1% of 7% is too small. That will primarily draw vendors with megabuck price tags.

Vendors who can't work at all reasonably on other Macs will either price the software so that just a small number of sales lets them recoup the whole investment, or they just won't bother. There is nothing that is going to guarantee the Mac Pro will stick around long term. Porting to a single machine is loopy. Vendors will port to a multi-machine marketplace.

That's why multiple system vendor Windows or multiple system vendor Linux are viable options.

If there is some associated PCI-e card that has to be present to enable their solution then either they get Thunderbolt religion or avoid the Mac market.


It's in Intel's best interest to get back on a stable cycle IMHO, and hopefully, this is just a hiccup on their end. Unfortunately, I'm not sure this is the case due to the increasing issues that occur with shrinking dies, and secondly, due to AMD's lack of competition (less motivation).

1. ARM implementers are more competitive than AMD is. So Intel's major focus is down market from the Xeon E5 and up class. Apple's product mix is skewed due to the delayed desktop refresh, but it's roughly an 80:20 mix of laptops to desktops. Intel is focused on threats to the 80 far more so than imagined threats to the much less than 20.

The better clock management and lower power help the higher end over time.


2. There is a catch-22 to not moving to new tech like PCI-e v3.0 and DDR4. Throwing yet another layer of copy-and-paste x86 cores onto the layer cake won't work unless you rebalance the I/O. Whether you have a 4, 6, 8, or 10 core E5 product now is really just a matter of adding another 2 core + ring bus segment + cache element to the layer cake (only horizontal on the die):

[ ...   I/O elements, non-core          ... ]
[ core   ring   cache   core ]
[ core   ring   cache   core ]
  ...    ...
[ ... I/O and/or GPU elements, non-core ... ]

The "non-core" updates are larger and more complicated in order to stay at the substantially better level.

I think Intel needs some separation between the mainstream and these higher end I/O non-core adjustments so that the two tracks can cleanly feed back their respective improvements to each other (while maintaining time to clean up glitches that may occur).


I actually agree with this, particularly given the general state of software (very little can leverage all the available cores).

At this point I think most scale to small numbers; 2-4. There are now quad core tablets. Claiming that it is some exotic number that nobody has is laughable at this point. Even the laziest of responsible software vendors can't claim the "nobody has these kinds of systems" excuse. Even many basic OS frameworks are now threaded.
 
the parts are already made...

Actually not. Apple's motherboard is relatively unique.
Likewise, if on-board mSATA and Thunderbolt are designated elements to make the motherboard even more unique, then all the more so.

There is no Intel or Supermicro or other workstation class motherboard that has all those elements.


How hard is it to make a case that holds the stuff??

If Apple were just wrapping a case around a stock Intel board, that might be the case, but that would just be in the "screwdriver" assembly business. Apple doesn't compete there.



It can't be as hard as making an uber compact iMac, MBP, iPad or iPhone - there is actually a little breathing room in a desktop computer. I just don't get it!

It isn't a question of hard, but of focus.
 
This "core count" war is going to stop or at least slow down either after Haswell or Broadwell.
As I speculated in another thread, the "core count war" may be just about to begin! Clock frequencies have reached a soft cap with diminishing returns, so multi-core is the way to go. Only for a workstation of the class of a Mac Pro would you need to go massively multi-core...
 
Actually not. Apple's motherboard is relatively unique.
Likewise, if on-board mSATA and Thunderbolt are designated elements to make the motherboard even more unique, then all the more so.

There is no Intel or Supermicro or other workstation class motherboard that has all those elements.




If Apple were just wrapping a case around a stock Intel board, that might be the case, but that would just be in the "screwdriver" assembly business. Apple doesn't compete there.





It isn't a question of hard, but of focus.

Thunderbolt X79 motherboard
http://www.eteknix.com/news/asus-rog-zeus-x79-motherboard-features-crossfirex-integrated-graphics/
 
As I speculated in another thread, the "core count war" may be just about to begin! Clock frequencies have reached a soft cap with diminishing returns, so multi-core is the way to go. Only for a workstation of the class of a Mac Pro would you need to go massively multi-core...

I see other ways of improvement. Cache and pipe improvements. Material use. After 20 cores you can't effectively use, who cares?
 
As I speculated in another thread, the "core count war" may be just about to begin! Clock frequencies have reached a soft cap with diminishing returns, so multi-core is the way to go. Only for a workstation of the class of a Mac Pro would you need to go massively multi-core...

ARM. LOL No. Makes for great fodder on these forums but greatly underestimates Intel's roadmap and abilities.



In terms of Xeon E5, "cores" implicitly means x86. Yes, it will likely slow. If "any kind of core" will do, and pure numbers are all you're mostly concerned with (and you're willing to compile to optimize)? That already started with GPGPUs. If you want 50 x86 cores, there is already the Xeon Phi.

http://www.intel.com/content/www/us...h-performance-xeon-phi-coprocessor-brief.html

But that is a coprocessor and a variant of the same track the GPGPUs are on.

I have some doubts Apple will go this route, which revolve around Apple (or Intel) having to do some low level integration work to make this work.
http://www.anandtech.com/show/6017/intel-announces-xeon-phi-family-of-coprocessors-mic-goes-retail

and it won't be around until next year.

"... Stampede will go live on January the 7th, 2013. ..."
http://www.anandtech.com/show/6265/intels-xeon-phi-in-10-petaflops-supercomputer


But that only further drives home the issue of how "out of touch" some aspects of the current Mac Pro are. The amount of power/heat thrown off by these PCI-e cards is going to be at least as large as that of the GPUs. The current Mac Pro design is oriented to much less power hungry cards.

If Intel is selling Many Integrated Core (MIC) devices along a separate Xeon Phi line, there is little reason at all to duplicate that in the Xeon E5 line up.

Moving SAS/SATA, high speed Ethernet (10GbE), and more direct connect DDR4 would be far better at improving throughput on code that has significant scalar sections. Highly parallel jobs can be punted to the co-processor(s) if appropriate. If you look at top tier supercomputers these days, there is a mix of processors that are brought to bear on jobs. That same mix is likely going to trickle down to the workstation level.

----------

Thunderbolt X79 motherboard

From the linked article ".... However, it is ”highly unlikely” that the ZEUS will ever make it to the market. ... "

I said all of the elements. When you find one with a CPU/RAM removable daughter card, you come back and post.

It is indicative, though, that the folks insisting Thunderbolt GPU PCI-e cards were coming were on the wrong track. Thunderbolt and embedded GPUs naturally go together.
 
This notion that OS X is 'winning' here is flawed. However, Apple doesn't have to starve OS X of resources to grow iOS. iOS products generate enough support that they can fund themselves. Likewise, Mac products are strong and deserve funding along their own path. [That is, the Mac line up; not every single product within the line up.]

But Mac and iOS being on different teams was the reason OS X was being starved of resources. Forstall was argumentative and didn't really care about the Mac division. He couldn't get along with existing designers and tried to influence the Mac even though it was out of his domain. He continually worked to try to drain OS X resources and redirect them to iOS.

Putting iOS under someone who also has the Mac's interests in mind is a gigantic step up for the Mac. It wouldn't surprise me if Forstall trying to shift resources to iOS is one reason the Mac Pro has been having issues.
 
After 20 cores you can't effectively use, who cares?
...unless the OS is autonomously taking care of that (like e.g. Grand Central Dispatch is designed to do).

ARM. LOL No. Makes for great fodder on these forums but greatly underestimates Intel's roadmap and abilities.
It does not need to be solely focused on the raw technical ability. There would be more to a move away from Intel. And users would not necessarily have to sacrifice Windows compatibility, as Win8 also runs on ARM processors. Intel's current dominance will shrink!

If you want 50 x86 cores, there is already the Xeon Phi.
Except for the fact that that is merely a coprocessor as you said, it is still "just another Intel chip" and thus a Mac with that would still be "just another Intel box".

I do believe that Apple sooner or later will try to get rid of that notion to differentiate themselves better on the market.

Plus - any Xeon is produced in low numbers and without much competition on the market, therefore prices are comparatively high. An (already cheap) ARM CPU that is produced in the millions and millions for iOS devices would be dead cheap in comparison. The criticism of Apple's ever-increasing system prices could be countered by moving to a significantly cheaper hardware basis.

Moving SAS/SATA, high speed Ethernet (10GbE), and more direct connect DDR4 would be far better at improving throughput on code that has significant scalar sections.
While those are nice incremental updates, I'm not sure it would qualify as "something really great", as Cook has put it.

Highly parallel jobs can be punted to the co-processor(s) if appropriate. If you look at top tier supercomputers these days, there is a mix of processors that are brought to bear on jobs. That same mix is likely going to trickle down to the workstation level.
I'm not sure if Apple would go this comparatively complex route.

Of course they might use a Sandy/Ivy Xeon together with GPGPU or some ARM processors as support, but I think it'd be more Apple to leave out that "main" (Xeon) processor completely and "only" have a multitude of ARM cores clustered - which, by the way, would probably be way cheaper than adding more hardware to an already expensive set. Just imagine a new Mac Pro machine with the power of the top tier competitor machine for 2/3 of the price (just as the MP 1,1 was considered to have a very good price/performance ratio in its time). People would be all over it!

Also - all moves Apple has made in recent years point towards Apple taking a different approach to what is considered "Pro" use and leaving the "classical" server/workstation field to other companies like HP or Dell. Might well change again after the recent personnel changes with Forstall gone, but that is really only speculation.
 
The Mac Pros will be the first Apple machines capable of running the large Retina monitor that is in Apple's future. The iMac won't be able to go Retina for years at 27 inches using a laptop GPU. Retina will save the Mac Pro for a little longer.

We thought the same about a 2560 x 1600 display powered by Intel integrated graphics.... they still went there ;)
 
...unless the OS is autonomously taking care of that (like e.g. Grand Central Dispatch is designed to do).

GCD doesn't autonomously take care of anything. You still have to write GCD code.

I don't disagree that GCD could take care of 20 cores. But GCD is not at all designed to interact with co-processors like Phi (OpenCL is what you'd use), and it still means people have to write GCD code.
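To make that concrete, here's a minimal libdispatch sketch in C (my own illustration; the squaring loop is a made-up stand-in for real work, but dispatch_apply() is the actual public API):

Code:
#include <dispatch/dispatch.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = 1000;
    double *results = malloc(n * sizeof *results);
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    // GCD load-balances these iterations across however many cores exist,
    // but only because the programmer explicitly expressed the parallelism.
    dispatch_apply(n, q, ^(size_t i) {
        results[i] = (double)i * i;  // stand-in for real per-item work
    });

    printf("results[%zu] = %f\n", n - 1, results[n - 1]);
    free(results);
    return 0;
}

The point being: nothing here happens "autonomously". If the loop isn't written as a dispatch_apply() (or similar), GCD never sees it.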
 
My bet is March 2013 with new Mac Pros with Ivy Bridge processors, USB 3, Thunderbolt, and a new 27" 5mm thin Cinema display as seen in the iMac 27".

The optical drive will also be thrown out the door and the tower will be redesigned.

That would suck! Sure hope not! They already make an iMac; why would they need to turn the Mac Pro into one? They would more likely drop the line.
 
GCD doesn't autonomously take care of anything. You still have to write GCD code.
Thanks for the clarification. So it would be part of the development to bring GCD up to speed and have it working more autonomously. But I do believe that could be done...
 
Thanks for the clarification. So it would be part of the development to bring GCD up to speed and have it working more autonomously. But I do believe that could be done...

It's not really possible.

Plus, GCD isn't at all designed to work with cards. OpenCL is. GCD by design can only work with onboard RAM. In order to use Phi, you have to use the RAM on the Phi card, which only OpenCL knows how to do.
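For what it's worth, a rough OpenCL fragment in C showing that explicit device-memory step (hypothetical function and sizes, error handling omitted; the API calls themselves are standard OpenCL):

Code:
#include <OpenCL/opencl.h>   // <CL/cl.h> on non-Apple platforms

// Data has to be explicitly staged into the accelerator's own memory
// before a kernel can touch it. GCD has no equivalent of this step.
void offload(cl_context ctx, cl_command_queue queue, cl_kernel kernel,
             const float *host_in, float *host_out, size_t n) {
    // Buffer that lives in the device's memory (e.g. on the card).
    cl_mem dev_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                    n * sizeof(float), NULL, NULL);

    // Push host data across PCI-e to the device...
    clEnqueueWriteBuffer(queue, dev_buf, CL_TRUE, 0,
                         n * sizeof(float), host_in, 0, NULL, NULL);

    // ...run the kernel against the device-resident copy...
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &dev_buf);
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &n, NULL, 0, NULL, NULL);

    // ...and pull the results back when done.
    clEnqueueReadBuffer(queue, dev_buf, CL_TRUE, 0,
                        n * sizeof(float), host_out, 0, NULL, NULL);
    clReleaseMemObject(dev_buf);
}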
 
When is the (Mac Pro suitable Xeon) Ivy Bridge release date supposed to be as of this writing? This is all I could find lately, from September:


"SAN FRANCISCO: CHIPMAKER Intel has announced that it is sampling 22nm Ivy Bridge Xeon E5 and Xeon E7 chips and will brand its Centeron Atom server processor as the Atom S.

Intel launched its Sandy Bridge Xeon processors back in March, months before the firm released its 22nm Ivy Bridge consumer chips. Now Diane Bryant, corporate VP and GM of Intel's Datacenter and Connected Systems Group said the firm is already sampling its mid-range and high-end Ivy Bridge Xeon E5 and Xeon E7 processors.

Bryant said that although the firm is sampling Ivy Bridge Xeons, products won't be available until 2013, and didn't give a more specific timeframe. She also said that Haswell Xeon E3 chips will be available in 2013, but once again didn't reveal when next year they will surface.

"
 
It's not really possible.

Plus, GCD isn't at all designed to work with cards. OpenCL is. GCD by design can only work with onboard RAM. In order to use Phi, you have to use the RAM on the Phi card, which only OpenCL knows how to do.
I wasn't thinking of Intel, Xeon or Xeon Phi, but instead of a radically new approach like e.g. going for ARM clusters on one PCB...
 
I wasn't thinking of Intel, Xeon or Xeon Phi, but instead of a radically new approach like e.g. going for ARM clusters on one PCB...

Same issue. Anything that's not the main CPU needs to have its own memory, which means you need OpenCL to talk to it.

I'm also not sure what using ARM clusters would bring to the table when OpenCL on a GPU would be a better option, and is the option we already have.
 
That would suck! Sure hope not! They already make an iMac; why would they need to turn the Mac Pro into one? They would more likely drop the line.

I don't think they would compromise on expandability, but I'm almost certain they would get rid of the optical drive.

Hopefully like you said, I am wrong. I heard or read somewhere that Intel won't have their Ivy Bridge Xeon processors ready until late next year though. So if that's true then we won't see a Mac Pro update till then. :(
 
It does not need to be solely focused on the raw technical ability. There would be more to a move away from Intel. And users would not necessarily have to sacrifice Windows compatibility, as Win8 also runs on ARM processors. Intel's current dominance will shrink!

Why would someone move a $1000+ personal computer off onto ARM? A $300-600 one? Sure. But a $1,000 one? Probably not.

In a $999+ price zone (essentially the whole Mac line up, with the Mac mini as the only oddball exception) there is plenty of room to pay for the $/performance ratio that Intel provides with their $180+ CPUs.

The just announced ARM Cortex-A57/A53 offerings will only be offering roughly current Core i3-i5 like performance in 2014. If Apple wanted to make the Mac line up stagnant for 4 years, they'd move to ARM. That idea has worked so fabulously well for the Mac Pro over the last 3 years... just spread it to the rest of the whole line up. Brilliant *cough* (Not!!).

Apple has already got a much larger ecosystem that is going to leverage the 2014 versions of ARM. The iOS devices are going to be in an even higher potential cannibalization position at that point. Dragging the Mac back to even closer computational parity with them is close to suicidal from a Mac perspective. The only way the Mac is going to survive is by adding enough value at the $999+ price point that it finds a broad enough audience that needs it.

That means minis picking up higher end iMac workloads, iMacs picking up workloads from folks currently more restricted to xMac class PCs, the MBA 11" picking up MBP 17" like work, and Mac Pros pulling in more work that currently requires bigger iron.

Going to ARM would be switching to "race to the bottom" pricing mode. I extremely doubt Apple is even going to entertain that notion over the next 5-6 years at all.


Except for the fact that that is merely a coprocessor as you said, it is still "just another Intel chip" and thus a Mac with that would still be "just another Intel box".

The labels on the internal parts don't matter. If OS X can't make a significant differentiating case for Macs then OS X isn't very valuable. Apple would shift to an OS component that did make a difference. Porting to ARM is addressing the wrong root cause issue.

Right now OS X does have value. Mac growth has been outpacing Windows PC growth (in part because it is just eating away at Windows' portion, but Apple can eat at their share of the pie for a long time).


If you are arm flapping over the "hackintosh" issue: if Windows 8's notion of booting only from signed/secured OS images takes off, by 2014 you'll probably find that OS X just doesn't work on generic x86 clones anymore if that issue starts to get out of hand. Apple is tolerant because most folks follow the rules, and measures-countermeasures is a rat-race that consumes resources.






I do believe that Apple sooner or later will try to get rid of that notion to differentiate themselves better on the market.

You mean go back to the Mac past, where Apple used processors out of alignment with the PC mainstream and ROM dongles to protect the Mac? One, I don't think Apple is going back to that era because it really didn't turn out so well. Two, ARM is exactly the mainstream personal computer market of the next decade. They have already positioned themselves for that market with iOS.




Plus - any Xeon is produced in low numbers and without much competition on the market, therefore prices are comparatively high.

It isn't the low numbers that is the dominant factor for Xeon. Relative to the desktop Mac market they are comparable. It is the fact that they are physically much bigger that is a major driver.

It is one of those can't-get-something-for-nothing situations. If you want to do extremely well on dynamically branching/accessing scalar code/data, then you end up with stuff like the Xeon.







The criticism of Apple's ever-increasing system prices could be countered by moving to a significantly cheaper hardware basis.

Except for some over exaggerated shifts in Mac Pro pricing, overall Mac pricing is not increasing. It is roughly staying the same. Racing Mac prices down into the iOS device zone is something that Apple is extremely unlikely to do. It will do more harm than good for OS X since iOS is so vastly more entrenched in that price space with vastly larger numbers than Mac devices are.

That iOS 'buzzsaw' is aimed at the sub $900 personal computer market. Not where Macs are targeted and sold.





While those are nice incremental updates, I'm not sure it would qualify as "something really great", as Cook has put it.

You've got to be kidding me? Cook is in no way, no how, promising post-Haswell-like functionality in his comments. It will be "something great" just like the iPhone ad that says "the laws of physics are just suggestions" (as if the iPhone 5 defies the laws of physics).

Absolutely ridiculous expectation setting here. A Mac Pro with Sandy Bridge and a modern mid-to-upper range GPU card would be great.

The notion that Apple is working on some Area 51 alien technology project is loopy.



I'm not sure if Apple would go this comparatively complex route.

It is not complex if Intel/Nvidia/AMD just hand Apple an integrated component that collapses the majority of this onto a chip, or at most a PCI-e card that just plugs in.

Making Phi work on OS X is simply a matter of writing a driver to give a virtual TCP/IP connection over PCI-e to the Phi card. That is about the level of complexity. It isn't complex; it is far more a matter of just allocating the resources to get it done.

It is far more a pressing matter of how many they can sell if the card is in the $4000 range.

Of course they might use a Sandy/Ivy Xeon together with GPGPU or some ARM processors as support,

Apple already has to do Nvidia/AMD GPGPU support. And they sell a lot more of them.

But I do think that they are strategically missing the competitive workstation boat if they don't work to support Phi solutions. Perhaps not the first one, since it is priced relatively high, but betting that Intel won't eventually be competitive with the other two GPGPU vendors is something they shouldn't do. Intel is dropping tons of money into leveling the playing field on that front. And shows no signs of slowing down.

Folks laughed at Intel integrated graphics 5 years ago. Now that's all you get on Apple's most popular Mac ( MBP 13" ).






but I think it'd be more Apple to leave out that "main" (Xeon) processor completely and "only" have a multitude of ARM cores clustered - which, by the way, would probably be way cheaper than adding more hardware to an already expensive set.

It is cheaper, but you have thrown scalar performance out the window. That is not going to make the Mac competitive with those who didn't throw the baby out with the bathwater.

well change again after the recent personnel changes with Forstall gone, but that is really only speculation.

Two people left, and that is both a major personnel change and a major strategy inflection point? What are you smoking? Newsflash: those two weren't doing the work. They also weren't the primary advocates against "race to the bottom" pricing. Those folks are still there.

----------

I'm also not sure what using ARM clusters would bring to the table when OpenCL on a GPU would be a better option, and is the option we already have.

ARM chips don't cluster well (cache coherency past the package is nonexistent) to form a virtual single image. The external I/O connectivity is for relatively low end stuff like Wifi/10Mbit Ethernet and flash drives with little or no controllers. Any current setup has glue logic around it that AMD/Intel don't need on a motherboard and AMD/Intel/Nvidia don't need on a PCI-e card.


ARM clusters are being used in lower power, high efficiency data centers, like virtual hosting and/or on-demand workloads. It has traction there; not so much for top end performance, but for lowering operational costs for modest workloads.

----------

When is the (Mac Pro suitable Xeon) Ivy Bridge release date supposed to be as of this writing? This is all I could find lately, from September:

Go back to post 63 in this thread. Engadget and others ran a story in October. It is very unlikely that this is the most significant factor driving the Mac Pro's 2013 arrival time.
 
Go back to post 63 in this thread. Engadget and others ran a story in October. It is very unlikely that this is the most significant factor driving the Mac Pro's 2013 arrival time.

So what is it, then? Seems more a case of procrastination and priorities to me. They certainly can't blame it on lamination process delays like the iMac.
 