It had nothing to do with sales figures; it simply wasn't making any money. XServes, unlike the Mac Pro, were very expensive to design.
Do you not understand how manufacturing economics works?
If they're not making money, they didn't meet their minimum target sales volume (units sold = break-even point = no profit, but no loss either).
That minimum target is designed to cover the R&D, manufacturing, and all indirect costs associated with the number of units they initially order. If they don't sell that many machines, there's a loss. Anything over that generates a profit, which is where they want to be. Since they want high margins, they need to sell a fair number of systems above that point.
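If it helps, here's the arithmetic in a few lines of Python; every number is made up purely to illustrate the mechanics, not actual Apple figures:

```python
# Back-of-the-envelope break-even sketch; all figures hypothetical.
fixed_costs = 50_000_000   # R&D + tooling + indirect costs for the line (USD)
unit_cost   = 2_000        # manufacturing/support cost per unit (USD)
unit_price  = 3_000        # average selling price per unit (USD)

# Each unit sold contributes (price - cost) toward covering the fixed costs.
contribution = unit_price - unit_cost
break_even_units = fixed_costs / contribution  # profit = loss = 0 at this volume

print(f"Break-even volume: {break_even_units:,.0f} units")
# Sell fewer and you're "in the Red"; everything past it is where margins live.
```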
Now I can understand some confusion with how the numbers are generated internally, as they're figuring their targets with an expected profit margin already built in. But I'm trying to keep it as simple as possible to illustrate my point.
But since you're claiming they lost money, they didn't even meet the bare minimum target that keeps them from losing money (aka "in the Red", which comes from an old method of accounting: profits were written in black ink, losses in red).
So how can you claim that Sales Volume (units sold) does not matter?
This is what I don't get about your explanation of the EOL causality of the XServe.
Being familiar with what is in FCPX, they started this work back in 10.4. Tons of OS APIs are in place just for FCPX. The app itself is very complex, very costly to write.
Dealing with inside information again?
No one else has access to it, so without a link to a source that supports it, it's against forum guidelines, as Hellhammer recently mentioned elsewhere, and can only be viewed as pure speculation (the difference between "is" and "might" as ways of presenting information: one comes off as fact without support, the other as a conclusion that may or may not be correct in the end).
Minimal budget? No. No way in hell. Looking at FCPX it was a massive undertaking. Large portions of AVFoundation (which is not a small API) are targeted specifically at FCPX. I just don't see anything here which says minimal budget. We're looking at a ground up re-write of QuickTime for FCPX, as well as FCPX itself. Not cheap. Not cheap at all.
My point is that even if they started from scratch (assuming it uses absolutely nothing already written in OS X in terms of APIs), they didn't have to deal with spaghetti code, which tends to eat significant time during debugging phases. In my experience at least, starting from scratch allows for better expectations as to how much development time is spent on each section, and if done properly (not released as beta-ware), tends to produce a better final result.
Then consider that what they did release is beta-ware, from what I'm seeing in various posts (mainly a lack of features for the type of professional application it is, not "loaded it and got a kernel panic" sorts of things).
So it appears there was a reduction in man-hours from what it should have been = not as expensive as it would have been had they fully debugged it and added all the features professionals need at the initial release.
Now I'm not saying what they did do only cost Apple the equivalent of a couple of bags of peanuts, but it's not complete for professional software IMO. I'm actually more accustomed to such software retailing for much more than what they're charging for FCPX. Adobe and Avid both sell for at least $1k USD from what I've seen.
Now consider that electronic simulation software suites (from companies like National Instruments and Synopsys) going for $10k+ per seat isn't unheard of, and a single TCAD license (also from Synopsys) can actually cost as much as a house in some markets (tools PCB and chip designers use, respectively).
So thinking along these lines, FCPX @ $300 USD is cheap, and what seems to be an incomplete suite is the most reasonable explanation why (assuming Apple is at least trying to break even on it, rather than using it as a loss-leader to sell systems). If it is a loss-leader, they could theoretically give it away if they didn't care how much they lost (not likely, given they're in business to make money).
Now considering this and other factors (Apple focusing on the consumer product lines, loss of the XServe, ... things that have already been discussed), it's no wonder graphics professionals see FCPX as "an old, dried-out bone with no meat, thrown to an old mangy dog" = an "Apple no longer cares about the professional market as they once did, if at all" sort of sentiment.
It's an evolving app. A report that came out today basically said that all pro features will be restored except direct FCP project importing (XML importing will return).
I think they wanted to show progress, and it didn't go like they had planned. Which fits with a long history of 1.0 Apple product releases that are really beta. This is pretty classic Apple.
Bad move for professional software IMO. If Synopsys or National Instruments blew it like this, I and others in the field would raise absolute hell. Of course, now that you've seen how much this sort of software goes for, you can probably figure out why.
I guess it all comes down to what someone's definition of professional software is.
And that gets back to the point of Apple's definition vs. the rest of the world. What they call professional is pro-sumer quality, not professional by my and others' definitions in the enterprise industry.
Do you spend any real time outside of the Apple environment at all in the enterprise market?
If not, it could explain why the point I'm trying to make seems to be lost.
Yes and no. Macs are popular because they're machines that have everything you need out of the box, and a fair number of video pros like Mac OS.
Ever used a Dell Dimension? They're not well-designed machines.
For me, no system has everything I need out of the box.
I always have to add additional hardware (think memory capacity, RAID systems, and bench instrument interfaces <GPIB adapter of some sort; USB to GPIB these days, so it's no longer an internal card>). And there's the software of course...
From what I'm seeing of true graphics professionals, they also have to add hardware and software, so I don't see the MP as "ready to go" out of the box for them either.
Now I realize what you're getting at (just add software), but as you know, base systems from any vendor have bottlenecks that need to be addressed for professional users, namely increased memory and storage. Since these sorts of users add hardware to get to a usable machine, I don't agree with your sentiment on this issue.
As for the OS platform, that's dictated by the software they want to use. It just happens that in the case of OS X, that means getting a MP if they need the slots (particularly for discrete graphics cards) and HDD bays for upgrades (a hackintosh = bad idea for professionals earning a living IMO).
Despite all the talk, Apple is still working with Adobe, and Adobe is still working with Apple. CS Suite is pretty neutral territory for them. I don't see Premiere or After Effects getting dropped.
I've not said they've stopped working with one another, but there is strong evidence they don't like each other all that well. They still do it, though, as it's mutually beneficial (each company helps the other sell their products).
I don't know what to say. Macs are very common in science. It's the entire reason XGrid exists.
I don't get to university campuses much (last time was 1999), so most of my time with workstations is spent in engineering. The rest of my enterprise exposure doesn't use Macs either (mostly banking and telecommunications data centers for clients, such as Chase and AT&T).
I said all professors. That means all departments.
I thought it was overkill, but whatever.
What schools?
I'm curious to see what schools, and how many, have shifted to Macs, as out in the wild I don't see them too often. I do know of some used by medical professionals, but it's not many (local hospitals still use PCs for servers).
Friends/colleagues I keep in contact with in other areas of engineering don't either (mechanical and petroleum for example).
Coming up through engineering, I worked with a lot of Linux machines. But honestly? I saw a lot of Mac Pro cardboard boxes in the department too. And unless they just liked ordering the boxes by themselves...
What kinds of engineering?
I don't think people are using Mac Pros just for Office...
I've not seen them much in engineering (the guys that did use them <technically CS people, not engineers, doing contract work for AT&T> used Apple laptops and provided test data from cell towers in a .xls spreadsheet file). Otherwise, just office staff in engineering offices.
And Windows Server can't serve to iPads. Instant deal breaker for Apple. They have to do OS X services. And let's face it, Minis don't cut it for a lot of serving and Apple knows it. You can't avoid a powerful headless Mac.
To me, this is yet another clue that indicates Apple isn't really interested in the Server market though.
They could even choose to take OS X and license it out to other vendors (or at least one). Granted, this opens up other cans of worms (increased hardware support is a big one), but it's theoretically possible at some future date.
Specifically, I keep thinking of this possibility with the Data Center they've created for iCloud. Without XServes, they'll need to run PCs from other vendors, as shelves full of MPs isn't really the way to go IMO (too much physical space required, even if they do come out with a convertible case <one that can be used either as a desktop or rackmount if you screw the ears to it>). I suspect there just aren't enough units per rack to keep it as cost effective as slimmer rack-mountable cases would offer. Floor space and electricity usage are major factors in Data Center operational costs, and have drawn a lot of attention lately.
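To make the floor space point concrete, here's a rough sketch; the MP height is my guess since it's not a rack product, so take the exact numbers with a grain of salt:

```python
# Rough rack-density comparison; per-unit heights are assumptions,
# illustrative only.
RACK_HEIGHT_U = 42        # standard full-height rack

xserve_u  = 1             # the XServe was a 1U chassis
mac_pro_u = 12            # guess: a MP tower on a rack shelf eats roughly this much

print(RACK_HEIGHT_U // xserve_u, "XServes per rack")    # 42
print(RACK_HEIGHT_U // mac_pro_u, "Mac Pros per rack")  # 3
# Even if the MP guess is off by half, the floor space per machine is
# several times worse -- exactly the Data Center cost problem above.
```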
Why do you think Intel is working so hard to keep power consumption down while still improving the performance/cost ratio = the needs of large enterprise users?
Plus, do you really think Apple wants to give Microsoft that power over them? Even if they don't plan on strongly competing with Windows Server, they need that failsafe in place in case Microsoft screws them over.
If you consider that Apple really is getting away from servers, it all makes sense. MS won't have power over what Apple does, as Apple won't be in that market.
The only area where I can see Apple needing server capabilities is with the iCloud services, and you've already pointed out that they're not really reliant on what OS X Server even is. Which means Apple already has a solution to this problem.
Hardware is the only area that the Data Center will run into issues, and Apple could easily get that fixed by hacking their own OS + whatever they've created to handle iCloud servicing to run on whatever specific machines they'll be running.
Which may be why Apple wants a rumored custom chip. Stop supporting the ultra high end of the workstation market, bring it down to the large majority of pros. It would differentiate the Mac Pro as well.
They wouldn't need a custom chip for this though. Just select something that's already available, particularly a single-processor based design.
The problem is, it's going to have to be a Sandy Bridge E based part to keep the performance anywhere near what users already have with current SP MPs.
If they selected an LGA1155 part for example, it would be a step backward vs. what users are accustomed to now in a base SP Quad. Yes, the SBs used in the current iMac are quick in terms of single-threaded performance, but they fall flat on their faces for I/O throughput, and for multi-threaded performance past what 4 cores can do. These versions of SB are meant primarily for mainstream computers and will also be used as low-end Xeons (there is an LGA1155 Xeon due out).
So I can see where the idea of a custom chip comes in. The problem is, for a truly custom chip (totally different part from the ground up), it means a lot of R&D cost, and that's not likely divided by enough systems if it's only available in a MP.
Why do you think Intel is willing to sell their chips to multiple vendors?
It's not because they think it's a nice thing to do.

It's so they can recover their costs and still sell parts (it keeps the R&D per unit low enough that they're affordable to system vendors, and ultimately users, once all the costs and margins are added). They have to use economy of scale to keep it affordable. I don't know about you, but I'm not willing to pay hundreds of thousands or more per CPU (it all comes down to how many units will be manufactured).
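A quick sketch of why (the R&D figure is a placeholder, not an actual Intel or Apple number):

```python
# Why volume matters for a custom CPU; the R&D figure is a guess,
# just to show the scale of the amortization effect.
rnd_cost = 500_000_000    # hypothetical R&D for a new CPU design (USD)

for units in (5_000_000, 100_000, 10_000):
    print(f"{units:>9,} units -> ${rnd_cost / units:>9,.0f} of R&D per chip")
# At Intel-wide volumes the R&D adds ~$100 per chip; at MP-only volumes
# it's thousands to tens of thousands per chip before a single transistor
# is even manufactured.
```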
Now if they do a "custom" chip that's based on what they've already designed, such as the Sandy Bridge E, that is possible. But there are still cost issues to deal with, and to keep them low enough to be viable, there are either small changes (e.g. skip the IHS, which doesn't change the circuit topology), or something gets cut (e.g. eliminate the additional QPIs so it's an SP-only design).
The reasoning behind this is costs. They won't be reduced much, if at all (no IHS = naked = won't save much). Where the "if at all" comes in is with cutting it down to a reduced part (lowering the transistor count as a means of getting more parts per wafer), as the cost could even increase a bit, since there's still additional R&D involved that will only be divided across Apple's machines.
It all depends on how much smaller it gets as to whether or not Intel can get a significant increase in yield per wafer (more parts per wafer = less production cost per part, as the cost of shooting a wafer is roughly the same whether it holds 1 part or 1,000 for the same process and layer count; there are small differences if less material is used, but the time and energy expenses for the process are the same). Now we're talking about a special run for Apple (Intel is quite willing to do this), but even if it works out (they can get more parts per wafer), Apple still has to order enough of them to make it financially viable (that's how they'd keep the cost per unit down, as there are minimum part counts to make it viable - why parts are made in lots).
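Roughly, the wafer math works like this (every number here is a guess, just to show the mechanism):

```python
import math

# Wafer arithmetic (all numbers hypothetical): processing a wafer costs
# roughly the same no matter how many dice are on it, so a smaller die
# = more parts per wafer = lower production cost per part.
wafer_cost = 5_000                 # cost to shoot one 300mm wafer (USD, guess)
wafer_area = math.pi * 150 ** 2    # ~70,686 mm^2, ignoring edge loss

for die_area in (435, 300):        # e.g. a full SB-E-class die vs. a cut-down one
    dice = wafer_area // die_area
    print(f"{die_area} mm^2 die: ~{dice:.0f}/wafer, ~${wafer_cost / dice:,.2f} each")
```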
Now as time goes on, a future ARM based design could get them most of the way there (PA Semi is an ARM licensee), with modifications, if the ARM design being considered is actually sufficient for workstation use. But this isn't possible right now.
Either way (now or in the future), it comes back to how many MPs they can sell as to whether or not it's profitable.
In the case of the iPhone, it works out, and I suspect it could for a future consumer grade Mac (a laptop, or even the iMac, if they skip Intel or AMD), because they have sales volumes high enough to make it feasible.
Unfortunately, the available information on the MP doesn't lead me to think this is the case.
Again, I think the general point here that you are missing is that if Apple is willing to sink money into redesigning something, it's probably not going anywhere.
I'm not sure they've sunk that much into FCPX, given what it is, and what they're selling it for (see above).
Speaking from the academia side of things - I see a great many Macs in science and quantitative fields. A fairly heavy representation in places that benefit well from it being "UNIX but friendly". Statistics, bioinformatics, etc.
I don't see much of this now, but the school may matter, as may the 12 years that have passed since I was last even on a university campus.
Back then, they mainly used Linux on PCs from what I saw (I didn't see any Macs in their offices, or in the homes of those who invited me over, either). They usually wrote the software themselves, as there wasn't anything available commercially (it's research after all, so it's usually on something that hasn't already been discovered, save peer review).
Less so engineering, and the fields that take engineers. I blame MATLAB
Blame MATLAB!?!?! Did you fall off your rocker and end up brain damaged?
Seriously though, I suspect it's the most common professional software application across multiple engineering disciplines, and for good reason (it handles the stuff a spreadsheet can't).
