Like-new 1965 Corvette doesn't cost more now than back in 1965?
A like-new Spider-Man #5 doesn't cost more now than back then?
That Apple I units aren't being auctioned off for tens of thousands of dollars?
Merely more cherry-picking on your part, all of it outside the context here of Information Technology.
And even so, if we ignore prices driven by relative scarcity (hence, the collectibles market you listed) and focus on the capability a product affords, the performance of a 1965 Corvette is matched by many modern cars costing less than $29,500 (which is roughly what $4,100 in 1965 dollars represents after inflation).
Many items change price over time in ways that aren't always monotonically downward.
But the context here is IT, not "All Goods". Due to factors such as those described by Moore's Law, the price : performance trend for digital information technologies has most indisputably been downward.
Case in point: the MSRP of the Apple ][ was $1,298 (with 4K of RAM) ... that's substantially higher than a Mac mini today, even before any inflation considerations.
And if we do include inflation, it's $4,848.66 ... which is $1K more than the price of a 12-core (dual-CPU) Mac Pro today.
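For anyone who wants to check the arithmetic, here's a minimal sketch of the CPI-ratio conversion behind both of those figures. The CPI values below are approximate annual averages that I'm assuming, so the results land near (not exactly on) the quoted numbers:

# Rough sanity check of the inflation figures above (illustrative only;
# CPI values are approximate U.S. CPI-U annual averages, not an official series).
CPI = {1965: 31.5, 1977: 60.6, 2013: 233.0}

def adjust(price, from_year, to_year=2013):
    # Scale a historical price into to_year dollars using the CPI ratio.
    return price * CPI[to_year] / CPI[from_year]

print(f"1965 Corvette at $4,100 -> ${adjust(4100, 1965):,.0f} in 2013 dollars")
print(f"1977 Apple ][ at $1,298 -> ${adjust(1298, 1977):,.0f} in 2013 dollars")

Depending on which CPI series and end year you pick, you land within a few percent of the $29,500 and $4,848.66 figures quoted above.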
what happened with FCPX
Highly unlikely. That's the quite old waterfall model, which typically leads to massive specs and failed projects. Far more likely they were actually following modern software development practices like
http://en.wikipedia.org/wiki/Extreme_programming
http://en.wikipedia.org/wiki/Agile_software_development
where the objective is *not* to come out with a "Big Bang" release. Just like how iOS and OS X are now on yearly updates. Same thing. There likely never was an intention for the initial release to match feature-for-feature 100%. That would have been an extremely goofy plan for a ground-up rewrite.
The evidence that leaked out over time is that they threw out the non-software folks who insisted that "Big Bang" was the only possible way and/or that legacy hardware features were the highest priority. They then proceeded to finish a ground-up rewrite project.
Sorry, heard it all before ... it's just the same old "Evolution vs Revolution" routine, merely with new buzzwords and the same old debates over how to perform a transition. The real question is whether "as cheaply as possible" gets higher priority than doing it right, including graceful legacy support planning. Given how well Apple has handled this sort of transition before, FCPX was a clear disaster for them.
The problem is that rapid incremental spirals only work for major rewrites if you don't promptly burn the old app to the ground upon the release of your first spiral ... and even then, you still have to actually deliver your additional spirals rapidly enough to build up the capability to support the legacy application in order to properly sunset it.
If there's a publicly published Road Map that details which features are expected to be restored and when, then you have reasonable confidence that there's a commitment ...
without it, you have the FCPX situation.
What they didn't do is fork the development team into two groups: maintainers of the old and developers of the new.
Which happens all the time ... frequently dismissed as 'not possible' due to some vague resource constraint, and additionally challenged as 'unnecessary' because the new team will supposedly be so successful and spiral so fast that the users of the legacy product will all jump immediately, thereby making investment in legacy support 'unjustified'.
Corporations are not immune to procrastination. The reality is that OS X belongs to Apple, so the big-picture window of when folks need to move to stay aligned is driven by Apple, because it belongs to them.
Let's be more cynical: corporations are not immune to being cheapskates, particularly when some upper-middle-level VP can score a big bonus for himself next year by slashing staff this year ... and if the train wreck that occurs some quarters later can be blamed on someone else ("failing to embrace & execute the vision") instead of on his own decisions, that's icing on the cake.
How this applies here is that Apple can 'big picture' all they want, but it is still up to the individual business units (particularly 3rd parties) to decide if, when, and how to invest in said 'vision' ... and unless there's some really clear business case that shows a huge advantage to being an early mover, the tendency will be to procrastinate and defer the expense.
Right, so making the lower-priced entries in the product portfolio deliver more performance is a very good thing. Again, the Mac Pro is largely aligning with the rest of the Mac lineup in going OpenCL-capable top-to-bottom. The Mac Pro has a product-differentiation edge in that it can go higher up the performance curve, but the issue is that you have to be on the curve somewhere to be a Mac at all in 2013 and going forward.
Sure, except that we've been aggressively preached to about how the Mac Pro is a minuscule percentage of total Macintosh sales ... as such, having the last 2% of product sales finally 'align' with the rest of the product line-up smacks of a "Tail wagging the Dog" rationalization.
"The practical implications are that the only companies who will risk the additional investments are those who have meaningful competition with other software vendors so as to have a sales upside to benefit from higher internal costs."
No. Software companies need to follow where the customers are going.
Unfortunately, 80%-90% of the personal computer industry illustrates otherwise: very few of the WinTel hardware clone makers make any significant investment in advancing the future direction -- they compete in a simple commodity market largely based on the software that Microsoft hands over to them in the form of Windows, IE and Office.
If a software company primarily targets only upper-end Mac Pro users, and those users are now buying lower-end Macs, then not certifying and/or optimizing on those "new" platforms will eventually mean lost business.
Only if you choose to interpret the statement in a reductio ad absurdum fashion. As we've discussed, the set of products whose performance is suitable for upper-tier work has grown, which means the target customer demographic is no longer just the top-end (dual-CPU-only) Mac Pro, but now also encompasses all of the Mac Pros, as well as the 27" iMac configuration, a few similarly solid laptops, etc ... it simply excludes the bottom-tier hardware because their customer demographic won't buy that low.
A lot of niche software companies get into a death spiral where they lose volume and pass the costs on to fewer customers ... which gets them even fewer customers, and so on.
Correct, but not applicable per the above. In any event, the same software company runs the risk of losing customers if they cater to the lowest common denominator and in doing so, end up with a non-compelling product. Sure, they'll have something that they can sell to the masses if they price it cheap enough, but the lack of performance features means that the more demanding customer demographic will drop them and go upscale.
With software developers who have multi-platform applications ... long term, it really doesn't. The question is more when to make the transition, not if.
Actually, multiple platform support raises its own ball of hairy wax -- to what degree to drive to a "lowest common denominator" of common code so as to minimize their in-house development and support costs. Once again, so long as the vendor isn't being pressured by their customers (plateau factor) or competition (loss of sales), there's no strong motivation with which to justify incurring the expenses of tackling the issue anytime soon.
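To make the "lowest common denominator" point concrete, here's a purely illustrative sketch (the backend classes and feature names are invented, not any vendor's real API) of how a shared cross-platform layer ends up exposing only the features that every backend happens to support:

# Illustrative only: invented backends and feature names, not a real API.
class MetalBackend:
    features = {"gpu_filter", "gpu_scale", "background_render"}
    def run(self, op, clip):
        return f"Metal:{op}({clip})"

class GenericBackend:
    features = {"gpu_filter", "gpu_scale"}   # no background render on this one
    def run(self, op, clip):
        return f"Generic:{op}({clip})"

class CommonCodeLayer:
    # The shared layer only exposes operations every backend supports,
    # so anything platform-specific quietly drops out of the product.
    def __init__(self, backends):
        self.backends = backends
        self.common_ops = set.intersection(*(b.features for b in backends))

    def run(self, op, clip):
        if op not in self.common_ops:
            raise NotImplementedError(f"'{op}' isn't in the common feature set")
        return [b.run(op, clip) for b in self.backends]

engine = CommonCodeLayer([MetalBackend(), GenericBackend()])
print(engine.common_ops)                  # 'background_render' never makes it in
print(engine.run("gpu_filter", "clip1"))

The smaller that intersection, the lower the in-house cost ... and the more likely the demanding customer demographic described above ends up with a non-compelling product and goes elsewhere.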
-hh