Which still doesn't address that this is not what he said.




False. Most iXXX products are not contract driven. iPods? No. iMac? No. iPad? No (in fact, by default you get a "no contract" 3G service arrangement... if it even has a 3G radio in it at all). If you want to throw in the Apple TV, even though it doesn't technically have an "i" in the name... again, no.

Yet again, another example of a sweeping generalization from the iPhone onto the whole iStuff economy. Not even remotely true.



Another one of the narcissistically driven arguments, completely detached from the relative sizes of the different markets Apple's solutions are being targeted at.




Format over function will probably get disconnected from future Apple designs.



Rigidly committed to dogma as opposed to pragmatic and solution-focused? Probably.

GNU Hurd is largely a historical footnote, and Linux took over as the dominant open system. I can also see Apple tossing GPLv3 stuff when they can.

That and Apple (and most users) going in different directions.

There is little to no reason why folks can't use multiple boxes to solve storage problems. The classic iPod model was built that way. External doesn't necessarily mean 1,000 miles away. But 1 inch away isn't necessarily always better either. Right tool for the right problem.

Except I responded to what I thought he said... if you don't like it, that's really just too bad.

Tell that to all the customers who buy their stuff through their cell provider. I'd guess well over half, probably 70%, of iPhones are bought on contract. I'd bet that well over half the 3G iPads are. Obviously the WiFi-only ones would not be.

In the end, I care not at all what direction most users, let alone Apple users, are going in.

Again, if you don't like my opinion, and think all opinion is really narcissistic, you can go fly a kite.
 
The average user doesn't know what they want.

I can't agree. This might be true of iMac users but it almost certainly doesn't apply to MacPro users.
Think about it:
"I dunno what I want so I'm going to go spend between $4,000 and $7,000 without doing any research or assessing my needs and wants"?
Sure, Apple users are generally more gullible and uninformed but not by THAT much...!!!

This seems to say the same thing: https://forums.macrumors.com/posts/17511919/
 
A few people mentioned Apple stock so I decided to look. In the past, every time Apple announced an actually new Mac Pro, their stock shot up noticeably. This time it's taken a dive. I wonder if that has anything to do with the Mac Pro and cloud-centric designs? I think it does, myself.

You are smoking drugs if you do. It isn't coupled at all. Every time Apple does a major dog-and-pony show and doesn't do one of the two major revenue drivers (iPhone or iPad), the stock dips. Primarily, it is the normal "buy on rumors, sell on news" effect.

The Mac Pro is a complete non-factor in Apple's stock price. The only way talking a lot about the Mac Pro would reduce the stock price is by creating confusion as to why Apple is spending so much time talking about it when it makes no difference to the stock price. (Market watchers would wonder whether Apple has its eye on the ball, or is checking out the geeky cheerleader up in the stands.)

That some folks are flustered about the iOS 7 changes would also have an impact.

As far as being cloud-centric being a problem: LOL. The iCloud/Store segment of Apple is on track to pass the Mac systems segment in revenue this year (or maybe next, depending on how this year's Mac systems do). So it would be iPhone, iPad, iCloud/Store, and then Mac. That growth is what is helping keep the stock up.

Apple's stock price was a bubble. This is only rational pricing gaining a bigger foothold, coupled with Apple starting to issue substantive dividends (so stock owners can make money without having to sell any Apple stock to do so. Real, actual money; not unrealized gains.)
 
I can't agree.

There is some truth to it, in that while users absolutely know what they want to achieve, they may not always be aware of how best to achieve it, what's possible with new technology, where things are moving, etc.


I don't think you can correlate the stock market to a single event like the release of a Mac Pro with any certainty; look at the last 6 months, for example.
 
The average user doesn't know what they want.

So no, I don't think Apple is too concerned.

It isn't actually "want" so much as "need"/"requirement" (there is an identified group of folks, sufficiently large, that has a common root-cause problem they need solved).

The Henry Ford quote:

"If I had asked people what they wanted, they would have said faster horses."

drives at the difference between wants and needs/requirements. What people needed was faster transportation, not necessarily horses. Once again, it is a basic human cognition trap that is often engaged: not generalizing in a general fashion driven by root causes/constraints, but extrapolating from too narrow a set of corner cases and/or specifics.

Folks who say "what I want is a requirement" don't really get it.

----------

Since when have Apple cared what people think, regardless of the overall response?

Apple cares about what people think when they are actually using the products. Apple listens to feedback. They may or may not think it is good feedback, but they do watch and pay attention to what problems people are trying to solve. But they also look at a broad spectrum of folks, so there isn't one single best answer that covers everyone.

Armchair, Friday-evening or Monday-morning quarterbacking about "if I were in charge of product development, or CEO, or had Steve powers"? Do they care what folks think about that? No, they don't.

Primarily, they care about what people are going to buy, not just think about.
 
Apple has something that none of us have... data. Lots of data about how currently deployed Macs are configured and used. They know how much disk storage is configured, how much memory is installed. They know what apps are used, and in what combinations. They know how many times CDs and DVDs are mounted and unmounted. Etc. etc. etc.

Really? So they know us better than we know ourselves. By the way, if that's the case, where did this come from?


----------

Yup, entertaining is the right word for it too. :p

A few people mentioned Apple stock so I decided to look. In the past, every time Apple announced an actually new Mac Pro, their stock shot up noticeably. This time it's taken a dive. I wonder if that has anything to do with the Mac Pro and cloud-centric designs? I think it does, myself. Everything could change based on the entry price. I guess most people are assuming >$4K, or over $3.5K for sure? Anyway, here's the market reaction:

I read in some of your other posts that you like this design. Are you saying that you would buy one, but you acknowledge that it is unpopular and may not do well?
 
Toss a-e out and the number of unique complaints will plummet. There are more than just a small few, but a major fraction of the bulk is the above.

lol you spent a lot of time making everyone's arguments sound like crap.

I'm going to go with price here too! I want a Haswell i7 desktop with 16GB of RAM and room for 64GB that doesn't cost >$1600. Hell, for $1500 I'd take an i5 4670 rig from Apple. And I'd totally expect 2 PCIe x16 slots on it - one for a video card and one for something else (possibly).

I don't want a half inch thin desktop that belongs in a Bang and Olufsen catalog... I want an affordable desktop and reasonable performance!

But Apple doesn't make it, so I don't even care anymore. lol, just enough to post this and... yeah, whatever.

I like the mac pro enough that if I won the lottery I'd buy one.
 
There was also this poll of the macrumors forum which is about 3:2 against the new design.

yeah.. i don't know. i didn't even read all the comments or threads etc prior to posting that.. ssdd i imagine.

seriously though.. give me a fast computer that does what it's supposed to do every time i ask it to.. 24/7 when need be.. a stable OS

that's all i actually need from the computer.. (and while i may be in the minority here with these sentiments, i honestly believe most of the pro world feels similar)

the rest of the stuff- the stuff that really matters when seeking real productivity gains is the software.. engage the devs if you're really trying to make work better
 
I'm not a computer specialist and probably don't even know half of what you guys know when it comes to the technical part, but couldn't another argument be that the current MP also has its limitations when it comes to upgrading? It's not as if you can keep upgrading it for years to come; either way, a case holding all of your hardware is going to have its limitations when it comes to space, connections for future hardware, etc.

Keeping all that in mind, isn't the nMP just as upgradable as the current MP? After all, you can keep expanding it with external hardware over the years, which could give it a life cycle of more than 3 years and therefore comparable with the current one. All systems have their limitations, I guess. Sure, the current MP is upgradable, but let's be honest: it's not as if there are tons of CPU or video card options out there in comparison with the PC market...

Yes, all lifespans are ultimately finite. The new Mac Pro's "best value" assessment and its lifecycle costs (and the management thereof) are presently unknown, including whether it will be worse (or better) than the legacy Mac Pro. The answer will ultimately come down to where Apple chooses to price the product (including its options), which we will know after it launches.

However, there is some conjecture based on known principles that can be started today: the Tube inherently has fewer upgrade options available to it than the current (legacy) design, and the cost to utilize these can only be higher, due to a combination of its basically "external-only" design constraint and its new unique/proprietary interfaces, including the 'Thunderbolt Tax'.

Yes, this suggests a higher lifecycle cost, which represents a decline in value and is a bad thing from the end item consumer's perspective.

But time will tell. For example, if Apple were to somehow bring in the new Mac Pro at $999, the MR response would probably be very positive and suggest that it's a huge game-changer.
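The lifecycle-cost argument above can be made concrete with a toy calculation. Every dollar figure here is invented purely for illustration; none of it is actual Apple or vendor pricing, and the $150 "Thunderbolt Tax" per upgrade is a made-up assumption:

```python
# Hypothetical lifecycle-cost sketch: all dollar figures are invented
# for illustration and are not actual Apple or vendor pricing.

def lifecycle_cost(base_price, upgrade_costs):
    """Total cost of ownership = purchase price + all upgrade costs."""
    return base_price + sum(upgrade_costs)

# Legacy tower: internal bays/slots, so upgrades cost roughly the bare
# component price (disk, RAM, GPU).
legacy = lifecycle_cost(2500, [200, 150, 400])

# Tube: same components, but each external upgrade also needs a
# Thunderbolt enclosure (the 'Thunderbolt Tax'), modeled as +$150 each.
tube = lifecycle_cost(2500, [200 + 150, 150 + 150, 400 + 150])

print(legacy, tube, tube - legacy)  # 3250 3700 450
```

The point of the sketch is simply that an identical purchase price can still diverge in total cost once every upgrade carries an enclosure premium; where Apple actually prices things will determine whether that premium matters.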


To me, the Mac Pro announcement at WWDC seemed to be a message aimed at pro app developers: Adopt OpenCL or get out.

Pretty effective message if you ask me.

Or would it have been more effective to just buy Adobe?

Final Cut Pro X employs OpenCL (and GCD)... but given its reception at launch, any suggestion that Apple has been successful in "leading by example" has some major shortcomings.

In general, I'm inclined to suspect that OpenCL has probably been getting more traction in iOS than in OS X, if for no other reason than that there's a smaller base of "just works - leave it alone" legacy code that gets retained from update to update.

Maybe 10.9 will be able to break through, but considering that OpenCL & GCD date from 10.6, it has been a long time coming.


-hh
 
Impossible. EU law prevented them from doing this and a change was inevitable.

EU law required a few minor tweaks to the design to incorporate a fan guard and some electrical changes. The EU have not banned tower PCs! Yes, there would have been some expenses associated with such a change, but they would have been easily justified if the Mac Pro had been making serious money.

The EU situation is just additional evidence that Mac Pro sales were low and heading south.
 
...
However, there is some conjecture based on known principles that can be started today: the Tube inherently has fewer upgrade options available to it than the current (legacy) design,

When are upgrade options most needed in a product's lifecycle: on launch day, 2-3 years out, or 4-5 years out?

and the cost to utilize these can only be higher,

Given the above, when do these costs actually occur?

due to a combination of its basically "external-only" design constraint and its new unique/proprietary interfaces, including the 'Thunderbolt Tax'.

For how many more years is Thunderbolt going to be new?


Yes, this suggests a higher lifecycle cost, which represents a decline in value and is a bad thing from the end item consumer's perspective.

Costs don't necessarily go down over time. RAM for the 2008 Mac Pro is more expensive than for the more current models. Many upgrade components go through a high-low-high price cycle over time.


Final Cut Pro X employs OpenCL (and GCD)... but given its reception at launch, any suggestion that Apple has been successful in "leading by example" has some major shortcomings.

In the FCPX brouhaha there are few if any complaints about the technical use of OS X libraries. About ignoring tape sunk costs? Sure. About the multi-camera editing? Sure. About totally new project files and organization? Sure. But complaints that OpenCL is evil and GCD is bad? No. Even many folks who didn't like those things said it was fast.


Maybe 10.9 will be able to break through, but considering that OpenCL & GCD date from 10.6, it has been a long time coming.

As individual components yes, but not as a unified solution:

" ... Going beyond the standard, OS X v10.7 adds integration between OpenCL, Grand Central Dispatch and Xcode ... "
https://developer.apple.com/library...L_MacProgGuide/Introduction/Introduction.html

Any app shooting for backward compatibility with 10.6 has to give up this integration (or code to the libraries twice). It has been an evolutionary expansion of individual subcomponents of the OS X libraries and of the underlying deployed hardware infrastructure.

10.9 (and broader-spectrum OpenCL drivers), enabled across the entire 2013 hardware lineup, sends a message to any developer who doesn't have their head in the sand.

I think Apple expects that if they introduce a library in 10.x, then by 10.(x+3) developers should get serious about leveraging it. Sure, everyone can't be an early adopter, but this is far from an aggressive adoption expectation.
 
In general, I'm inclined to suspect that OpenCL has probably been getting more traction in iOS than in OS X, if for no other reason than that there's a smaller base of "just works - leave it alone" legacy code that gets retained from update to update.

That would be odd, considering iOS doesn't support OpenCL.

10.9 (and broader-spectrum OpenCL drivers), enabled across the entire 2013 hardware lineup, sends a message to any developer who doesn't have their head in the sand.

This. The message from Apple is pretty clear. If you don't support OpenCL, you need to now.

The great thing is that if you support OpenCL, the new Mac Pro is competitive with more top-end workstations.
 
When are upgrade options most needed in a product's lifecycle: on launch day, 2-3 years out, or 4-5 years out?

Depends on the use case.

Given the above, when do these costs actually occur?

Ibid.

For how many more years is Thunderbolt going to be new?

Just which historical use case do we want to use for guidance? For example, if we use SCSI and FW, the estimate will be at least three more years.


Costs don't necessarily go down over time. RAM for the 2008 Mac Pro is more expensive than for the more current models. Many upgrade components go through a high-low-high price cycle over time.

General trends are not invalidated by cherry-picked exceptions.

In the FCPX brouhaha there are few if any complaints about the technical use of OS X libraries. About ignoring tape sunk costs? Sure. About the multi-camera editing? Sure. About totally new project files and organization? Sure. But complaints that OpenCL is evil and GCD is bad? No. Even many folks who didn't like those things said it was fast.

The old adage "Speed {Power} is Nothing Without Control" applies. There were two main complaints with how Apple mismanaged the FCPX program: one was a failure to adequately communicate the deprecation of features prior to their termination, and the second was the (subsequently reversed) termination of the prior version.

IMO, what probably really happened inside of Apple was that the original program plan promised an equal or better replacement, but as the project ran over budget and behind schedule it was de-scoped, and the plans for the features which would have maintained parity were eliminated.

In any case, the real point is that Apple's opportunity is to lead by example ... and the merits of the technology (ditto its downside risks) can be gauged by how well they themselves do this. So while it is a Good Thing that OpenCL is in FCPX, is it also now in iTunes? In iPhoto? In iMovie? In iEtc?

GCD dates from 10.6 Snow Leopard (2009)

As individual components yes, but not as a unified solution:

" ... Going beyond the standard, OS X v10.7 adds integration between OpenCL, Grand Central Dispatch and Xcode ... "
https://developer.apple.com/library...L_MacProgGuide/Introduction/Introduction.html

Any app shooting for backward compatibility with 10.6 has to give up this integration (or code to the libraries twice). It has been an evolutionary expansion of individual subcomponents of the OS X libraries and of the underlying deployed hardware infrastructure.

10.9 (and broader-spectrum OpenCL drivers), enabled across the entire 2013 hardware lineup, sends a message to any developer who doesn't have their head in the sand.

I think Apple expects that if they introduce a library in 10.x, then by 10.(x+3) developers should get serious about leveraging it. Sure, everyone can't be an early adopter, but this is far from an aggressive adoption expectation.

One would certainly hope so by ~5 years after its first Golden Master roll-out... but time will tell with the likes of Adobe as to whether they'll really start to invest in it, or whether this will become yet another Carbon/Cocoa mess where nothing less than forced obsolescence prompts change (reinvestment)...

{EDIT2} ...and another factor in whether companies will be motivated to make this additional investment is the same observation that's been made before: with customer performance requirements hitting a plateau, there's minimal customer demand for faster. The practical implication is that the only companies who will risk the additional investment are those who have meaningful competition with other software vendors, so as to have a sales upside to offset the higher internal costs. Feel free to review the relevant niche segments to see who this might be. For example, who competes against Photoshop to be a motivation for Adobe? Businesses don't spend money just to spend money.


That would be odd, considering iOS doesn't support OpenCL.

Sorry, I misread a webpage on the subject. In any case, there has been evidence of interest in OpenCL on iOS for quite a while ... one of the webpages I found while trying to cross-check this noted that a beta of iOS 4.3 had hooks.


The message from Apple is pretty clear. If you don't support OpenCL, you need to now...

Although there are also questions about how it competes (on multiple levels) with NVIDIA's CUDA...



-hh
 
That would be odd, considering iOS doesn't support OpenCL.

As a 3rd party developer API? iOS is likely somewhere Apple would run this for a generation purely as an internal facility for iOS libraries (i.e., just Apple's code).

They also have a problem with a glut of legacy hardware (more iOS devices have been sold than Macs over the Mac's entire lifetime; the oldest of them are just as old as the older Mac Pro models being discussed in these forums).

But the hardware is ready

http://withimagination.imgtec.com/index.php/news/powervr-sgx-cores-get-opencl-conformance

http://semiaccurate.com/2012/05/14/imagination-makes-the-case-for-mobile-opencl/


This. The message from Apple is pretty clear. If you don't support OpenCL, you need to now.

It should be on the current "next app update" release window. For whatever they're planning to ship in Q2 '13 - Q2 '14, they should leverage OpenCL where it fits, or leverage the OS libraries that enable it.

I don't think Apple expects developers to change gears and be ready to fully exploit it by the 10.9 launch. Coupled with the likely de-support of 10.6, though, the message should be pretty clear.

It will take several months for 10.9 to roll out and push the deployed percentages of 10.6 (and lower) down substantively more.


The great thing is that if you support OpenCL, the new Mac Pro is competitive with more top-end workstations.

Especially since those platforms have already been there. This is more of a "keeping up with the Joneses" thing. Apple moving to OpenGL 4 (which a good fraction of the recently deployed user base has hardware for) and making things more uniform for OpenCL should make this a more uniform target to aim at than the "everything for everybody" Windows market.
 
As a 3rd party developer API? iOS is likely somewhere Apple would run this for a generation purely as an internal facility for iOS libraries (i.e., just Apple's code).

Nope. It is possible there is internal support somewhere within Apple.

Developers have been waiting excitedly for the support to come. But it's not here yet.

My best guess is that Apple is worried about the sandbox/security implications. Probably the same reason apps can't use JIT JavaScript on iOS.

OpenCL is interesting in that Apple's entire support of it has been based on their support of the Mac Pro market. There are no secondary motivations for it right now; Apple is pushing it entirely for pros.
 
....
General trends are not invalidated by cherry-picked exceptions.

Like-new 1965 Corvette doesn't cost more now than back in 1965?
A like-new Spider-Man #5 doesn't cost more now than back then?

And Apple I machines aren't being auctioned off for tens of thousands of dollars?

Many items change price over time in ways that aren't always monotonically down.
In my book that is general. I didn't put a fixed-width time window on the effect.


IMO, what probably really happened inside of Apple was that the original program plan promised an equal or better replacement, but as the project ran over budget and behind schedule it was de-scoped, and the plans for the features which would have maintained parity were eliminated.

Highly unlikely. That is the quite-old waterfall model that typically leads to massive specs and failed projects. Far more likely they were actually following modern software development practices like

http://en.wikipedia.org/wiki/Extreme_programming

http://en.wikipedia.org/wiki/Agile_software_development

The objective there is *not* to come out with a "Big Bang" release. Just like how iOS and OS X are now on yearly updates. Same thing. There likely never was an intention for the release to match feature-for-feature 100%. That would have been an extremely goofy plan for a ground-up rewrite.

The evidence that leaked out over time suggests they threw out the non-software folks who insisted that "Big Bang" was the only possible way and/or that legacy hardware features were the highest priority. They then proceeded to finish a ground-up rewrite project.

What they didn't do is fork the development team into two groups: maintainers of the old and developers of the new.







So while it is a Good Thing that OpenCL is in FCPX, is it also now in iTunes? In iPhoto? In iMovie? In iEtc?


http://www.geeks3d.com/20130611/apple-adds-opengl-4-support-in-os-x-10-9-mavericks/

In particular, note the comments there about Apple extensions that provision moving to some custom OpenCL additions:


CLUT computations (presuming cl_APPLE_clut means what the acronym usually stands for),

2D and 3D images into buffers (cl_khr_image2d_from_buffer and cl_khr_3d_image_writes),

and

unified memory addressing.



A "needs to get fixed before launch" aspect is the lower AMD OpenCL performance reported there. They probably won't close the gap completely, but there is probably work to do that isn't done yet.


but time will tell with the likes of Adobe as to if they'll really start to invest in it, or if this will become yet another Carbon/Cocoa mess where nothing less than forced obsolescence prompted change (reinvestment)...

Corporations are not immune to procrastination. The reality is that OS X belongs to Apple, so the big-picture window of when folks need to move to stay aligned is driven by Apple, because it belongs to them.




{EDIT2} ...and another factor in if companies will be motivated to make this additional investment is the same observation that's been made before: with customer performance requirements hitting a plateau, there's minimal customer demand for faster.


Right, so making the lower-priced entries in the product portfolio deliver more performance is a very good thing. Again, the Mac Pro is largely aligning with the rest of the Mac lineup in going OpenCL-capable top to bottom. The Mac Pro has a product-differentiation edge in that it can go higher up the performance curve, but the point is you have to be on the curve somewhere to be a Mac at all in 2013 and going forward.





The practical implication is that the only companies who will risk the additional investment are those who have meaningful competition with other software vendors, so as to have a sales upside to offset the higher internal costs.

No. Software companies need to follow where the customers are going. If a software company primarily targets only upper-end Mac Pro users, and now those users are buying lower-end Macs, not certifying and/or optimizing on those "new" platforms will eventually mean lost business.

A lot of niche software companies get into a death spiral where they lose volume, pass the costs on to fewer customers... which gets them fewer customers, and .....



Although there's also questions on how it competes (on multiple levels) with NVIDIA's CUDA..

With software developers who have multiple-platform applications... long term, it really doesn't. The question is when to make the transition, not if.
 
Like-new 1965 Corvette doesn't cost more now than back in 1965?
A like-new Spider-Man #5 doesn't cost more now than back then?

And Apple I machines aren't being auctioned off for tens of thousands of dollars?

Merely more cherry-picking on your part, all of which is outside the context here of Information Technologies.

And even so, if we ignore prices due to relative scarcity (hence, the market for collectibles that you listed) and focus on the capability afforded by a product, the performance akin to a 1965 Corvette is achieved by many modern cars at a cost less than $29,500 (after inflation, what $4100 from 1965 represents).

Many items change price over time in ways that aren't always monotonically down.

But the context here is IT, not "All Goods". Due to factors explained such as by Moore's Law, the price : performance trend for digital information technologies has most indisputably been down.

Case in point: the MSRP of the Apple ][ was $1298 (4K RAM) ... that's substantially higher than a Mac mini today, even before any inflation considerations.

And if we do include inflation, it's $4,848.66 ... which is $1K more than the price of a 12-core (dual-CPU) Mac Pro today.
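For anyone who wants to check the arithmetic, the inflation multipliers can be back-computed from the figures quoted in this exchange (note: the multipliers come from the posts' own numbers, not from an independent CPI lookup):

```python
# Back-compute the inflation multiplier implied by the quoted figures:
# $1,298 (Apple ][, 1977) -> $4,848.66 in today's dollars.
apple2_then = 1298.00
apple2_now = 4848.66
print(round(apple2_now / apple2_then, 3))  # 3.735, i.e. roughly 3.7x since 1977

# Same check for the 1965 Corvette figures: $4,100 -> ~$29,500.
corvette_then = 4100
corvette_now = 29500
print(round(corvette_now / corvette_then, 2))  # 7.2, i.e. roughly 7.2x since 1965
```

Both multipliers are in the right ballpark for US consumer-price inflation over those spans, which is all the argument needs.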


what happened with FCPX

Highly unlikely. That is the quite-old waterfall model that typically leads to massive specs and failed projects. Far more likely they were actually following modern software development practices like

http://en.wikipedia.org/wiki/Extreme_programming

http://en.wikipedia.org/wiki/Agile_software_development

The objective there is *not* to come out with a "Big Bang" release. Just like how iOS and OS X are now on yearly updates. Same thing. There likely never was an intention for the release to match feature-for-feature 100%. That would have been an extremely goofy plan for a ground-up rewrite.

The evidence that leaked out over time suggests they threw out the non-software folks who insisted that "Big Bang" was the only possible way and/or that legacy hardware features were the highest priority. They then proceeded to finish a ground-up rewrite project.

Sorry, heard it all before... it's just the old "Evolution vs Revolution" routine, merely with new buzzwords and the same old debates about how to perform a transition. The question is whether "as cheaply as possible" gets higher priority than doing it right, including graceful legacy-support planning. Given how well Apple has handled this sort of transition before, FCPX was a clear disaster for them.

The problem is that rapid incremental spirals only work for major rewrites if you don't promptly burn the old app to the ground upon the release of your first spiral... and even then, you still have to rapidly deliver your additional spirals to build up the capability to supplant the legacy application in order to properly sunset it.

If there's a publicly published road map that details which features are expected to be restored and when, then you have reasonable confidence that there's a commitment... without it, you have the FCPX situation.

What they didn't do is fork the development team into two groups: maintainers of the old and developers of the new.

Which happens all the time... it's frequently dismissed as 'not possible' due to some vague resource constraint, and additionally challenged as 'unnecessary' because the new team will supposedly be so highly successful and spiral like mad that the users of the legacy app will all jump immediately, thereby making the legacy support investment 'unjustified'.




corporations are not immune to procrastination. The reality is that OS X belongs to Apple. So the big picture time-span window of when folks need to move to stay aligned is driven by Apple because it belongs to them.

Let's be more cynical: corporations are not immune from being cheapskates, particularly when some middle-high level VP can score a big bonus for himself next year by slashing staff this year ... and if the subsequent train wreck that occurs some quarters later can be blamed on someone else ("failing to embrace & execute the vision") instead of his own decisions, that's icing on the cake.

How this applies here is that Apple can 'big picture' all they want, but it is still up to the individual business units (particularly 3rd party) to decide if, when and how to invest in said 'vision'...and unless there's some really clear business case that shows a huge advantage to being an early mover, the tendency will be to procrastinate and defer the expense.


Right, so making the lower-priced entries in the product portfolio deliver more performance is a very good thing. Again, the Mac Pro is largely aligning with the rest of the Mac lineup in going OpenCL-capable top to bottom. The Mac Pro has a product-differentiation edge in that it can go higher up the performance curve, but the point is you have to be on the curve somewhere to be a Mac at all in 2013 and going forward.

Sure, except that we've been aggressively preached to about how the Mac Pro is a minuscule percentage of total Macintosh sales ... as such, having the last 2% of product sales finally 'align' with the rest of the product line-up smacks of a "tail wagging the dog" rationalization.


"The practical implication is that the only companies who will risk the additional investment are those who have meaningful competition with other software vendors, so as to have a sales upside to offset the higher internal costs."


No. Software companies need to follow where the customers are going.

Unfortunately, 80%-90% of the personal computer industry illustrates otherwise: very few of the WinTel hardware clone makers make any significant investment in advancing the future direction -- they compete as a simple commodity market, largely based on the software that Microsoft hands over to them in the form of Windows, IE and Office.

If a software company primarily targets only upper-end Mac Pro users, and now those users are buying lower-end Macs, not certifying and/or optimizing on those "new" platforms will eventually mean lost business.

Only if you choose to interpret the statement in a Reductio ad absurdum fashion. As we've discussed, the set of products with performance suitable for upper-tier work has grown, which means that the target customer demographic is no longer just the top end (dual-CPU-only) Mac Pro, but now also encompasses all of the Mac Pros, as well as the 27" iMac configuration, a few similarly solid laptops, etc ... it simply excludes the bottom tier hardware because this customer demographic won't buy that low.

A lot of niche software companies get into a death spiral where they lose volume and pass the costs on to their remaining customers ... which gets them fewer customers, and so on.

Correct, but not applicable per the above. In any event, the same software company runs the risk of losing customers if they cater to the lowest common denominator and in doing so, end up with a non-compelling product. Sure, they'll have something that they can sell to the masses if they price it cheap enough, but the lack of performance features means that the more demanding customer demographic will drop them and go upscale.

With software developers who have multiple platform applications .... long term it really doesn't. The question is more when to make the transition not if.

Actually, multiple platform support raises its own can of worms -- to what degree to drive toward a "lowest common denominator" of shared code so as to minimize in-house development and support costs. Once again, so long as the vendor isn't being pressured by their customers (plateau factor) or competition (loss of sales), there's no strong motivation with which to justify incurring the expenses of tackling the issue anytime soon.


-hh
 
With software developers who have multiple platform applications .... long term it really doesn't. The question is more when to make the transition not if.

OpenCL and CUDA can be syntactically different, but they pretty much do the same thing and work in the same way. The continued questions about whether OpenCL can perform as well are pretty silly. It's like comparing the performance of cars whose only difference is paint color.

There could be negligence in OpenCL drivers, but that's not a fundamental fault of the technology, and is still better than the zero acceleration you get with CUDA on AMD hardware.
 
It isn't actually "want" as much as "need"/"requirement" (there is an identified group of people, sufficiently large, that has a common root-cause problem they need solved).

The Henry Ford quote:
"If I had asked people what they wanted, they would have said faster horses."


OK, let me be more specific: the average end user doesn't know what they need. You've actually supported my point of view here.

What people are whining about here is how the new Mac Pro isn't just a faster version of the previous one (a faster horse).

Yes, there will be teething pain as people adapt both their workflow and the software to the new platform. And yes, Apple are quite pushy with regards to this. For example, I'm quite sure they deliberately used non-CUDA compatible AMD GPUs in the new Pro, specifically to force the hand of lazy developers who are still using CUDA years after OpenCL has been pushed out the door as the way forwards.

Apple don't want their processing intensive apps tied to one vendor. Being able to switch vendors at the drop of a hat has saved the company more than once: Motorola 68k to PPC, PPC to Intel. And I have no doubt they have OS X running on AMD (and ARM, and likely MIPS and others) hardware internally, both as a bargaining chip with Intel and as a contingency plan in case something happens with Intel.

Amiga and Atari (the ST/Falcon/TT) were pretty much killed with the death of the 68k platform - their hardware design (custom chips) and software was too tightly tied to that architecture.

In the long run, OpenCL is the way forward. Apple is just giving developers who have been slow to get on board a bit of a push. Those who adapt will sell software. Those who don't will fall by the wayside.

If NVIDIA were to go under or fail to provide a competitive product, what then? I'll tell you what: CUDA is a dead duck. The way Intel are going at the moment, this may well be a reality we face in the next 5 years (i.e., the lifetime of machines sold today). Like with CPUs, Apple want to be GPU agnostic.


Ditto for PCIe slots, internal storage bays, etc.

Many people think they "need" them, but they really don't. They are currently using them, but there is more than one way to accomplish these things.

My only real problem with the spec of the machine is the single Gig-E port (it should be 10 Gig-E), but in the real world very few people have 10 Gig at their desk, and it can be added via Thunderbolt anyhow. Or maybe you'd rather have Fibre Channel instead? In which case the built-in port would have been a waste of money, space and PCIe lanes.

Internal storage? If you're planning on putting, say, 16TB of storage in the machine, how do you plan on backing it up? Far better to run large amounts of storage on a proper dedicated storage machine (via Fibre Channel, NFS, etc.) locked away in a controlled environment with redundant power, automated backup, and a real file system with data integrity and snapshots (e.g., ZFS). It doesn't belong in a workstation, which lives in a far less controlled environment.

Apple currently have no product in that space, and that's fine. But trying to turn a Mac Pro into a data warehouse when the file system is a bit of a joke (HFS+) is not really productive. Yes, people have been doing it in the past. That doesn't mean it's really sane, though.

Just because you have a hammer, it doesn't mean everything is a nail... use the right tool for the job!
 