I have no idea what you actually spec'd out, as it's impossible to get absolute parity. As you increase the clock frequency, the price gaps diminish rapidly, and the price can even go over the MP's, according to the web pricing.

The only place I couldn't really get parity was the graphics card, so I went with a FireGL that was equivalent to the 5770. I went with the equivalent to the base 8 core Mac Pro.

It's designed for 2x graphics cards (16x lanes) and has the necessary power cables (designed for 225W max per).

The T5500 can handle 2x cards as well, but has a lower TDP limit of 150W per card.

It still doesn't have more cables than the Mac Pro. I've had to do the whole molex adaptor mess in a T7500 before. Although I suppose you could claim that at least the T7500 has molexes you can reroute, but it's not great...

You can cut yourself on the internals of any system, including the MP (done it). Nor do I find them nearly as difficult as the basic business machines either (consumer CPUs, as Xeons aren't needed, but without the crapware or extra toys that come with typical consumer-oriented systems).

Never cut myself on a Mac Pro. My Mac Pro is great to open up and replace parts in; I have trouble getting a T7500's side door back on, that's how bad it is...

With the T7600 it seems that Dell finally redid the case, but I haven't gotten to play with one.

As per internal layouts, they vary from model to model (don't see the T7500 as too bad; the T5500, OTOH, is more difficult due to the smaller space). I've seen some really good cable routing in some of Dell's systems, and a horrid mess in others (seriously, as if it was the difference between a "hung-over Monday" build and a sober mid-week "Wednesday" build). The later HPs were actually fairly clean inside.

Haven't played with the T5500, but by Mac standards, the T7500 has a really awful internal layout. Just totally awful. Even replacing the hard drives is a PITA because they're mounted vertically from what I remember.

Not heard of complaints on this before (Ethernet controller = Broadcom 5761, Audio = Analog Devices IIRC).

The audio I managed to fix by changing some settings in the BIOS. The ethernet was always a problem. Disabling and re-enabling the link seemed to fix it (or if that didn't work, a restart), so it's possible it's a software thing.

Yet Photoshop can only use 2x. Other applications have limits too (listed in other threads here on MR), such as some H.264 encoding done on one core.

If your H.264 encoder is only using one core, you need a new H.264 encoder. Free H.264 encoders can even use multiple cores.

Some software can do true n core multi-threading. But not all software out there now can even do multi-threading of any kind, and what can tends to be a fixed core implementation.

I'm aware. Multicore research was a focus of my CS degree.

I'm not sure I would say the above is correct, actually. OS X forces a lot of multithreading. The reason Adobe has such trouble is that they're basically using an abstraction layer to run their Windows version on the Mac, so they're avoiding a lot of Mac APIs.

It's perfect for students, or pros that run Photoshop all day. To claim it's there to fill a hole is foolish, as they won't produce what won't sell.

Pretty sure that machine is called the 27" iMac.

Apple hasn't made a prosumer tower really since the beige G3. Which was about the time the first iMac came out.

If you take a closer look at recent system purchase information, quite a few members here are indicating they went with Quad core or SP Hex core systems (several stated they would have gone with more had their software supported multi-threading on the max number of cores in the system).

Not disputing that some people here have bought them... But you're the one saying it's a bad deal. I don't really care, as I said, it seems kind of like a "well, we might as well make a low end single CPU system" sort of thing.

The biggest group I see of people buying the single-CPU Mac Pro is people who play games. It's extremely rare to find creative apps that are not really multicore. The only one I can think of these days is Photoshop, which doesn't even use the GPU, which begs the question of why a Photoshop user should be buying a Mac Pro in the first place...

It seems to me that you're discounting the software lag. Core counts are outpacing the software development rate.

Well, given that multicore software (and CUDA/OpenCL) is what I write... Some stuff, like Final Cut, is lagging just because it's been slow to be updated. But again, the only major creative app I can think of that's not multithreaded is Photoshop (and Illustrator and InDesign if you want to nitpick)....

Generally, if your software can thread for 6 cores, it'll thread for 8, and it'll thread for 12. Once an application goes multicore, generally the concept of "core count outpacing software" goes away because we've already rewritten our algorithms to scale to x number of cores. Every so often someone will optimize their code a bit by locking in a max number of cores, but that's usually because they don't want to run the math at runtime to decide how many cores to target, and they just need to go in and update the number. Usually the stuff I write is more adaptive and will scale to a new number of cores without needing a recompile.
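To make that concrete, here's a rough sketch of what I mean by scaling at runtime (not from any shipping app - the array-doubling work is just a stand-in): ask the OS how many cores are online, then split the work across that many threads. The same binary runs 4-wide on a quad and 12-wide on a dual hex, no recompile.

[CODE]
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define N 1000000

typedef struct { float *data; size_t start, end; } chunk_t;

/* Worker: process one contiguous slice of the array. */
static void *process_chunk(void *arg) {
    chunk_t *c = (chunk_t *)arg;
    for (size_t i = c->start; i < c->end; i++)
        c->data[i] *= 2.0f;
    return NULL;
}

int main(void) {
    /* Ask the OS how many cores are online instead of baking in a number. */
    long ncores = sysconf(_SC_NPROCESSORS_ONLN);
    if (ncores < 1) ncores = 1;

    float *data = malloc(N * sizeof(float));
    for (size_t i = 0; i < N; i++) data[i] = (float)i;

    pthread_t *threads = malloc(ncores * sizeof(pthread_t));
    chunk_t *chunks = malloc(ncores * sizeof(chunk_t));
    size_t per = N / ncores;

    for (long t = 0; t < ncores; t++) {
        chunks[t].data  = data;
        chunks[t].start = t * per;
        chunks[t].end   = (t == ncores - 1) ? N : (t + 1) * per;
        pthread_create(&threads[t], NULL, process_chunk, &chunks[t]);
    }
    for (long t = 0; t < ncores; t++)
        pthread_join(threads[t], NULL);

    printf("Processed %d elements on %ld cores\n", N, ncores);
    free(threads); free(chunks); free(data);
    return 0;
}
[/CODE]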

(Note: this is the problem Grand Central is built to solve. Grand Central doesn't necessarily make multicore easier to program, but it is basically a bunch of code that handles deciding how many threads to create for you.)
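For the curious, a minimal libdispatch sketch of that idea (the process_frame() function is made up for the example; build with clang, which supports blocks): you hand GCD the number of iterations, and it decides how many threads to fan out across on that particular machine.

[CODE]
#include <dispatch/dispatch.h>
#include <stdio.h>

#define FRAME_COUNT 240

/* Hypothetical per-item work; stands in for whatever the app actually does. */
static void process_frame(size_t i) {
    (void)i;  /* ... real work here ... */
}

int main(void) {
    /* A concurrent global queue; GCD picks the thread count for the hardware. */
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* dispatch_apply blocks until all iterations are done, running them in
       parallel across however many threads GCD thinks this machine can use. */
    dispatch_apply(FRAME_COUNT, q, ^(size_t i) {
        process_frame(i);
    });

    printf("Done with %d frames\n", FRAME_COUNT);
    return 0;
}
[/CODE]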

As we move to OpenCL, the whole software/number of cores thing is going to go away even further, because a GPU is basically a 500 core CPU. Software is going to have to thread extremely well anyway to use something like OpenCL (and targeting a 500 core CPU is not nearly as hard as it sounds.)
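And to show why I say targeting a GPU isn't as scary as it sounds, here's a bare-bones OpenCL host sketch (error checking stripped, sizes made up) - the kernel itself is three lines, and the runtime handles spreading the work over however many ALUs the card has.

[CODE]
/* On OS X: #include <OpenCL/opencl.h>; elsewhere: <CL/cl.h> */
#include <OpenCL/opencl.h>
#include <stdio.h>

#define COUNT 4096

/* The "program" the GPU runs: one work-item per array element. */
static const char *kSource =
    "__kernel void scale(__global float *a) {  \n"
    "    size_t i = get_global_id(0);          \n"
    "    a[i] *= 2.0f;                         \n"
    "}                                         \n";

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;
    float data[COUNT];
    for (int i = 0; i < COUNT; i++) data[i] = (float)i;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    /* Build the kernel from source at runtime. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "scale", &err);

    /* Copy the data to the device, run one work-item per element, read back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, &err);
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);

    size_t global = COUNT;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[1] = %f\n", data[1]);  /* expect 2.0 */

    clReleaseMemObject(buf); clReleaseKernel(kernel);
    clReleaseProgram(prog); clReleaseCommandQueue(queue); clReleaseContext(ctx);
    return 0;
}
[/CODE]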
 
If you're a professional editor, you're likely to have several expansion cards to connect to things: storage, video monitors, 100s of other random peripherals that lie about your edit suite (seriously Apple, less than 10 USB ports isn't enough). And if you're a musician in a studio, you'll have a capture card, and if you use ProTools, several internal cards being connected out for a full setup. I can't see LightPeak ever being able to catch up to the simplicity of having it all in one box. I didn't say they were the only people who needed it; I was just saying people who do home movies aren't likely to have the same amount of gear, same for the band in a garage - unlikely they'll have a full ProTools setup, so if that's where you're coming from you wouldn't necessarily see the need for the Pro.

Apple won't can it for one other reason - no editor likes being told what display they "have" to use. So either Apple produces a 24-core Mini (likely to be impossible in 2014), or they keep going with the Pro (likely). There are enough people who will buy them at the prices the Pro goes for that Apple may as well keep it, as it's making them money. My point was: if you're a professional, you're not going to want to run your programs on something without internal expansion; it doesn't matter how powerful the iMac gets, we still need expansion cards for stuff.

Well, in the case of Logic Studio, I understand where you're coming from and the significance of internal expansion, but I still think you're greatly narrowing your view of what a "professional" is.

I'll tell you now, you do not need anything more than a simple USB MIDI keyboard to do the types of music found in simple, yet amazingly successful games on iOS. Examples being titles such as Angry Birds, Cut the Rope, Plants vs. Zombies, etc. I personally enjoy the music of these titles and I find it effective and well done, but you can get this music done on Logic Express running off a Mac Mini. Seriously.

Yet these are "professionals" that get their paycheck just like the rest of these developers in the industry.

My point is not everyone has to be a Hollywood film composer working on a multi-million dollar budget or a platinum-record-winning producer. If I recall, John Powell, film composer for the Bourne trilogy films, runs three Mac Pros linked together. So don't think I'm trying to undercut the Mac Pro here at all, as it obviously has been a standard and essential for big budget, high production material, but it's not some prerequisite just to be working in the industry. Smaller projects simply don't need gargantuan workstations with PCIe expansion cards. Yet you can still work and get paid. So sorry, I simply disagree with your point. You do not need internal expansion just to be working in the industry and to be considered "professional". It's definitely essential for higher paid gigs, but not required for smaller dev houses.
 
The only place I couldn't really get parity was the graphics card, so I went with a FireGL that was equivalent to the 5770. I went with the equivalent to the base 8 core Mac Pro.
GPU was the hardest IMO as well.

Where it can get messy is if a price comparison uses the base system as much as possible (3rd party upgrades for memory, HDD, ..., whenever possible) vs. getting the same HDD capacity, memory capacity, ... between the systems using the system vendor as the source (this has merit when a single point of contact for support is needed, but that's not for everyone, particularly the independents who tend to have tighter budgets and need to use 3rd party sources for upgrades).

It still doesn't have more cables than the Mac Pro. I've had to do the whole molex adapter mess in a T7500 before. Although I suppose you could claim that at least the T7500 has molexes you can reroute, but it's not great...
It's about the ease and not needing to go get special adapters (i.e. graphics cables in the MP aren't standard, and there's only a couple of sources since there's really no other place to tap into properly).

Never cut myself on a Mac Pro. My Mac Pro is great to open up and replace parts in; I have trouble getting a T7500's side door back on, that's how bad it is...

With the T7600 it seems that Dell finally redid the case, but I haven't gotten to play with one.
PC vendors change cases like underwear - just about with each new model. Some are great, others not so much. Another area that varies quite a bit in the internals is the cable routing (more important IMO, as that's usually the source of the frustration). That is, open up say a pair of the same model systems, and one will be nice and clean (and easy to find what you need), while the other unit looks like a rat's nest inside. It depends on who did the final assembly.

Haven't played with the T5500, but by Mac standards, the T7500 has a really awful internal layout. Just totally awful. Even replacing the hard drives is a PITA because they're mounted vertically from what I remember.
I'm not a big fan of vertically mounted drives either, but the implementations vary (i.e. no space, and the quick-release mounting tabs are too cheap).

This is why I've always had a "soft spot" for Sun and Silicon Graphics machines, as they tend to put the time into internal layouts. DEC did as well, but they're gone now... :(

The audio I managed to fix by changing some settings in the BIOS. The Ethernet was always a problem. Disabling and re-enabling the link seemed to fix it (or if that didn't work, a restart), so it's possible it's a software thing.
Definitely sounds like software rather than hardware. If you were having other issues (firmware), then I'd go ahead and change out the CMOS battery.

If your H.264 encoder is only using one core, you need a new H.264 encoder. Free H.264 encoders can even use multiple cores.
In this particular case, there are options. But there may not be for a particular application type (either there are no other choices, or the ones that do exist suffer the same problem - no multi-threaded support, or if it does exist, it's highly restricted, such as what exists with Photoshop).

I'm not sure I would say the above is correct, actually. OS X forces a lot of multi-threading. The reason Adobe has such trouble is that they're basically using an abstraction layer to run their Windows version on the Mac, so they're avoiding a lot of Mac APIs.
From the information I've seen, it doesn't actually force it, as that would cause problems from hell (think of forcing a word processor to multi-thread... I can see the stalled cores now... ouch). :p

But rather it provides the APIs that allow developers to implement multi-threading a little more easily. This assumes a lot, as either their applications were built on older tools that aren't compatible = re-write needed (i.e. custom-written tools), they haven't dedicated sufficient resources to get it done, or the application is of a type that multi-threading won't benefit at all.

Pretty sure that machine is called the 27" iMac.

Apple hasn't made a prosumer tower really since the beige G3. Which was about the time the first iMac came out.
That's if they can use an iMac, as they may actually need the HDD and PCIe expansion, a better monitor, or multiple displays that aren't possible on the iMac (I'm getting the impression the class assignments are expected to be more advanced and polished than what was acceptable in the past for BS/BA level = need more of a system for some students, such as creative degrees).

Not disputing that some people here have bought them... But you're the one saying it's a bad deal. I don't really care, as I said, it seems kind of like a "well, we might as well make a low end single CPU system" sort of thing.
They have their place (not all workstations need to be high-end; there are entry and mid level units in existence from other vendors as well).

My issue with that unit from Apple is the price compared to the competition (proper systems = workstations using the same CPU P/N, not consumer grade gear).

The biggest group I see of people buying the single-CPU Mac Pro is people who play games. It's extremely rare to find creative apps that are not really multi-core. The only one I can think of these days is Photoshop, which doesn't even use the GPU, which begs the question of why a Photoshop user should be buying a Mac Pro in the first place...
From what I see, the professionals that use those systems are those that stick to Photoshop for 2D graphics work (no animation). H.264 encoding is a bit slow, but it's not where the majority of time is spent from what I've been told by such users (hence they make a compromise, as the additional funds aren't really justifiable for say ~10% or less of their workload).

That's not to say gamers don't buy them as well, but I'm sticking to actual professionals that post here on MR, not our hobbyist/enthusiast members.

Well, given that multi-core software (and CUDA/OpenCL) is what I write... Some stuff, like Final Cut, is lagging just because it's been slow to be updated. But again, the only major creative app I can think of that's not multi-threaded is Photoshop (and Illustrator and InDesign if you want to nitpick)....
IIRC there's others that get listed here (sub parts of application suites).

But software development always follows the hardware. Professional software is just slower about it, as the software is much more complicated, and the developer tends not to want to spend the funds to get development out faster (tend to release incremental upgrades with pieces and parts of their suites upgraded, not the entire package).

Generally, if your software can thread for 6 cores, it'll thread for 8, and it'll thread for 12. Once an application goes multi-core, generally the concept of "core count out-pacing software" goes away because we've already rewritten our algorithms to scale to x number of cores. Every so often someone will optimize their code a bit by locking in a max number of cores, but that's usually because they don't want to run the math at runtime to decide how many cores to target, and they just need to go in and update the number. Usually the stuff I write is more adaptive and will scale to a new number of cores without needing a recompile.
Not true from my POV. I keep noticing things that are fixed to 4x cores per CPU (typically; Photoshop is an oddball at only 2x) more often than true n core multi-threading. So a Hex will run 4x cores total, and a Dodeca 8x total.

Sucks, but there's nothing the user can do.
 
My sincere hope is that Apple continues to improve the Mac Pro, but my fear is that they will decide to offer something along the lines of a "Pro Mini" that is much smaller and offers much less options for expansion and upgrades.

I admit that I don't need a Mac Pro but I would kill for a Pro mini. I keep holding back on replacing my computer because I don't want to settle for the iMac or the mini.
 
From the information I've seen, it doesn't actually force it, as that would cause problems from hell (think of forcing a word processor to multi-thread... I can see the stalled cores now... ouch). :p

Plenty of reasonable ways to multithread a word processor, actually. But the more basic issue is that a word processor can't even consume one whole CPU.

Anything with a GUI could make decent use of multithreading (and this is the most common way Cocoa will force multithreading.) But again, you might not be doing enough work to consume more than one core anyway.
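A made-up example of the shape I mean: push the slow bit (say, a spell-check pass) onto a background queue, then hop back to the main queue to touch the screen. The function names here are hypothetical stand-ins; the GCD calls are the real ones.

[CODE]
#include <dispatch/dispatch.h>
#include <stdio.h>
#include <stdlib.h>

/* Stand-ins for real word-processor code (hypothetical). */
static void spell_check_document(void) {
    /* Pretend this is slow, CPU-bound work. */
    printf("spell checking off the main thread...\n");
}

static void update_squiggly_underlines(void) {
    /* UI updates belong on the main thread. */
    printf("updating the UI on the main thread\n");
}

int main(void) {
    /* Hand the slow work to a concurrent background queue... */
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        spell_check_document();

        /* ...then bounce back to the main queue (the GUI run loop in a real
           app; dispatch_main() here) before touching anything on screen. */
        dispatch_async(dispatch_get_main_queue(), ^{
            update_squiggly_underlines();
            exit(0);  /* demo only: a real app keeps running */
        });
    });

    dispatch_main();  /* in a Cocoa app, the main run loop plays this role */
    return 0;
}
[/CODE]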

But rather it provides the APIs that allow developers to implement multi-threading a little more easily. This assumes a lot, as either their applications were built on older tools that aren't compatible = re-write needed (i.e. custom-written tools), they haven't dedicated sufficient resources to get it done, or the application is of a type that multi-threading won't benefit at all.

OS X does force multithreading, especially if you are using Cocoa. I once wrote an app that had no explicit threading, and yet when I ran it, it was running with 16 threads.

Multithreading also doesn't require a rewrite. I'm also not sure what these tools are you're referring to. Pthreads was ratified in... 1995? Mac OS X has supported threads since launch, OS 9/Carbon supported threads.... Again, I'm not sure what you mean by tools, but if you have a compiler that doesn't support multithreading, you're probably working on a 386 running DOS.

That's if they can use an iMac, as they may actually need the HDD and PCIe expansion, a better monitor, or multiple displays that aren't possible on the iMac (I'm getting the impression the class assignments are expected to be more advanced and polished than what was acceptable in the past for BS/BA level = need more of a system for some students, such as creative degrees).

You can do HD and a second display on an iMac (not that the built in display isn't really nice.) Why would a student need PCIe expansion for Photoshop?

My issue with that unit from Apple is the price compared to the competition (proper systems = workstations using the same CPU P/N, not consumer grade gear).

Apple simply doesn't care to compete in all markets. They've made this widely known over the years... The prosumer tower market is one place they've repeatedly made clear they're not competing. It's been this way for many years.

From what I see, the professionals that use those systems are those that stick to Photoshop for 2D graphics work (no animation). H.264 encoding is a bit slow, but it's not where the majority of time is spent from what I've been told by such users (hence they make a compromise, as the additional funds aren't really justifiable for say ~10% or less of their workload).

H.264 encoding, however, is a critical part of a workflow. I may need to do multiple encodes to make sure my video quality is good over an entire film. If it takes me X hours to do that (during which I can't really do much work), I'd pay good money to do that in X/2. And honestly, if I'm cutting film, I'm probably getting paid enough to where spending an extra few thousand on that horsepower is not going to bother me.

Most likely, if I'm in pro video editing, my software costs more than the top end Mac Pro anyway.

But software development always follows the hardware. Professional software is just slower about it, as the software is much more complicated, and the developer tends not to want to spend the funds to get development out faster (tend to release incremental upgrades with pieces and parts of their suites upgraded, not the entire package).

Which is true, except in this case, where threading came before consumer machines. True, we didn't get an OS that supported multiprocessing until 2001, but threading is not some new thing. Developers have been doing it for a long time. In this case, the hardware had to catch up to the software.


Not true from my POV. I keep noticing things that are fixed to 4x cores per CPU (typically; Photoshop is an oddball at only 2x) more often than true n core multi-threading. So a Hex will run 4x cores total, and a Dodeca 8x total.

Not sure where you are seeing this. After Effects and Premiere can both use 8 cores, as an example.

There is absolutely no reason things should be fixed to 4 cores unless a developer put in a hard limit. All the software I write scales on the fly, and I have been able to very easily use all 8 cores.

(Heck, I noticed even Starcraft 2 will use 8 cores.)
 
Apple simply doesn't care to compete in all markets. They've made this widely known over the years... The prosumer tower market is one place they've repeatedly made clear they're not competing. It's been this way for many years.

Yep. My last Mac Pro was a dual G5 2.0GHz.

I'm mid level and my Adobe software (including After Effects and Premiere Pro) and Cinema 4D run just fine on Windows, and I enjoy more power (better video cards, and I can overclock if I want to) for probably 40% of the cost.

At this point I find the Adobe workflow FAR SUPERIOR to Final Cut Pro.

The single CPU Mac Pros are just a horrible value.
 
After following this conversation for several days now I still don't get why Haswell would end the Mac Pro line. Even if it is a more expensive part, it won't be that much more expensive than current Xeons. I understand the wafer scale and all, and that there will be more bells and whistles on chip, but I believe that this is just the general direction of high end systems for the coming years. A solution on a chip for the high end. Apple will still continue to sell the Mac Pro, and people will continue to buy it. Especially so if Apple's market share continues to grow.

Personally, my plan is to buy a new Mac Pro with each Oregon rotation. I will likely skip Sandy Bridge/Ivy Bridge, and buy a Haswell machine as soon as they hit the market. I have been looking forward to Haswell more than any other microarchitecture for several years now. FMA could be big (particularly Intel's streamlined implementation). A new on package vector coprocessor assembly will be nice too. If the machine has 75% of the planned features I will likely end up buying a Haswell machine in 2013, and a Rockwell machine right behind it in 2014 (I have plenty of uses for more than one Pro).

I do believe you guys (nanofrog, goMac) have done some fine analysis here. I guess it just comes down to hoping nanofrog's technical/financial objections to a Haswell Mac Pro are overcome before the processors leave the gate at Intel.

One more note, and this one is a very long term outlook. It is entirely possible that we will see Apple (and many PC manufacturers) dump x86 for the ARM platform as it advances over the coming years. If that were the case then the Mac Pro would be an (likely much cheaper to manufacture) ARM based workstation, making all of our hopes and fears in this particular thread moot. :eek:
 
One more note, and this one is a very long term outlook. It is entirely possible that we will see Apple (and many PC manufacturers) dump x86 for the ARM platform as it advances over the coming years. If that were the case then the Mac Pro would be an (likely much cheaper to manufacture) ARM based workstation, making all of our hopes and fears in this particular thread moot. :eek:

I highly doubt this, ARM has technical limitations that make it a great power sipping chip at the cost of performance.

That said, ARM could eventually be retrofitted, but then it would probably suck just as much power as Intel's chip line. :)

There isn't really anything inherently better about the ARM design besides that it has fewer features than a modern Nehalem, to keep power usage low. But a modern Nehalem based chip would be faster clock for clock than any modern ARM chip. But of course what you give up with Nehalem is there is no way at present to put that in a (good) tablet or cell phone.
 
One more note, and this one is a very long term outlook. It is entirely possible that we will see Apple (and many PC manufacturers) dump x86 for the ARM platform as it advances over the coming years. If that were the case then the Mac Pro would be an (likely much cheaper to manufacture) ARM based workstation, making all of our hopes and fears in this particular thread moot. :eek:

I think we are already there. I don't really see the ARM killing the workstation anytime soon, but it can definitely kill off the laptop and low end desktop market.

x86 and the whole desktop/laptop full OS is about to get marginalized.

Apple has been very ahead of the curve here and they have to get mad props for that. First the iPhone and then the iPad and soon a ton of crappy tablets until SOMEONE steps up on the Android or WebOS side of things and offers some competition.

I got through college on a 286 computer with 1MB of ram. Pretty sure a lot of college kids today could get by with an iPad and a keyboard for longer typing projects.

Our sales teams hardly ever use laptops anymore.

Most people just don't need this much power.

So in some ways Apple has positioned the Mac Pro well. Solid. Few updates. Low Maintenance. Low Marketing priority.

I still think they were wrong about the i7 based $1500 desktop, but that's just me.
 
I highly doubt this, ARM has technical limitations that make it a great power sipping chip at the cost of performance.

That said, ARM could eventually be retrofitted, but then it would probably suck just as much power as Intel's chip line. :)

There isn't really anything inherently better about the ARM design besides that it has fewer features than a modern Nehalem, to keep power usage low. But a modern Nehalem based chip would be faster clock for clock than any modern ARM chip. But of course what you give up with Nehalem is there is no way at present to put that in a (good) tablet or cell phone.

Indeed, I agree with everything you say. I was just throwing a nightmare scenario out there. My only point was that if Apple moved to a completely consumer gadget oriented product lineup in the future... ie iPhone, iPad, ARM based MacBook Air, ARM based iMac... then the only job the "Pro" machine would really have is development for that ARM gadget line. In that case I don't believe Apple would maintain an x86 Mac Pro or the like just for high end users. In other words they would in effect kill off their present Pro user base.

Do I really believe something like this could happen? Who knows. I certainly hope not, as I love having a little "supercomputer" at home. I greatly look forward to the Haswell Mac Pro and beyond.
 
I think we are already there. I don't really see the ARM killing the workstation anytime soon, but it can definitely kill off the laptop and low end desktop market.

x86 and the whole desktop/laptop full OS is about to get marginalized.

Apple has been very ahead of the curve here and they have to get mad props for that. First the iPhone and then the iPad and soon a ton of crappy tablets until SOMEONE steps up on the Android or WebOS side of things and offers some competition.

I got through college on a 286 computer with 1MB of ram. Pretty sure a lot of college kids today could get by with an iPad and a keyboard for longer typing projects.

Our sales teams hardly ever use laptops anymore.

Most people just don't need this much power.

So in some ways Apple has positioned the Mac Pro well. Solid. Few updates. Low Maintenance. Low Marketing priority.

I still think they were wrong about the i7 based $1500 desktop, but that's just me.

First, I can definitely see the equal of an 11.6" MacBook Air with iOS on ARM. It could show up with any major OS revision. That would be the beginning of the end for the x86 Mac line.

Second, if the above were to happen, as I said in my previous post, I cannot see Apple keeping an x86 Mac Pro around anymore than they would keep a Powermac G5 around today. But again, I am not saying I see this as the clear path to the future, nor do I want it to be.
 
I highly doubt this, ARM has technical limitations that make it a great power sipping chip at the cost of performance.

I think the #1 feature of the iPad was the real 10 hour battery life. That is why the x86 laptop will die sooner than later. I could use it all day and it would still have 30-50% of the battery left.

Do I really believe something like this could happen? Who knows. I certainly hope not, as I love having a little "supercomputer" at home. I greatly look forward to the Haswell Mac Pro and beyond.

I think we will have access to little supercomputers, but we will just pay more. Kind of how the Mac Pro is now. But since we will use them for professional money making ventures, it will just be the cost of doing business.

What we may lose out on is all the cool things kids and accidentals learn and figure out on their own as parents stop buying real computers and shift to tablets and devices.

I was a liberal arts major. I found a Mac IIfx ($5000 in 1992 and 40MHz for those keeping score) in an often empty room at school that I would sneak into frequently, and I taught myself how to use it and eked out a computer-based career.
 
I think we will have access to little supercomputers, but we will just pay more. Kind of how the Mac Pro is now. But since we will use them for professional money making ventures, it will just be the cost of doing business.

What we may lose out on is all the cool things kids and accidentals learn and figure out on their own as parents stop buying real computers and shift to tablets and devices.

You are so right. If we lose the x86 Mac Pro then we will once again have to turn to Sun and the like and we will once again be buying machines that cost as much as new cars. People who complain about the price of the Mac Pro have no idea what the alternative is, cost wise. I remember all too well when a nicely loaded consumer PC was $4000+ and a similar Mac was significantly more.

As far as the kids go, that has sadly already happened to a huge degree. My first machine was an Atari 800. As I got older I got my hands on all sorts of nice hardware including a couple of SPARCstations. I learned Basic > Assembly > Fortran. Kids today take "computer science" and learn all about extremely technical stuff like "spread sheets" and "power point presentations." It's a dark time, for certain.
 
Plenty of reasonable ways to multi thread a word processor, actually. But the more basic issue is that a word processor can't even consume one whole CPU.

Anything with a GUI could make decent use of multi-threading (and this is the most common way Cocoa will force multi-threading.) But again, you might not be doing enough work to consume more than one core anyway.
The devil's in the details. I can seriously see recycled code (a development process using shortcuts) causing issues with releasing resources, as it won't work with the APIs properly. And even knowing this, can the team developing the APIs actually close all the doors for illegal operations that cause such issues? It just doesn't seem realistically possible due to things like time constraints and all the potential ways of writing code. I figured you'd see that this is what I was getting at with that example.

But as you mention, its core usage is minimal, and not necessary at all = it shouldn't even be attempted.

OS X does force multi-threading, especially if you are using Cocoa. I once wrote an app that had no explicit threading, and yet when I ran it, it was running with 16 threads.
If you don't mind, can you point me to some proof as to how this actually works?

I can't see it as truly forced from what I recall (currently under the impression that what you're talking about here has to do with out-of-order processes running simultaneously, such as running the GUI elements on separate cores from the actual data processing), not processes that are based on recursive algorithms.

Multi-threading also doesn't require a rewrite. I'm also not sure what these tools are you're referring to. Pthreads was ratified in... 1995? Mac OS X has supported threads since launch, OS 9/Carbon supported threads.... Again, I'm not sure what you mean by tools, but if you have a compiler that doesn't support multi-threading, you're probably working on a 386 running DOS.
Tools that were developed in-house, as there wasn't anything commercially available at the time those tools were needed, and they are still in use. This, as I understand it, is part of what's going on with Adobe for example.

Now as those tools are likely based on previous tools, there's a good chance they don't conform (or don't fully conform) to the open specifications you're referring to.

You can do HD and a second display on an iMac (not that the built in display isn't really nice.) Why would a student need PCIe expansion for Photoshop?
Yes, the iMac can use a second display. I was talking about more displays than that, or sizes the GPU can't handle sufficiently (i.e. trying to run say 3x or more 30" monitors).

As per Photoshop needing PCIe, it's more to do with drives. The MP, like any system, is limited in how many disks you can place internally. The iMac is worse, as it can only take up to 2x for some models.

Now it's possible they can use USB or FW for expansion (will allow them access to additional capacity), but it can't increase speed (particularly relevant to the iMac). For the MP, they can use RAID (software or hardware based) to increase their speed as well as capacity.

Now I wouldn't have thought that this was necessary for students, but the information I'm getting indicates otherwise (I do assume their information is accurate, not exaggerated).

Apple simply doesn't care to compete in all markets. They've made this widely known over the years... The prosumer tower market is one place they've repeatedly made clear they're not competing. It's been this way for many years.
But in the case of the SP MP, they do choose to compete, as that product exists.

H.264 encoding, however, is a critical part of a workflow. I may need to do multiple encodes to make sure my video quality is good over an entire film. If it takes me X hours to do that (during which I can't really do much work), I'd pay good money to do that in X/2. And honestly, if I'm cutting film, I'm probably getting paid enough to where spending an extra few thousand on that horsepower is not going to bother me.

Most likely, if I'm in pro video editing, my software costs more than the top end Mac Pro anyway.
As I said, it all comes down to specific usage. If yours is that high, it has more relevance as to which machine will better serve your needs.

But for others, this may not be the case (H.264 encoding doesn't exact as much of their time for whatever reason).

This is why when the RAID questions come up, I have to try and get the smallest details out of the user to figure out how to best go about getting them a usable solution, not one that ends up useless, and a waste of funds. Those details are critical.

Which is true, except in this case, where threading came before consumer machines. True, we didn't get an OS that supported multiprocessing until 2001, but threading is not some new thing. Developers have been doing it for a long time. In this case, the hardware had to catch up to the software.
As mentioned before, I'm sticking with the enterprise market, not the consumer side (so I'm skipping the likes of the P4). Multi-threading started there, not for consumers.

So it's still a Top Down/Trickle Down approach.

Not sure where you are seeing this. After Effects and Premiere can both use 8 cores, as an example.
Logic is still limited to 4x cores per CPU, as IIRC, those with SP Hex and DP Dodeca systems noticed that they were only using 4x cores per CPU (4x total on the SP Hex, 8x total on the Dodeca).

There is absolutely no reason things should be fixed to 4 cores unless a developer put in a hard limit. All the software I write scales on the fly, and I have been able to very easily use all 8 cores.
Exactly my point. The code that set this limit was developed years ago (i.e. single, maybe dual core CPUs were out at the time, and more cores per CPU weren't expected any time soon). So they took a short-cut and fixed the core limit, with the expectation that when more cores finally came, they'd fix it with a re-write at that time.

And what's being seen are the vendors that have been dragging their feet at producing the necessary re-write. Again, this is where cost reasons are the most likely culprit, as both the necessary tech and personnel that know how to do it already exist.
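Something like the following is the kind of short-cut I'm picturing (a simplified guess, not actual vendor code): the worker count gets baked in as a constant instead of being read off the machine at runtime, so the extra cores sit idle until someone re-writes that bit.

[CODE]
#include <stdio.h>
#include <unistd.h>

/* The old short-cut: a limit baked in when 4 cores per CPU was the ceiling. */
#define MAX_WORKER_THREADS 4

int worker_threads_old(void) {
    return MAX_WORKER_THREADS;      /* 6- and 12-core machines leave cores idle */
}

/* The re-write everyone is waiting on: ask the OS at runtime instead. */
int worker_threads_new(void) {
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    return (n > 0) ? (int)n : 1;    /* scales to whatever the box actually has */
}

int main(void) {
    printf("old limit: %d, new limit: %d\n",
           worker_threads_old(), worker_threads_new());
    return 0;
}
[/CODE]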

After following this conversation for several days now I still don't get why Haswell would end the Mac Pro line. Even if it is a more expensive part, it won't be that much more expensive than current Xeons. I understand the wafer scale and all, and that there will be more bells and whistles on chip, but I believe that this is just the general direction of high end systems for the coming years. A solution on a chip for the high end. Apple will still continue to sell the Mac Pro, and people will continue to buy it. Especially so if Apple's market share continues to grow.
It's not so much Haswell, but Apple's margins applied to a system based on those parts.

As the price keeps going up, the sales volume will shrink if it's too high. This has already happened with the Nehalem based machines, as the enthusiasts are falling by the wayside, and at least a fair few of the independent pros here on MR are indicating they're nearing their breaking point/questioning the viability of the OS X platform at higher financial costs. As the pool of users that will buy the MP shrinks (they can't or refuse to pay the MSRP at that time), there are fewer system sales to divide the R&D expenses over. Even the component costs go up, as Economy of Scale is going backwards. So the per system costs will increase (direct and indirect costs applied, but pre profit = total system cost to Apple). In order for Apple to retain their desired margins, prices will have to increase again.

So at some point, the price will exceed what pros can actually afford to pay (classic Supply and Demand economics), and will hit a crux point for Apple where they will determine the MP is EOL, and kill it.

Now it's possible Apple could reduce their margin on the MP to keep it a viable selling product line (I've mentioned this before, but I'm not sure if it's been noticed). But given the recent trends with Apple (consumer goods taking off and becoming a significant portion of their total profit, the enterprise market shrinking for Apple, and their desire for high margins), I don't see this happening. :(


One more note, and this one is a very long term outlook. It is entirely possible that we will see Apple (and many PC manufacturers) dump x86 for the ARM platform as it advances over the coming years. If that were the case then the Mac Pro would be an (likely much cheaper to manufacture) ARM based workstation, making all of our hopes and fears in this particular thread moot. :eek:
Not going to happen.

ARM was never intended for that, and can't do it as it currently exists (it's designed to be a low power consumption embedded processor, particularly for portable consumer devices). If they were to adapt it for workstation use, it would become another competitor to Intel and AMD (far larger, more complex, and using as much power).

This is why their products are filling the processing role for things like netbooks and the iPad (Apple's A4 = built around an ARM Cortex-A8). These types of products are what it was actually designed for.

It's theoretically possible they could try to create enterprise grade parts, but it's not their specialty, and they'd have to contend with Intel. The ARM processor filled a hole, which is why they've been successful with it.

x86 and the whole desktop/laptop full OS is about to get marginalized.
Depends on where you're talking about, geographically speaking. Developing nations, particularly in Asia, are where the desktop parts are selling like mad (they're after cheap systems). As their incomes continue to increase, they'll want more portability and move to laptops and netbooks (they need something a smartphone can't fill, particularly where screen real estate isn't sufficient).

Chip makers such as Intel are well aware of this, and that's where they're aiming their marketing strategies.

For developed countries, they're aware that the consumer market is changing. It's also why they're focused on the enterprise market here (i.e. shift to cloud computing/clusters).

When this will truly happen is when ISP speeds are sufficient everywhere and cheap enough for the vast majority of consumer users to utilize such implementations. Software vendors are happy with this shift, as they can make more money with the pricing models that will accompany it (i.e. usage based, rather than only making money when a new version is released). Intel's fine with it (AMD too, whether they like it or not :p), as they'll be selling the CPUs that the clouds/clusters will be built from.

My only point was that if Apple moved to a completely consumer gadget oriented product lineup in the future... ie iPhone, iPad, ARM based MacBook Air, ARM based iMac... then the only job the "Pro" machine would really have is development for that ARM gadget line. In that case I don't believe Apple would maintain an x86 Mac Pro or the like just for high end users. In other words they would in effect kill off their present Pro user base.
This is my point.

The difference is, I see the sales volume dropping to the point it's actually a realistic outcome as things are now. :eek: ;)

If we lose the x86 Mac Pro then we will once again have to turn to Sun and the like and we will once again be buying machines that cost as much as new cars. People who complain about the price of the Mac Pro have no idea what the alternative is, cost wise. I remember all too well when a nicely loaded consumer PC was $4000+ and a similar Mac was significantly more.
Intel and other CPU makers are aware of these issues, and are taking pains to keep the value high (cost/performance will be better than with the Unix-based workstations and servers of the past).

The trick, will be what system vendors decide on for acceptable margins, which ultimately will be reflected in the MSRP.

But there will be some price increases. How much is the real question...

As far as the kids go, that has sadly already happened to a huge degree. My first machine was an Atari 800. As I got older I got my hands on all sorts of nice hardware including a couple of SPARCstations. I learned Basic > Assembly > Fortran. Kids today take "computer science" and learn all about extremely technical stuff like "spread sheets" and "power point presentations." It's a dark time, for certain.
Even the engineering side is affected, as there's no real-world experience while in college, and the mentorship system that existed in the past has devolved (intended to be an additional 4 years of on-the-job training, it's in the process of disappearing entirely IMO) due to the senior guys being forced to retire or laid off because they're deemed "too expensive". So there's no one there to teach them what they don't learn while getting their degree. What they do get is inconsistent and has gaps (those they learn from teach what they can, but they can't teach what they don't know, or they're no longer there = bits and pieces from multiple people - it's messy).

As a result, design and product quality suffers. Parts selection and testing is one notable area from what I've seen (they build off of simulation software, but don't test out their parts for real-world behavior, on the presumption that the simulation data is comparable with the real world, which it typically isn't).
 
Just to throw this in here as you guys were talking about the prices of MP and Dell 7500.

The significant difference of more than $1000 does not only apply to the SP Pro. If you spec out the 7500 with 2.93GHz hex cores you'll end up with exactly $4000. Apple charges $6200 for a similar machine!

But then again, one of them runs Windows, the other OS X, so they are not really comparable. Hefty premium for the OS, though.


End of interruption, keep on talking! :D
 
Kids today take "computer science" and learn all about extremely technical stuff like "spread sheets" and "power point presentations." It's a dark time, for certain.

They definitely don't, at least not in the UK. That's called ICT/IT. At 16-18, you have A-Level Computing, which consists of CPU Architecture, the Fetch/Execute Cycle, Independent Programming Projects, Memory Subsystems, Binary, De Morgan's Laws, and 2's Complement. Then at Degree Level it continues - generally more Programming, Systems Architecture (the insides in more detail), some Math and some Software Engineering in Year 1, before developing through to more advanced things in Years 2 and 4, with an optional Year 3 consisting of a Year in Industry to apply and learn the industry stuff. (I'm currently at Uni, doing a Computer Science degree.)
 
The significant difference of more than $1000 does not only apply to the SP Pro. If you spec out the 7500 with 2.93GHz hex cores you'll end up with exactly $4000. Apple charges $6200 for a similar machine!

Are you sure about that? The price to upgrade to those processors on Dell's site is $4,110 alone.

"Dual Six Core Intel® Xeon® Processor X5670, 2.93GHz,12M L3, 6.4GT/s, turbo [add $4,110.00]"

I'm assuming you're doing DP. If you are, make sure you are selecting the dual processor base machine on Dell's site. They don't let you upgrade to DP on any of the SP processor base machines.

Speccing that out gives me a total of $6,958 for:

Genuine Windows® 7 Professional, No Media, 64-bit, English
Dell Precision T7500 Workstation
Dual Six Core Intel® Xeon® Processor X5670, 2.93GHz,12M L3, 6.4GT/s, turbo
Microsoft® Office Starter 2010
3 Year Basic Limited Warranty and 3 Year NBD On-Site Service
Precision T7500 Power Supply
6GB, DDR3 RDIMM Memory, 1333MHz, ECC (6 DIMMS)
512MB ATI FirePro™ V5700, DUAL MON, 2 DP & 1 DVI
C1 All SATA or SSD drives, Non-RAID, 1 drive total configuration
Trend Micro Worry-Free Business Security Services, 30-days
Integrated LSI 1068e SAS/SATA 3.0Gb/s controller
1TB SATA 3.0Gb/s, 7200 RPM Hard Drive with 32MB DataBurst Cache™
16X DVD+/-RW w/ Cyberlink PowerDVD™/Roxio Creator™, No Media
No Monitor
No Floppy Drive and No Media Card Reader
Recovery Media for Genuine Windows® 7 Professional,64bit,Multiple Language
My Accessories
No Speaker option
Dell MS111 USB Optical Mouse
Dell QuietKey Keyboard
My Services & Warranties
Also Included
No Resource DVD
Documentation, English, with 125V Power Cord
Quick Reference Guide, English, Dell Precision T7500
Shipping Material for System

Result: Win for Apple by $700.

(I'll be back later to reply to the rest... Busy iPhone programming. :) )
 
Is the Mac Pro selling reasonably well?

I ask this because I have a 2006 Mac Pro and it's still serving my needs incredibly well. I've upgraded memory (to 8GB) and the video card (Radeon 3870, thinking of getting a 5xxx series one) since, but the core machine has stayed the original one I bought in 2006.

This thing was so fast when I got it that it has kept up well with the industry, to the point that it still feels fairly speedy today. I bet if I replace the boot disk with an SSD, it will feel like a bat out of hell. Most of my grunt-work on it is Aperture, which it runs incredibly well.

So I wonder if Mac Pros are selling, if they have such an incredibly long useful life. Only recently have I been poking my nose at the new ones; I may keep this one another year and upgrade in 2012.
 
Reports of the workstation market's death have been greatly exaggerated.

JPR reporting 31.8% YoY growth in Q3 following on a strong 32% YoY showing in Q2.
That report is primarily to do with what happened to workstation sales volume from the Great Recession through the Q3 2010 reports. A slump occurred when the recession hit, as companies were worried about the economic outlook and curtailed their buying (from loss of work and laid-off employees, ..., to just scrutinizing their upgrade cycles to see how far they could stretch their existing equipment in terms of usable lifespan). Sales volumes are finally picking up now as corporations, and even SMBs to a lesser extent, loosen their purse strings. But IIRC, it's not yet up to pre-recession sales volumes.

What's been happening with the graphics subsection is more relevant IMO, as 2D is giving way to 3D cards in what's being purchased by professionals (from a general POV). This aspect isn't unexpected, as it's been predicted for some time (when was the hard part, as software vendors aren't always able to deliver new features in accordance with their published timelines). But we've finally reached that point (hardware, drivers, and application support have reached the point where 3D features are available to enough professional users that the scales have tipped).
 
They definitely don't, at least not in the UK. That's called ICT/IT. At 16-18, you have A-Level Computing, which consists of CPU Architecture, the Fetch/Execute Cycle, Independent Programming Projects, Memory Subsystems, Binary, De Morgan's Laws, and 2's Complement. Then at Degree Level it continues - generally more Programming, Systems Architecture (the insides in more detail), some Math and some Software Engineering in Year 1, before developing through to more advanced things in Years 2 and 4, with an optional Year 3 consisting of a Year in Industry to apply and learn the industry stuff. (I'm currently at Uni, doing a Computer Science degree.)

The "computer science" I was describing is what is taught at local high schools near me.

I went to school for Math and Physics when I was 18-22, then went back to school a decade later for Electrical Engineering and Computer Science.
 
Are you sure about that? The price to upgrade to those processors on Dell's site is $4,110 alone.

"Dual Six Core Intel® Xeon® Processor X5670, 2.93GHz,12M L3, 6.4GT/s, turbo [add $4,110.00]"

I'm assuming you're doing DP. If you are, make sure you are selecting the dual processor base machine on Dell's site. They don't let you upgrade to DP on any of the SP processor base machines.

DOH!!!! Me so stupid! I assumed that the T7500 is always a DP machine and hadn't seen that you can select that separately. The Dell site definitely is too complicated for me. Well, it was a long day... :eek:
 
The "computer science" I was describing is what is taught at local high schools near me.

I went to school for Math and Physics when I was 18-22, then went back to school a decade later for Electrical Engineering and Computer Science.

You can rely on high schools to mess it up. Thank goodness the UK curriculum is developed by people who at least know what Computer Science is. (I also believe the UK educational system is better than the US one, mainly as we cover more material faster - hence the 3 year standard (excluding time in Industry, which can be done over the Summer) for getting a BSc/BA versus 4 years in the States. I considered going Stateside for my Uni, but looking at the courses, I'd already covered Year 1 and parts of Year 2 by the time I finished my Sixth Form (16-18 Education).)
 
You can rely on high schools to mess it up. Thank goodness the UK curriculum is developed by people who at least know what Computer Science is. (I also believe the UK educational system is better than the US one, mainly as we cover more material faster - hence the 3 year standard (excluding time in Industry, which can be done over the Summer) for getting a BSc/BA versus 4 years in the States. I considered going Stateside for my Uni, but looking at the courses, I'd already covered Year 1 and parts of Year 2 by the time I finished my Sixth Form (16-18 Education).)

I agree that the United States public school system (Kindergarten through grade twelve) is one of the worst in the developed world. That said, I also believe that the United States higher education system (University and most especially graduate school) is second to none. I was in the UK last summer and the schools I visited there were atrocious.
 
I agree that the United States public school system (Kindergarten through grade twelve) is one of the worst in the developed world. That said, I also believe that the United States higher education system (University and most especially graduate school) is second to none. I was in the UK last summer and the schools I visited there were atrocious.

Then you must've visited the rubbish end of universities in the UK. I'm pretty sure that for the amount I'm paying (about $6000/yr), I'm getting an education that I could only get in the US for an awful lot more, at somewhere like MIT. This is the maximum they can charge right now, and even when it increases in 2 years it's still going to be a LOT cheaper than college in the US (and better too - 3 years over 4, so you get more time in industry at the same age).
 