Quantitative for me!! My SSD died after three months. It had my BIOS on it, which I have copies of, so it was easily replaced. At this point, I do not trust them enough to store media from current and ongoing projects, no way.

I consider it the professional sector, because we are people who rely on our equipment to earn a living. I don't care how new the latest and greatest is. An improvement in speed and storage is USELESS if it is unreliable.

Apple may retain 1-2 3.5" bays in a revised Mac Pro (more likely if they drop both 5.25" bays). However, a 4 x 3.5" setup would ignore several significant factors that already have deep traction.




$/GB perhaps. $/GB/random-IO-Mbps, not really. Drop 'speed' and that is closer to being an accurate assessment. The question, though, is whether one of the Mac Pro's primary duties is to be a bulk storage server.

For example, on Seagate's site:

" ... Switch to 2.5” drives for cost-effective performance

Improve your performance by 15% by switching to Savvio 15K. Additionally, these drives deliver up to 10% faster random reads and up to 8% faster random writes vs. legacy 3.5-inch 15K drives. ... "
http://www.seagate.com/internal-hard-drives/enterprise-hard-drives/3-5/cheetah-15k/
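
To make that metric concrete, here is a rough Python sketch with invented prices and random-I/O figures (not Seagate's numbers) showing how two drives can be nearly even on raw $/GB while the 2.5" drive comes out ahead once random-I/O throughput is factored in:

# All figures below are hypothetical placeholders, purely to illustrate the metric.
drives = {
    # name:            (price_usd, capacity_gb, random_io_mbps)
    '3.5in 15K HDD':   (400.0, 600.0, 1.50),
    '2.5in 15K HDD':   (410.0, 600.0, 1.70),
}

for name, (price, cap_gb, rand_mbps) in drives.items():
    per_gb = price / cap_gb                 # raw cost per gigabyte
    per_gb_per_mbps = per_gb / rand_mbps    # cost per GB, per unit of random-I/O speed
    print(f"{name}: ${per_gb:.3f}/GB, ${per_gb_per_mbps:.3f}/GB per random-I/O Mbps")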





There is little quantitative evidence that SSDs are significantly less reliable in general. There are bad SSD models, but there are bad HDD models too. The expectation management with SSDs has been bad: because of "Solid State" in the name, folks expected them to last longer. That hasn't really proven true. Flash wears out, and controllers/firmware can be problematic in both HDDs and SSDs.




15Krpm 3.5" HDDs introduced over last 12-18 months ? [There are older ones around but since SSDs have gained deeper traction, power efficiency is higher priority, and HDD platter density has gone up the trend is toward 2.5". ]

You are cherry-picking the 3TB capacity threshold. In about 12-18 months there will be 3TB 2.5" drives.

If you go back to the era when the current Mac Pro's basic design parameters were formed, there weren't folks stuffing 3TB of data inside a Mac Pro. [E.g., in 2006, 4 x 500GB would get you about 2TB total.] So "pros" needing to deal with 3+ TB have been dealing with external storage for a long while. That doesn't disappear now. There is always going to be a subset of the Mac Pro population that needs to use storage outside the box.

3-4 2TB 2.5" drives ends could provie 6-8TB of usable space. It isn't like internal capacity would not be making progress over time.





So they shouldn't skate toward where the puck is going. They should skate to where the puck has been?



The "legacy" sector is probably more accurate adjective than the "professional" sector. Lots of professionals who handle large data sets have been moving to 2.5" drives to handle near-line storage for several years already.
 
Pretty much everything I predict for the Mac Pro I am basing on visible trends in IT (I don't know if you can even still buy a server with 3.5" drive slots) or Apple design aesthetics.
I am positive they will go for a smaller box. Not because any of my users or staff are worried about how big the box under their desk is. Indeed, who was complaining that the 2011 iMac was too thick?
But just last month, if you watched the event where they released the iPad mini and the iMac, they could not stop gushing over how thin the new iMac is.
I seriously expect a fusion of the Mac mini and iPhone 5 aesthetic vocabulary. I am thinking a dark slate grey/black rectangular slab. Probably big enough to accommodate a couple of cards, as well as a minimum of 3 drives.
I expect a Fusion Drive configuration with two HDDs in RAID 1 and one SSD.
Unfortunately, I think they will deep-six FireWire and count on the pros to get Thunderbolt-to-FireWire (1394) adapters for legacy gear.
I'm certain they will ditch the optical drive and expect us to get the USB SuperDrive.
I expect some kind of clever design to hide away most of the buttons and ports. The emphasis seems to be on a clean profile at the expense of ergonomics. Hence the absurd placement of the headphone, USB and SD card ports on the BACK of the Mac mini and iMac. Even after they both lost their optical drives!
I also expect the line-in and TOSLINK connectors to disappear.
As far as the GPU as a processing host: that is a model with a lot of potential, but it requires folks like Avid, Adobe and Maxon to code for such a scenario. Maxon is already halfway there.
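
For a sense of what "coding for such a scenario" involves, here is a minimal, purely illustrative Python sketch using Numba's CUDA support (assuming an NVIDIA GPU and the numba/numpy packages; this is not code from Avid, Adobe or Maxon) that offloads a trivial per-pixel operation to the GPU:

import numpy as np
from numba import cuda

@cuda.jit
def brighten(pixels, gain):
    # one GPU thread per pixel; clamp to 1.0 so values stay in range
    i = cuda.grid(1)
    if i < pixels.size:
        v = pixels[i] * gain
        pixels[i] = v if v < 1.0 else 1.0

frame = np.random.rand(1920 * 1080).astype(np.float32)   # a fake HD frame, flattened
d_frame = cuda.to_device(frame)                           # copy to GPU memory
threads_per_block = 256
blocks = (frame.size + threads_per_block - 1) // threads_per_block
brighten[blocks, threads_per_block](d_frame, np.float32(1.2))   # launch the kernel
result = d_frame.copy_to_host()                           # copy back when done

The point is simply that the data movement and kernel launches have to be written explicitly, which is why each vendor has to add its own GPU paths to its renderer.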

I am hoping they come out with a new keyboard as well. Fingers crossed it has black, backlit keys like the MBP, but with a dark grey aluminum body, as on the iPhone 5/black iPad mini.
 
I'd like Apple to go in the direction of leveraging GPU capability. It would be awesome. We seem to agree that if they did this with the Mac Pro, the price would likely drift into astronomical territory. The problem is also likely coupled with uncertainty regarding long-term support from Apple, given how this line has been treated.

Exactly.

... May I ask how GPU rendering holds up with heavier scenes or a lot of textures, or if you've tried any workflow aside from Cycles? ...

The short answer is I'm just getting started with the Blender and CUDA tech. Cycles is all I've tried with CUDA because Blender is a free app. The 3D apps I'm trained to use are Cinema 4D and Maya, and I haven't even configured my CS upgrade yet. If you look at this site: http://www.nvidia.com/object/gpu-applications.html , you'll see that for the other Media and Entertainment apps [grouped into categories such as (1) ANIMATION, MODELING AND RENDERING, (2) COLOR CORRECTION AND GRAIN MANAGEMENT, (3) COMPOSITING, FINISHING AND EFFECTS, (4) EDITING, (5) ENCODING AND DIGITAL DISTRIBUTION, (6) ON-AIR GRAPHICS and (7) ON-SET, REVIEW AND STEREO TOOLS], "real time" is mostly used for categories 6 and 7; "TBD" (to be determined) for category 5; "TBD" and 7x for category 4; "TBD" and single-digit multiples for category 3, except 50x for Eyeon Fusion and 27x for Adobe After Effects CS6 (significantly, multi-GPU capable); mostly "real time" for category 2; and mostly "TBD" and a few "10x" for category 1. Significant to me is that in category 1 the most often cited feature is "Increased model complexity, larger scenes," and where I do see "CUDA-based GPU Rendering" it applies only to Otoy Octane Render, Chaos V-Ray RT, CentiLeo GPU Render, Cebas finalRender, and Autodesk 3ds Max + NVIDIA iray - the vast majority of which aren't even OS X apps. I've tried to stay with apps that are multiplatform, but even that may have to change.

Here's the doc if you want more details: http://www.nvidia.com/docs/IO/123576/nv-applications-catalog-lowres.pdf . I looked at a similar PDF about 4 or 5 months ago and it had nowhere near as many apps. So CUDA support seems to be really accelerating now. Also significantly, the majority of the entries in my area of interest indicate current support for only one GPU, but I hope that changes.
 

This is really interesting stuff. The problem I see is that, from an IT perspective, I'd rather build a farm of cheap Supermicro 1RU servers to render our C4D projects. They are competitive with most high-end workstation GPU cards on price, and I don't have to worry about overtaxing a Mac Pro's power or thermal envelope. In fact, we do this now. We have the art guys on maxed-out Mac Pros, and they send C4D projects to the render farm in a server rack downstairs, managed from a pretty ugly web interface.
I am surprised to see that broadcast graphics get only secondary mention after all the clinical scientific stuff. I mean, sure, there is a lot of money in R&D, but sportscasting is pretty flush with money too!
Shame C4D only supports single-GPU offloading too.
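
For what it's worth, the kind of back-of-the-envelope comparison behind that choice looks roughly like this (all figures are invented placeholders, not benchmarks of any real Supermicro box or GPU card):

# Hypothetical cost per unit of render throughput; "1.0" is an arbitrary baseline.
def dollars_per_throughput(price_usd, relative_throughput):
    return price_usd / relative_throughput

cpu_node = dollars_per_throughput(2500.0, 1.0)   # cheap 1RU render node as baseline
gpu_card = dollars_per_throughput(3500.0, 1.3)   # assumed high-end workstation GPU

print(f"1RU CPU render node: ${cpu_node:.0f} per unit of throughput")
print(f"Workstation GPU card: ${gpu_card:.0f} per unit of throughput")

Swap in your own prices and measured render times and the comparison can tip either way; the rack of cheap nodes also scales without touching the workstations.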
 
Quantitative for me!! My SSD died after three months.

That isn't quantitative. There is approximately zero statistical relevance in a sample size of one. That's just qualitative hand waving with merely the pretense of being quantitative.
 

Well, I've been in business for 20 years. Time and time again, brand-new tech has shown a tendency to be less reliable. I take it on very cautiously and wait a couple of years to jump in with both feet. Maybe this is not quantitative, and I can't prove it scientifically, but this is the way my colleagues and I work. I think it's called the boots-on-the-ground method.
 

At the broadcast facility I work in, we have several dozen machines on SSDs.
The only failure so far was with one server, but I think the whole box was running hot.
As far as reliability goes, rotational hard drives really don't have a leg to stand on. Every year there are recalled batches now. I have a whole shoebox of dead magnetic drives waiting for our next e-waste pickup.
Magnetic hard drives fail pretty regularly.
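
As an aside on the "sample size of one" point above, here is a small sketch (hypothetical counts, requires SciPy) of how wide the uncertainty on a failure rate is with one machine versus a few dozen:

# Exact (Clopper-Pearson) binomial confidence interval for a failure rate.
from scipy.stats import beta

def clopper_pearson(failures, n, conf=0.95):
    alpha = 1 - conf
    lo = 0.0 if failures == 0 else beta.ppf(alpha / 2, failures, n - failures + 1)
    hi = 1.0 if failures == n else beta.ppf(1 - alpha / 2, failures + 1, n - failures)
    return lo, hi

# 1 dead SSD out of 1, versus 1 failure out of (say) 36 machines
for failures, n in [(1, 1), (1, 36)]:
    lo, hi = clopper_pearson(failures, n)
    print(f"{failures}/{n} failed -> 95% interval for the failure rate: {lo:.1%} to {hi:.1%}")

With a single drive the interval runs from a few percent all the way to 100%, i.e. it tells you almost nothing; with a few dozen machines it narrows considerably.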
 
I was beginning to consider getting a new maxed-out 27" iMac in December (the specs are quite nice for $3500), but given that it's non-upgradeable, combined with the news of the 7xxx graphics card drivers in 10.8.3, I've decided I will continue waiting for the new Mac Pro.

It looks like they're going to come through. :)
 
...., I've decided I will continue waiting for the new Mac Pro.

If it weren't so sad, this could be quite funny. I'll join you, though, in said waiting game. Hopes are up, and one just has to love those posters complaining about Mac Pro front-page rumours instead of iTunes news.
 
I was excited... last year. I ended up moving to PC. I know it's not an option for everyone, but for me it made sense.
 
If that's the same article I read over at OCN, it says LGA is going to stay on high-end processors. That wouldn't affect the MP at all.
 
Most desktops will go that way, but there is still a large enough need for socketed CPUs that can be upgraded. By 2020 there could be no more such computers, but I don't think the sockets will disappear for at least 5 years.

Even if the Pro goes to soldered CPUs, I just hope they continue with dual processors. I wish the iMacs had dual processors on the highest model! Single-CPU computers are such dogs.
 
In the long-forgotten past, CPUs had already been soldered on in various desktop computers (e.g., certain Amiga models, 386(SX) PCs, etc.).

Nearly all components that are soldered onto the motherboard today (or are even included in highly integrated ICs) were on dedicated expansion cards or devices in the past.

No one questions the decision to have those components onboard and integrated nowadays. Why is it suddenly the end of the world if yet another component may come soldered on in the future?

The fact is that soldered components increase reliability and lower cost (not to mention overcoming contact issues that grow with age and higher frequencies).

Most people don't upgrade the CPU separately anymore, except for the few DIY people who love to configure their "ideal" system or the even fewer Mac Pro die-hards who try to get a little more life out of their aging systems while Apple drags its feet.

It simply makes no sense when you have to change the motherboard as well with each new CPU generation. Granted, owning a Macintosh makes it harder to simply swap the motherboard for a run-of-the-mill PC replacement. But with Apple going more and more toward highly integrated systems, this is probably not an alternative anyway.

Computers are about to become a commodity that is replaced completely, like it or not. At least good build quality helps protect the environment, as the replaced machine can live longer as a second- or third-hand machine instead of going straight to the dumpster.

Surely the desktop segment will (continue to) suffer in the future, not because yet another component will come soldered onto the motherboard, but because the whole computer industry is undergoing a massive transition towards smaller and more mobile devices.

And Apple has its share in pushing this transition with its iPhone and iPad products.
 
If it weren't so sad, this could be quite funny. I'll join you, though, in said waiting game. Hopes are up, and one just has to love those posters complaining about Mac Pro front-page rumours instead of iTunes news.

I think that if we really get a totally new Mac Pro in 2013, completely redesigned from the ground up, then Apple will most likely return the Mac Pro to its "regularly scheduled programming" of sorts, making routine and regular investments in upgrades and enhancements just like all the other systems.

In such case, the wait (for me since 2006) will have been worth it.

Or 2013 could come and go and we could all be SOL.

Besides, Tim Cook seems like a Mac Pro guy to me.
 

The issue IMHO isn't so much soldering to the board as the limitation of choice. Which board OEMs will get to carry which CPUs? Who makes that call? Who determines price? Will there be K chips? Now we move into coolers: will there be a mounting standard? Will it be integrated? If so, how much heat will it dissipate? This really makes sense for the AIO crowd but not the performance/high-end desktop crowd.
 
I'm very, very tempted by the new maxed-out 27" iMac. All I'm really asking for in a tower is something of that power, where the hard drive(s) are accessible, the video card is upgradeable, etc.

I'd be happy with the specs of the new iMac. I just need it to be accessible so I can keep it for 5 to 7 years without having to take it into an Apple store.
 
The issue IMHO isn't so much soldering to the board as the limitation of choice.
Who says there has to be a limitation of choice?

Which board OEMs will get to carry which CPUs? Who makes that call?
In the end, the user does. If demand is sufficient, you will see a variety of configurations: high-end chips on high-end boards, low-end chips on low-end boards, and a wild mixture in between.

Granted, maybe your favorite board maker would not offer the combination of your favorite board with your favorite CPU, but unless you are the only one asking for a specific combination, others will.

Who determines price?
Same as today: the market. And that actually means user demand.

Will there be K chips?
If a sufficient number of customers demand it and eventually put their money where their mouth is, why not?

Now we move into coolers: will there be a mounting standard? Will it be integrated? If so, how much heat will it dissipate?
Industry will surely make sure that a standard is in place to keep costs down. Even if it is integrated, where is the problem? Integrated/factory-mounted coolers for other components (like the north/south bridge or memory modules) have been standard for many years, and no one complained.

Why is it suddenly a problem if a CPU comes with an (integrated) factory cooling solution?

This really makes sense for the AIO crowd but not the performance/high-end desktop crowd.
I really can't understand the hype. The small and shrinking "performance/high-end desktop crowd" will continue to find ways to tinker with the system if required. Perhaps it won't even be required if the factory cooling solutions are good enough.

It might even be better than a DIY combination, as the engineers at the component makers should normally know their components' requirements best and can offer a perfectly suited solution. Not to mention that a factory-fitted cooling solution could use approaches that are not possible when human fingers have to assemble individual components.

Think not K chips, but K boards, including a perfectly tuned cooling solution.
 
The Mac Pro was my guess to be made in the USA. It's very low-volume and high-profit. I have ordered two Mac Pros over the years, and neither was shipped from overseas. Some of them were probably assembled in the USA to begin with.
 

So we wait for a new factory to be built and tooled, and then they finally start the new MPs? :eek:
 