In the end, all Macs without screens will be Apple TV-sized boxes.

And all Apple devices with screens will be handhelds.

At the moment, Moore's law runs much faster than our computer use can keep up with.

Actually, I think that Moore's law is falling behind - application requirements (and user expectations) are growing faster than computer resources. On the storage and networking side, file sizes and bandwidth usage are growing faster than disks and networks. I'm doing final pre-production testing on a system set up with four 10 GbE NICs teamed into a 40 GbE link. I also have a system that can sustain 2400 MB/sec of disk traffic - and the disk is still the bottleneck. (Don't mention Apple's PCIe blades - the datastore is 48 TB.)
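
Just to put rough numbers on that (a quick Python sketch; the ~5% protocol-overhead factor is my own assumption, the throughput figures are the ones from my setup above):

Code:
# Is the 40 GbE link or the disk the bottleneck?
GBE_LINKS = 4        # four 10 GbE NICs teamed
LINK_GBPS = 10       # gigabits per second per NIC
OVERHEAD = 0.95      # assumed ~5% lost to Ethernet/IP/TCP framing

network_mb_s = GBE_LINKS * LINK_GBPS * 1e9 * OVERHEAD / 8 / 1e6
disk_mb_s = 2400     # sustained disk throughput quoted above

print("Network ceiling: ~%.0f MB/sec" % network_mb_s)   # ~4750 MB/sec
print("Disk sustained:  %d MB/sec" % disk_mb_s)
print("Bottleneck:", "disk" if disk_mb_s < network_mb_s else "network")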
 
In the end, all Macs without screens will be Apple TV-sized boxes.
At the moment, Moore's law runs much faster than our computer use can keep up with. The next 16nm Rockwell processors will bring today's 12-core performance into an Ivy Bridge-like quad-core TDP. With the move to 11nm you will have so much power in so little space that main board sizes will move towards Raspberry Pi dimensions.

What do you mean? There are no suggestions that Broadwell/Rockwell will perform anything like that. If you mean the E/EP versions, those are dropping quad CPUs with the next revision. They are more likely to continue beefing up integrated graphics and absorbing more discrete components.
 
Actually, I think that Moore's law is falling behind - application requirements (and user expectations) are growing faster than computer resources. On the storage and networking side, file sizes and bandwidth usage are growing faster than disks and networks. I'm doing final pre-production testing on a system set up with four 10 GbE NICs teamed into a 40 GbE link. I also have a system that can sustain 2400 MB/sec of disk traffic - and the disk is still the bottleneck. (Don't mention Apple's PCIe blades - the datastore is 48 TB.)

No. User expectations and requirements have been quite far behind Moore's law for a long time now. People are upgrading less and less because a 5-year-old computer can pretty much do everything an average user needs today, which certainly wasn't the case 5 years ago.
 
No. User expectations and requirements have been quite far behind Moore's law for a long time now. People are upgrading less and less because a 5-year-old computer can pretty much do everything an average user needs today, which certainly wasn't the case 5 years ago.

I used to build/buy a new computer about every 2 years. Right now I'm running a 2008 Mac Pro (3,1) and, other than my video editing, I am perfectly fine. That is remarkable in my 30 years of computer-owning experience. I may or may not order a nMP when it comes out, but you're right, the upgrade velocity seems to be slowing quite a bit.
 
@OT: I don't think that, with this usage, the OP would benefit much from having multiple slow cores. I also second the modern iMac with lots of RAM.

Actually, I think that Moore's law is falling behind - application requirements (and user expectations) are growing faster than computer resources. On the storage and networking side, file sizes and bandwidth usage are growing faster than disks and networks. I'm doing final pre-production testing on a system set up with four 10 GbE NICs teamed into a 40 GbE link. I also have a system that can sustain 2400 MB/sec of disk traffic - and the disk is still the bottleneck. (Don't mention Apple's PCIe blades - the datastore is 48 TB.)

You are talking about professional data serving - and I believe blanka meant the normal computer user (both producer and consumer). It should be obvious that the Mac Pro is not the right tool for that job - it's first and foremost a content creation tool/data cruncher.
 
who is "average"?

No. User expectations and requirements have been quite far behind Moore's law for a long time now. People are upgrading less and less because a 5-year-old computer can pretty much do everything an average user needs today, which certainly wasn't the case 5 years ago.

One could say that the "average user" is satisfied with a low-end smartphone - at least by the numbers. If you put the threshold at tablets, most people would be satisfied.

On the other hand, this is a thread about workstations - and more, bigger, faster is the rule in that space.

When time is money, you don't want to wait on a computer - and the tasks are becoming much harder (4K video vs HD video vs SD video for example).
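
And 4K isn't just a little harder - here's a rough sketch of the raw data per frame (uncompressed 8-bit RGB is my own simplifying assumption, just to compare formats):

Code:
# Raw pixels per frame: SD vs HD vs 4K UHD
formats = [
    ("SD (720x480)",        720,  480),
    ("HD (1920x1080)",     1920, 1080),
    ("4K UHD (3840x2160)", 3840, 2160),
]
for name, w, h in formats:
    pixels = w * h
    mb = pixels * 3 / 1e6   # 3 bytes/pixel, uncompressed
    print("%-20s %4.1f MP  ~%5.1f MB/frame" % (name, pixels / 1e6, mb))
# 4K UHD is 4x the pixels of HD and ~24x SD - every frame, every pass.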
 
When time is money, you don't want to wait on a computer - and the tasks are becoming much harder (4K video vs HD video vs SD video for example).

That's true up to a point. If I could shave 5 minutes off a 1-hour render time (roughly an 8% productivity gain), is it worth $500, $1,000, $2,500, $5,000? There's a point of diminishing returns for any business. It's an asymptotic curve, and at some point the outlay doesn't justify the return.

Case in point… say I have a worker I pay $100,000 per year, who works 2,000 hours a year (8 hrs per day × 250 days). That means the person makes $50/hr. Now let's say I buy him a rig that shaves 5 minutes off his render time, and he renders twice daily. I've saved $8.33 each day. Let's say he does this every working day of the year (250 × $8.33), and now I've saved $2,083.

Now, let's say that in order to save the worker those 10 minutes daily, I have to go from the hex-core to the 8-core rig, upgrade to the D700s, and give him a RAM boost from 16GB to 32GB. All told, I need to spend an additional $4k (going from 6 to 8 cores is about $1k, and a pair of D700s will run around $3,000, with a few dollars thrown in for the RAM).

All of a sudden the break-even point is at about 2 years - and at that point it becomes cheaper to just buy the person a new computer rather than spend the money on the front end.

Here's another point to consider. Rather than spending the money on the rig, I could invest it and earn an 8% return. Compounded over two years (instead of spending on the upgraded rig), I've earned $665.60 on my original $4k. Still using the $8.33-per-day savings, by not buying the upgraded rig and investing the money instead, the effective break-even point moves out by another ~80 days ($665.60 / $8.33).
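
For anyone who wants to play with the numbers, here's the same math as a small Python sketch (all figures are from my example above, not real pricing):

Code:
hourly_rate = 100000.0 / 2000   # $50/hr (salary / hours per year)
minutes_saved = 10              # 5 min x 2 renders per day
daily_saving = hourly_rate * minutes_saved / 60   # $8.33/day
upgrade_cost = 4000.0           # 8-core bump + D700s + extra RAM

break_even_days = upgrade_cost / daily_saving
print("Break-even: %.0f working days (~%.1f years at 250 days/yr)"
      % (break_even_days, break_even_days / 250))   # ~480 days, ~1.9 years

# Opportunity cost: invest the $4k at 8% for two years instead
gain = upgrade_cost * 1.08 ** 2 - upgrade_cost      # $665.60
print("Investment gain: $%.2f = ~%.0f extra days of savings"
      % (gain, gain / daily_saving))                # ~80 days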

Again, I won't deny that there's money to be saved by buying a faster, beefier rig, but at some point the costs outweigh the gains.
 
I don't think the future of Pro is with Apple.

Sadly I have to agree. As with FCP7/X, Apple is targeting the semi-pro market with the black can and deserting the high end. Even with external TB2 RAID boxes, the limitations of a single-socket CPU, no matter how many cores it has, are plain to see compared to a tower/server-cased multi-processor (2- or 4-socket) LGA2011 Xeon system with a raft of PCIe slots and DIMM sockets for additional storage and graphics/compute options tailored to the use case, running Windows or Linux, or perhaps even hackintosh'd for OS X.
 
And all Apple devices with screens will be handhelds.

Actually, I think that Moore's law is falling behind - application requirements (and user expectations) are growing faster than computer resources. On the storage and networking side, file sizes and bandwidth usage are growing faster than disks and networks. I'm doing final pre-production testing on a system set up with four 10 GbE NICs teamed into a 40 GbE link. I also have a system that can sustain 2400 MB/sec of disk traffic - and the disk is still the bottleneck. (Don't mention Apple's PCIe blades - the datastore is 48 TB.)

Moore's law is getting closer to its limit than anyone realises - the current silicon process has tracks about 40 atoms wide, and if it gets down to under 20, quantum mechanics will come into play, causing mayhem. The bar finally ends with single-atom-wide carbon tracks made of graphene!
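
A quick sanity check on the "40 atoms" figure, counting silicon lattice cells of ~0.543 nm each (which seems to be what that number refers to; the node list is just illustrative):

Code:
SI_LATTICE_NM = 0.543   # silicon lattice constant, ~0.543 nm

for node_nm in (22, 16, 11):
    cells = node_nm / SI_LATTICE_NM
    print("%2d nm track is about %2.0f lattice cells across" % (node_nm, cells))
# 22 nm ~ 41, 16 nm ~ 29, 11 nm ~ 20 - right where quantum effects start to bite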
 