Movies shot on film are often scanned into the computer at 4K for editing, and sometimes even higher for FX work, and cameras like the Red One shoot natively at 4K (with higher-res cameras in the pipeline). Shooting a live-action 3D movie basically doubles the amount of footage you are working with because you are shooting with two cameras side by side (one recording for each eye).
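To put rough numbers on that, here's a back-of-envelope sketch, not exact figures; it assumes uncompressed DCI 4K frames (4096x2160), 10-bit color, and 24 fps, and real camera formats and codecs compress this heavily:

# Rough back-of-envelope numbers (assumptions: uncompressed DCI 4K frames
# of 4096x2160, 3 color channels at 10 bits each, 24 frames per second;
# actual data rates depend on the camera format and compression used).
width, height = 4096, 2160
bits_per_pixel = 3 * 10              # RGB, 10 bits per channel
fps = 24

bytes_per_frame = width * height * bits_per_pixel / 8
mb_per_second = bytes_per_frame * fps / 1e6

print(f"one eye:   ~{mb_per_second:.0f} MB/s uncompressed")
print(f"stereo 3D: ~{2 * mb_per_second:.0f} MB/s (two cameras, one per eye)")

Even compressed, stereo 4K means moving roughly twice the data of a single-camera shoot through the same editing pipeline.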
I see. 3D doesn't sound that intensive, but if 4K becomes standard I can see that being a problem.
Basically, yes. For example, it used to be commonplace for people to need hardware accelerator cards to work with DV footage or to play back DVDs on their computers, but obviously those types of devices aren't needed anymore.
Lethal
I agree. I mean, it's like someone said: the CPU won't be needed that much and it will all be done through video cards. This makes it sound like CPUs will go the way of RAM. RAM used to be highly important and got frequent upgrades (from my knowledge), but as of now? RAM rarely needs to be upgraded. Hell, even in gaming there isn't yet a game that requires 4GB of DDR2 RAM in its recommended specs, and we've just seen the first game with a quad-core processor in its recommended specs, and even that isn't necessary according to the developer (I use gaming because games have much higher baseline specs than other programs). Really, the only thing that seems to keep going up is the video card.
IMO, in the future Apple should just make a far cheaper Mac Pro-style line (not actual Mac Pros, but a separate offshoot of the line) that allows the user to just upgrade the video card and RAM.
You have a lot of questions, don't you?
Yeah, sorry, I'm just curious.
If it's annoying you guys, I promise I'll stop if you tell me to.
EDIT - Sorry to ask another question (I looked this one up on Google, I swear!), but why wouldn't Motorola make power-efficient CPUs for laptops? Why would they cut ties with such a strong force?