You were the one posting inane silliness that Apple would never ship a product that might risk repeating past failures.
The Newton/iPad and Cube/Mini make your whole argument ridiculous.
Not really... no... You still haven't shown how they were related. Apple didn't think the Newton was a mistake, and the Mini is not analogous to the Cube.
The Mini is a consumer machine that is not serviceable and is not available with high end parts.
The Cube was a high end pro machine that was user serviceable and shipped with top of the line processors and graphics cards.
Again, stop comparing the two. They're not analogous. I actually still use both a Mini and a G4 Cube almost daily. Even though the Cube is much older it's an entirely different type of machine.
They are both small boxes. So I guess based on that a Ford Fiesta is a lot like a Ford Mustang.
Whatever Apple called it, the Newton was a categorical disaster in terms of reviews and sales. The iPad has been perhaps the greatest retail success story in history.
Apple never seemed to think it was a disaster, and as Apple stated, that wasn't why it was killed. Apple said the Newton was killed because they only wanted to work on one OS at a time, not because they hated the form factor or the concept. Jobs even implied at his WWDC 97 Q&A that he appreciated the Newton and that it was actually selling decently, and said he was sad to cut it, but that they could only support one OS at that point.
His WWDC 97 Q&A session is on YouTube, btw, if you want to have a look.
You're stating stuff with no evidence.
You mention some 1GB video. What is that? I never once said anything about using centralized rendering for video editing or After Effects work. A 5- or 30-second 3D segment is another story... and lest we forget, video can be *compressed.*
You can't use compressed video for editing (lossy compression ruins the quality), and even if you did, you'd have to compress it on your end and decompress it on the other, which could take hours on its own...
(This is actually why QuickTime X couldn't be used for Final Cut Pro X. QuickTime X only speaks H.264 internally, not uncompressed video.)
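Some back-of-envelope arithmetic shows why uncompressed video is so punishing to move around. This is just an illustrative sketch, assuming 8-bit RGB at 3 bytes per pixel and 30 fps; real editing formats vary, but the order of magnitude is the point:

```python
# Back-of-envelope: uncompressed 1080p video bandwidth (illustrative assumptions:
# 8-bit 4:4:4 RGB, 3 bytes per pixel, 30 fps -- real formats differ).
width, height = 1920, 1080
bytes_per_pixel = 3
fps = 30

bytes_per_frame = width * height * bytes_per_pixel   # ~6.2 MB per frame
bytes_per_second = bytes_per_frame * fps             # ~187 MB/s
bytes_per_hour = bytes_per_second * 3600             # ~672 GB per hour of footage

print(f"{bytes_per_second / 1e6:.0f} MB/s, {bytes_per_hour / 1e9:.0f} GB/hour")
```

At roughly two-thirds of a terabyte per hour of footage, shipping uncompressed frames over a consumer connection just isn't in the cards, which is why editing pipelines keep the raw media local.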
The latest thing in game technology, BTW, is to use a computer or mobile device as a dumb terminal and let the processing/rendering be done remotely.
Sure, because the input is mouse and keyboard events, which are very small. The frame buffer coming back is larger, but typically the graphics are gimped.
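That asymmetry can be roughed out with a quick sketch. All of these numbers are assumptions for illustration (event size, polling rate, 720p30 before compression), not measurements of any real service:

```python
# Rough traffic asymmetry in cloud gaming (all numbers are illustrative assumptions).
# Upstream: input events -- assume ~20 bytes per event at a 125 Hz polling rate.
event_bytes, events_per_sec = 20, 125
upstream = event_bytes * events_per_sec          # 2,500 bytes/s

# Downstream: a raw 720p frame buffer at 30 fps, 3 bytes/pixel, before compression.
raw_downstream = 1280 * 720 * 3 * 30             # ~83 MB/s

print(f"raw downstream is ~{raw_downstream // upstream:,}x the upstream")
```

The downstream side is tens of thousands of times larger, so the video has to be compressed aggressively to fit a home connection, and that's exactly where the "gimped" graphics come from.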
I'll note that this concept has not been so hot commercially so far. Local computation devices, like the Xbox 360, far outsell remote computation consoles, like the OnLive box.
Why? Local computing power is just so cheap, and bandwidth is not.
I emphatically concur that the bandwidth ball-busting that the telcos are employing right now is death to this idea...but it's also death to what Apple and Google are doing with their OS and their apps. Thus my contention: Apple/Google networks are coming.
Google is certainly working on a network. Is Apple? I've seen absolutely zero evidence.
With local computing power becoming so cheap, why would Apple want to move computation to the cloud anyway? I mean, sure, the Mac Pro is expensive compared to other Macs, but if you're looking at the $/gigaflop rate, it's pretty darn cheap. The lowest it's ever been.
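To make the $/gigaflop point concrete, here's a toy comparison. Every number here is a made-up placeholder (not actual Mac Pro specs, pricing, or any real cloud rate); the shape of the math is what matters:

```python
# Hypothetical $/GFLOPS comparison -- all figures are made-up placeholders,
# not real Mac Pro specs or pricing.
mac_pro_price_usd = 2500.0
mac_pro_gflops = 100.0                     # assumed sustained throughput

local_cost_per_gflops = mac_pro_price_usd / mac_pro_gflops   # $25/GFLOPS, paid once

# Versus renting: assume $0.10/hour buys 10 GFLOPS of remote capacity.
cloud_hourly_per_gflops = 0.10 / 10
breakeven_hours = local_cost_per_gflops / cloud_hourly_per_gflops

print(f"local: ${local_cost_per_gflops:.0f}/GFLOPS once; "
      f"break-even after ~{breakeven_hours:,.0f} cloud-hours")
```

Under these (assumed) rates the local box pays for itself after a few thousand hours of sustained use, before you even count the bandwidth bill for shuttling data to the remote machine.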
Google is in an entirely different business, meanwhile. They don't want you to buy local CPU power; they can't even sell you that. The only thing they can sell you is the cloud. They're not doing it because it's necessarily better, but because it's their only product.