Couldn't you just repost it and leave out the Gates/Jobs part? I for one am curious about what you had to say about this part if it was at all relevant to the problems with which Yosemite has plagued us:
"the progression of the state of GUIs up until the present day between old school Mac, Amiga, MS-Dos, Windows 3.x and modern OSes,"
Regards, Etan
Well, I didn't save the message, so it's pretty much just gone. All I'll say about the Wikipedia pages is that they don't cover everything known or touch on every aspect (i.e. if someone is questioning whether something I say is accurate). They are written by everyday volunteers, after all. I lived through computers from the early '80s until today and I've seen a LOT of GUIs come, go and change along the way. Many had strong points as well as weak points. It depends on what you were looking for and what you liked to do.
For example, is MS-DOS 'better' than the early Mac OS of the same period? Windows 3.0 didn't come out until 1990. That's a long time in computer years. Mac OS 1.0 might be limited, but certain things changed over those years. The Mac II in 1987 had a 16-million-color palette with 256-color modes and could even be upgraded to drive multiple monitors. What the heck could MS-DOS do in 1987? VGA graphics didn't come out until 1987 and sure as heck didn't get any widespread use for many years, and EVERYTHING and I mean EVERYTHING in MS-DOS required software driver support, from a sound card to VGA, let alone any higher resolutions, etc.
By the mid-'90s, you could run some interesting games from MS-DOS in 256 colors, but the Amiga could run 32-color games in 1985 (all Amigas supported it from the start, so all games could use it). EGA with 16 colors didn't come out until 1984, support was terrible and sound support was non-existent. This is why Amiga games RULED the latter half of the 1980s for sheer visual and sound impact (with the C64 before it having some of the best early games ever made). I fully blame Commodore's management for screwing up their opportunities. They had a half-decade lead on all competition, and with the Video Toaster they could have ruled the world at that point if it weren't for their total lack of vision for business support with the Amiga and their slow pace in upgrading the GUI so everything could be controlled from the GUI alone without a CLI. Lack of hard drive support on the lower models for so many years ensured games would never progress, and because so many games relied on custom-chip tricks to get the best performance, hardware changes broke games on the Amiga 3000 and 4000 series. Commodore sold a ton of C64s, but couldn't get the same traction with the Amiga, possibly partly due to so many C64 users being happy with gaming on that platform (the number of games made for it quite possibly dwarfs everything made for all other systems combined up to that point, with over 23,000 games made for it during its lifetime).
In any case, there's a difference between what makes a great GUI and what makes a great machine for specific purposes like gaming, video editing, publishing, accounting, etc. Those tasks can often be accomplished even on bad OSes so long as the software itself is good and the hardware supports it. You didn't even need a GUI on a C64 to run all kinds of software, from Font Master II (I had excellent word processing results using it back then, to the amazement of my teachers) to those 23k games. You loaded games from BASIC. Amiga games could normally be started by just putting a disk in the floppy drive and turning on the power (as simple as using a cartridge in a console).
Yes, the Amiga had awesome multitasking capability that neither the PC nor the Mac had until well into the mid-1990s (in the Mac's case, it didn't get true multitasking until OS X came out, and Windows NT didn't come out until 1993 and wasn't for gaming, so most home users had to pick between a serious OS and entertainment. While Windows 95 offered "Windows" with real gaming potential for the first time, it was still sitting on top of DOS and crashed itself silly compared to Windows XP, which was the first "NT" cross-breed that supported things like gaming well, and most people here know how long XP's life lasted given many are still using it on some computers to this day).
But what's in a GUI? If I'm really only a gamer, do I care? Maybe not. But most people still have to use the OS these days for the Internet, if nothing else. There's also more to a GUI than graphical appearances. How stable is it? How well does it network? The Mac could network easily in a day and age when DOS was clumsy at best. But if you're not using that capability, then it probably doesn't matter much, so not everything is easy to compare. Then you have your CLI fans who will argue until they're blue in the face that it's way better than a GUI even today (e.g. hard-core Linux users). Is a mid- to high-end Mac in 1994 better than a fully decked-out PC running MS-DOS? It depends on what you're going to do with it and on software availability, so it's not as easy a question as it sounds.
Graphically speaking, the Amiga had a color window-based GUI in 1985. It offered up to 16 colors in high resolution (640x400 NTSC interlaced), but until the first flicker-fixer boards became available for the Amiga 2000 (standard in an Amiga 3000), you had to put up with annoying interlaced graphics (whether that was usable depended on the color palette chosen and how sensitive you were to the flicker). However, if your purpose was logos for TV commercials via genlock devices, it didn't matter since NTSC was interlaced too! This made Amigas a hard sell for high-res business software until the Amiga 3000, and by then the rest of the market was starting to catch up. If AGA had come out with the Amiga 3000, it might have helped, but alas we'll never know. The Amiga was never designed for the Internet, but the fact that I could surf the Web pretty well on a 68030-based Amiga 3000 well into the late '90s with only a few issues (Java was a non-starter if you liked to play card games on Yahoo, for example) shows how versatile the machine was and how well the OS could cope with new challenges. Being library-based, anyone could write their own custom libraries for improved API support even after Commodore was sold. That isn't quite the same level as open-source Linux, but it was certainly more flexible than trying to write your own windowing system for MS-DOS (although clearly that is what Microsoft had to do in order to make Windows 95 through Windows ME work, along with Windows 3.x and earlier).
The reason GUIs were so flat back then should now be obvious: they had limited color potential. How do you do transparent windows or quality shading with 16 colors? You can do basic drop shadows and line the window borders with lighter/darker colors to give the windows some three-dimensional qualities, and we started to see that even back then. But limited colors made some things impossible, or at least they didn't look very good, until we progressed to 24-bit color modes, which made skeuomorphism possible.
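To put that in concrete terms, here is a rough sketch in Python (nothing from any real toolkit; the palette slice and values are made up purely for illustration) of the old "bevel" trick versus what genuine transparency requires:

# Rough sketch: faking depth with a fixed 16-color palette vs. true alpha blending.
# The palette entries and RGB values below are invented for illustration only.

PALETTE = {
    "grey":       (170, 170, 170),
    "light_grey": (220, 220, 220),
    "dark_grey":  (85, 85, 85),
}

def bevel_colors(face="grey"):
    # Classic '80s/'90s trick: draw the top/left window edges with a lighter
    # palette entry and the bottom/right edges with a darker one. No blending,
    # just picking from the handful of colors you already have.
    return {"top_left": "light_grey", "face": face, "bottom_right": "dark_grey"}

def alpha_blend(fg, bg, alpha):
    # True transparency: mix foreground and background per channel.
    # Trivial with 24-bit color, but the result usually isn't in a 16-color
    # palette at all, so you'd have to dither or snap to the nearest entry.
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

print(bevel_colors())                                  # the cheap 3D look
print(alpha_blend((255, 255, 255), (0, 0, 128), 0.5))  # (128, 128, 192), not in the palette

The point is that the bevel look costs nothing extra, while every blended pixel of a transparent window produces a color you probably don't have in a 16-color palette.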
Early 24-bit color modes lacked much in the way of acceleration, so using them beyond a frame buffer for photo display or editing wasn't very realistic. Thus you didn't get much in the way of change to GUIs throughout the '90s despite improvements in the video cards that supported such modes. Gaming was still mostly limited to 256 colors until the latter '90s, when 16-bit color started to be used along with 3D video acceleration. Not until Windows Vista/7 and OS X (and Compiz on Linux) did you really start to get 3D and transparency effects used to any degree, in the mid-2000s.
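A quick back-of-the-envelope calculation (the throughput figure is an assumed ballpark, not a benchmark of any particular machine) shows why redrawing a full 24-bit desktop in software was such a non-starter:

# Rough, illustrative numbers only -- not measurements of any specific card or CPU.
width, height = 1024, 768
bytes_per_pixel = 4            # 24-bit color is typically stored as 32 bits per pixel
frame_bytes = width * height * bytes_per_pixel
print(frame_bytes / (1024 * 1024), "MB per full-screen redraw")       # 3.0 MB

# Assume the CPU can push on the order of 20 MB/s into video RAM over the bus
# (a made-up ballpark for a mid-'90s system, purely for illustration).
assumed_bus_mb_per_s = 20
redraws_per_second = assumed_bus_mb_per_s / (frame_bytes / (1024 * 1024))
print(round(redraws_per_second, 1), "full-screen redraws per second at best")  # ~6.7

At single-digit full redraws per second, anything fancier than blitting a static image was off the table until video cards started accelerating those operations themselves.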
Now we seem to have gotten bored with 3D layers, animation effects and real-life imitation (skeuomorphism), and to be "different" we're now heading backwards to "flat" interfaces, but in 24+ bit color. Inevitably that will get "old" as well, and then what? Rotating spheres?
What some of us are arguing is that "splash" is for a "wow" effect and can be fun, but USABILITY is what actually matters. If you're going to screw with graphics just to keep things "wow" then you better be darn sure you don't screw usability up in the process or make eyes bleed. Unfortunately, that is EXACTLY what Apple has done with Yosemite and iOS 7/8.
The SOLUTION is simple. Start offering PREFERENCE selection for graphics and interfaces. That applies to Windows just as much as OS X. I and others no more want to use Metro in Windows than Yosemite in OS X. I don't mean newer OS features and/or the associated bugs (those come and go). I mean the interface and the graphical LOOK. There is no technical barrier to offering a CHOICE of visuals, or even of styles of operating something like Spaces or Metro's Start screen. These are choices forced upon users by Apple and Microsoft. With Linux, you can pretty much just choose another distribution, but even there people have been let down by sudden "new directions" in their formerly favorite interfaces, like Gnome 3's sudden departure from everything previous. That forces people who HATE the new interface changes to go find alternate builds and branches.
With the Mac, however, this isn't a real option since newer Macs typically REQUIRE the newer OS to even operate. With Windows, you can typically expect things to work with older OS versions for quite some time (e.g. assuming you can buy a copy of Windows 7, most things will still work just fine).