First off, upgradable video cards are not the answer. People are slaves to the idea that you need to upgrade your video card every time nVidia and ATI release something new.
That's true.
Consoles have proven that that is not the case.
True to a certain extent.
The Xbox 360 has some excellent graphics capabilities that rival some of the newest cards released almost two years after it.
Mostly false. The 8800GTX, which came out a year ago, far exceeds anything the 360's GPU is capable of, both on paper and in the real world. That's even considering that the 360 renders at a much smaller resolution (Halo 3 runs at 1152x640 and is then upsampled by the 360's scaler). It's true that the 360 does rival some newer mid-tier cards and a lot of ATI offerings, but when the 360 was less than a year old, it was already beaten about the head and shoulders graphically.
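For rough context on that resolution gap (using common PC gaming resolutions of the time as the comparison point): 1152 x 640 is about 737,000 pixels per frame, while 1600 x 1200 is about 1.92 million and 1920 x 1200 is about 2.3 million, so a PC card is pushing roughly 2.5 to 3 times as many pixels.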
The PS3 has an even more powerful one than the 360.
Even though it has an unusual architecture, the 360 technically has the more powerful GPU.
Both consoles are more than likely going to see cutting edge games for at least another two years.
Mostly true. The titles that come out on the PC will just feature higher resolutions, larger textures, more lighting and particle effects, better framerates, etc. (video card dependent, naturally).
The reason PC gaming requires you to upgrade is because of poor game development. That's why console games have actually started to surpass PC games. A closed hardware platform is much better for really pushing the capabilities of the system. The wider the range of graphics cards, the more programming junk has to be put in there. You won't see many games taking full advantage of the video cards, because to do that they have to develop so many controls to turn so much off for the lower-end cards. Very few people are going to buy a $700 video card every time a new game they want to play comes out.
This is pretty much completely wrong. First, PC gaming doesn't require you to upgrade anywhere near as often as you seem to be implying. Secondly, console games passing PC games is a pretty vague and subjective claim. Do you mean graphically? If so, you're pretty much incorrect. Now, in some ways they have, like Gears of War coming to the 360 long before the PC. But that's not because a closed platform is better for pushing the capabilities of the system; it's because console games sell a metric crap-ton more than computer games, because consoles are cheaper than computers. Consoles sell more games, which means they get allocated more resources by many software developers.

And of course, console gaming is cheaper than PC gaming, even at an equivalent visual experience. Since it's standardized commodity hardware, it's cheaper to build a console than a gaming computer, plus consoles are subsidized by the company, whereas gaming computers and computer parts are seen as an enthusiast market, which means higher margins, which means more expensive.

The wider range of graphics cards does not make programming harder; that's what APIs are for. And I can't think of one game that would require a $700 video card to play. The most demanding game announced is Crysis, and it'll play WELL on a card that cost $500 two years ago. The problem here is that ATI and Nvidia both offer cheap crap that won't play games, people buy it, and then they bitch when a game comes out a year later that won't play, and then assume you need to buy a new card every year. ATI and Nvidia are basically taking advantage of uninformed consumers.
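To put the API point concretely, here's a rough sketch in C against plain OpenGL (the struct and function names are made up for illustration, and it assumes a GL context has already been created by something like SDL). The game asks the driver what the installed card can do and scales its settings from there, instead of shipping per-card code paths:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Illustrative settings struct; these names are invented for the example. */
struct gfx_settings {
    int max_texture;   /* largest texture dimension the game will load */
    int use_shaders;   /* 0 = fall back to the fixed-function path */
};

/* Call only after a GL context exists (created by SDL, GLUT, etc.). */
void pick_settings(struct gfx_settings *s)
{
    GLint max_tex = 0;
    const char *ext;

    /* The driver reports the card's limits through the same calls,
       no matter whose silicon is underneath. */
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_tex);
    s->max_texture = (max_tex > 2048) ? 2048 : (int)max_tex;

    /* Optional features are detected from the extension string instead
       of being hard-coded per card. */
    ext = (const char *)glGetString(GL_EXTENSIONS);
    s->use_shaders = (ext && strstr(ext, "GL_ARB_fragment_shader")) ? 1 : 0;
}

Direct3D does the same kind of thing through device caps. Hiding the hardware differences is the whole job of the API layer.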
I think people are going to see a major revolution in computer gaming, and I think Apple is going to come out the winner.
Not with their market share. More "gaming" PCs are sold than the total number of Macs, and most Macs are completely **** for games because of the hardware.
Right now, if you buy a new video card, you have to run Vista if you want to use its new DirectX10 features. That's actually why EA and Apple are getting along so well. EA and nVidia tried to get Microsoft to release DirectX10 for XP, but Microsoft wants to force people to upgrade to Vista to use that fancy new video card. That is hurting both nVidia and EA, who do not want to develop for Vista since so many people hate Vista (the fact that Microsoft sold a downgrade so you could go back to XP without erasing is a big telling point).
This giant snafu is going to benefit consoles more than it will Apple.
While people are making fun of Apple's current video cards, like the ATI HD 2600 in the new iMacs, those cards are OpenGL 2.0 cards, which as far as developers are concerned is far superior to DirectX10. Take WoW, for example: on the same hardware it took a 45% performance hit when Vista came out, because it uses OpenGL, which in Vista has to go through DirectX10 as an emulation layer. OpenGL also supports multicore processors and is rumored to support multicore GPUs.
Um? So? The entire GeForce 8 series is OpenGL 2.1 as well. Vista's implementation of OpenGL may indeed suck, but that doesn't negate any of the comments made about the video card in the iMac.
Right now, if you do not like Vista, there is almost zero reason to buy a new video card, since 90% of their new features require DirectX10. Valve is being very stupid not to go for the Mac. Mac OS X is becoming a very popular OS, and a lot of people are switching because Vista really does suck.
Numbers do not support your assertion in any meaningful way. People may indeed be switching to Mac, but not in any great number. Most people don't care. They're literally indifferent towards computers. Most gamers stay with Windows because that's where all the games are. All the enthusiast gamers will stay with Windows because of the hardware. That's a lot of inertia to overcome.
I honestly suspect that Valve simply does not want to worry about programming efficiently and would rather just put out a game that pretty much requires you to upgrade your video card, because they know there are many loyalist gamers willing to shell out big bucks for a new video card to play the latest and greatest that their favorite game company puts out.
Yes, because The Orange Box is so demanding that I had to shell out 250 bucks on a video card in 2006 to play it when it comes out in 2007. (Actually, you didn't have to spend that much.) I mean, if you're going to throw out conjecture, you should think it through beforehand.
This won't change till people get tired of being told they have to upgrade because the company making the game didn't want to optimize its code to take advantage of current hardware.
Or until people stop repeating uninformed opinions as fact.