
Demoman

macrumors regular
Original poster
Mar 29, 2005
194
0
Issaquah, WA
Any guess why Apple did not continue with the 7800 card and went with the X1900? Is the X1900 a better card, the extra memory notwithstanding? If I understand correctly, the 7800 is almost identical to the Quadro 4500 except the 3D functionality has been disabled.
 

Chone

macrumors 65816
Aug 11, 2006
1,222
0
The X1900XT blows the 7800 out of the water and back TWICE.

Okay, maybe not THAT much, but seriously, the X1900XT is the better card. I don't know what's up with the benchmarks on Apple's site; the Quadro 4500 should not beat the X1900XT in any game, even Doom 3. I don't know if the results were rigged so the 4500 doesn't appear inferior, but the truth is that, as a prosumer card, the X1900XT is the best the Mac Pro has to offer. Leave the Quadro for workstation-related rendering chores.
 

eXan

macrumors 601
Jan 10, 2005
4,738
134
Russia
What about GeForce 7900GTX vs Radeon X1900XT?

I heard that the GeForce is faster, but shouldn't they both be in the same performance category?
 

macgeek2005

macrumors 65816
Jan 31, 2006
1,098
0
Demoman said:
Any guess why Apple did not continue with the 7800 card and went with the X1900? Is the X1900 a better card, the extra memory notwithstanding? If I understand correctly, the 7800 is almost identical to the Quadro 4500 except the 3D functionality has been disabled.

Are you kidding me? Compare the X1800XT to the GeForce 7800, and THEN you have a semi-fair comparison. The X1900XT blows the 7800GT out of the water.
 

solvs

macrumors 603
Jun 25, 2002
5,684
1
LaLaLand, CA
I'm sure they'll offer something else eventually, like whatever the next GeForce is. For now, that was probably the best they had for the price. A 7800 wouldn't have been worth what it would have cost.
 

Demoman

macrumors regular
Original poster
Mar 29, 2005
194
0
Issaquah, WA
macgeek2005 said:
Are you kidding me? Compare the X1800XT to the GeForce 7800, and THEN you have a semi-fair comparison. The X1900XT blows the 7800GT out of the water.

I was asking a question, geek, OK? And it was answered by others with more class than you.
 

Chundles

macrumors G5
Jul 4, 2005
12,037
493
Graphics cards are bloody annoying me now, X1900XTX, X1950XT Pro, 7800GTX Turbo, 7950XTX GT Pro Turbo supercharged...

Can someone fill this in for me? One card from Nvidia and one from ATI:

Low range:

Mid Range:

Top Range:

Seems to me that whenever someone comes out with a graphics card, the next day the other company comes out with something twice as good.
 

Lollypop

macrumors 6502a
Sep 13, 2004
829
1
Johannesburg, South Africa
Chundles said:
Graphics cards are bloody annoying me now, X1900XTX, X1950XT Pro, 7800GTX Turbo, 7950XTX GT Pro Turbo supercharged...

Can someone fill this in for me? One card from Nvidia and one from ATI:

Low range:

Mid Range:

Top Range:

Seems to me that whenever someone comes out with a graphics card, the next day the other company comes out with something twice as good.


I used to follow the graphics card market, knew all the chips and everything, but you know what, it changes WAY too fast... so here is my opinion....

Low range: ATI: X13?? NVidia: 73??

Mid Range: ATI: X16?? Nvidia: 76??

Top Range: ATI: X19?? NVidia: 79??

The ?? is mostly just clock speeds, memory speeds and memory sizes; the basics come down to pipe and shader counts etc.

Methinks Apple just went with ATI to have both vendors release a graphics card for the new Intel/EFI architecture.
 

Origin

macrumors regular
Aug 11, 2006
115
0
Nantes, France
Actually, performance-wise, the GeForce 7300 GT is closer to the X1600 (in most tests the 7300 GT is just behind the X1600 Pro/Mobility) than to the X1300 series ;)
 

ergle2

macrumors 6502
Aug 27, 2006
399
0
Origin said:
Actually, performance-wise, the GeForce 7300 GT is closer to the X1600 (in most tests the 7300 GT is just behind the X1600 Pro/Mobility) than to the X1300 series ;)

True, tho the new X1300 XT is identical to the X1600 Pro -- but with a 10MHz clock speed bump for the core.

Confused yet? Don't worry, you will be...
 

Chundles

macrumors G5
Jul 4, 2005
12,037
493
ergle2 said:
True, tho the new X1300 XT is identical to the X1600 Pro -- but with a 10MHz clock speed bump for the core.

Confused yet? Don't worry, you will be...

I was confused years ago when my Dad bought a Riva TNT card so I could play Quake 2. Haven't played a proper game in nearly ten years.

And another thing, how come "Pro" is not as good as "XT"? Surely the card makers want "Pro" to be associated with their best? Maybe it's just me, or that the base model Ford Falcon is the "XT", but it just doesn't lend itself to a high-end card.

Time to rationalise the card industry. The human eye can't tell the difference above about 70fps anyway, so there's not much to be gained from cards that will run some game at 200fps; they might as well start looking at running at 80fps but have it look photo-realistic. Then let the proper pros have their massive cards for all the motion picture and graphical stuff.
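
To put rough numbers on it (just back-of-the-envelope frame-time arithmetic in Python, nothing scientific):

# Frame-time budget at the framerates mentioned above: chasing 200fps
# only buys back a few milliseconds per frame over 80fps, time that
# could arguably go into better-looking rendering instead.
for fps in (70, 80, 200):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
#  70 fps -> 14.3 ms per frame
#  80 fps -> 12.5 ms per frame
# 200 fps -> 5.0 ms per frame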
 

Lollypop

macrumors 6502a
Sep 13, 2004
829
1
Johannesburg, South Africa
Chundles said:
Time to rationalise the card industry. The human eye can't tell the difference above about 70fps anyway, so there's not much to be gained from cards that will run some game at 200fps; they might as well start looking at running at 80fps but have it look photo-realistic. Then let the proper pros have their massive cards for all the motion picture and graphical stuff.

This is where 3dFx went wrong: they (before being gobbled up by nVidia) focussed on quality, and nobody bought any of their cards.... :eek:
 

Chone

macrumors 65816
Aug 11, 2006
1,222
0
Chundles said:
Graphics cards are bloody annoying me now, X1900XTX, X1950XT Pro, 7800GTX Turbo, 7950XTX GT Pro Turbo supercharged...

Can someone fill in this for me? One card from Nvidia and one from ATI:

Low range:

Mid Range:

Top Range:

Seems to me that whenever someone comes out with a graphics card the next day the other company comes out with something twice as good.


Low range: 7300GT, X1300XT (X1600Pro)

Mid Range: 7600GT, X1800GTO

Top Range: 7950GX2 (Quad SLI), X1950XTX (Crossfire)

I think that's about right.
 

ergle2

macrumors 6502
Aug 27, 2006
399
0
Chundles said:
Time to rationalise the card industry. The human eye can't tell the difference above about 70fps anyway, so there's not much to be gained from cards that will run some game at 200fps; they might as well start looking at running at 80fps but have it look photo-realistic. Then let the proper pros have their massive cards for all the motion picture and graphical stuff.

It's about running recent games in VERY high resolutions with high levels of anisotropic filtering and anti-aliasing with full eye candy -- and about being able to do so for a long enough period to make buying such a card worthwhile.

Beyond this, the next generation of games will offer the ability to offload physics processing onto a secondary/tertiary GPU via at least one of the upcoming third-party physics libraries -- Havok, I think.

Don't need it? Then the cheaper single card options will do you just fine. :)
 

Dont Hurt Me

macrumors 603
Dec 21, 2002
6,055
6
Yahooville S.C.
You mean a PPU :D and they have physics cards now; Alienware has 'em as options. Problem is, your game has to have software that uses the card, and most games don't.
 

ergle2

macrumors 6502
Aug 27, 2006
399
0
Lollypop said:
This is where 3dFx went wrong, they (before being gobled up by nVidia) focussed on quality, nobody brough any of their cards.... :eek:

3dFX's own mistakes and mismanagement were what killed them.
  • They insisted 16-bit (pseudo-22-bit) rendering was "good enough" when the competition offered 32-bit.
  • Their cards only supported 256*256 textures long after the competition offered 2048*2048 support.
  • They bought STB and started making/selling cards themselves. OEMs didn't like that kind of competition, and where did they run to? nVidia.
  • They were late to the "integrated 2D/3D" party -- and the Banshee was slower in many games than the 3D-only VooDoo 2 due to having just a single texture unit (tho it was around 10% faster in terms of clock speed..). No multi-texturing in a single pass meant Quake 2 was slower than when run on the competition.
  • They rejected hardware T&L in favor of the "T-Buffer" -- and who knew what a T-Buffer was?
  • They were eternally late, constantly missing product cycles/deadlines and releasing contingency products -- the RAMPAGE chipset was originally due around the time VooDoo 2 shipped! This continued right up 'til they died and affected pretty much every product launched after the original VooDoo.
  • All the VSA line (VooDoo 4-6) really had to offer was better AA without a major performance hit. The V4 card was slower than the GeForce2 MX, which was nVidia's budget card of the day. Plus, the V6 never saw the light of day.
  • They didn't support DirectX acceleration under Windows 2000, only under Win9x (OpenGL and their own GLIDE API were accelerated in W2K). The competition did.
  • Their spending on employees was legendary, with reports of ~$30K on employee lunches right up 'til they closed the doors.

3dFX were very much the architects of their own downfall, I'm afraid.
 

ergle2

macrumors 6502
Aug 27, 2006
399
0
Dont Hurt Me said:
You mean a PPU :D and they have physics cards now; Alienware has 'em as options. Problem is, your game has to have software that uses the card, and most games don't.

No, I mean physics on the card's GPU. Both ATI and nVidia have demonstrated this.

Check this out.

The PPU is the PhysX card, and it's slow. PhysX is a direct competitor to Havok.

The current PhysX card is also notorious for slowing down games when it is enabled -- arguably due to support for it being added late in the game's lifecycle -- but it's not impressive, especially given how much the card costs: more than many graphics cards, and it's PCI only, not PCIe.

The Havok library currently features a software fallback for systems that don't have a "spare" GPU.
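
Roughly, the dispatch pattern would look something like this (a minimal Python sketch with entirely made-up names, not Havok's actual API; it just illustrates routing physics to a spare GPU when one exists and falling back to software otherwise):

# Hypothetical sketch of the "spare GPU" fallback idea described above.
# All names here are invented for illustration; a real engine would query
# the driver for an idle secondary GPU and batch the work into shader passes.

def detect_spare_gpu():
    # Stub: pretend no secondary GPU is installed.
    return None

def step_physics_cpu(bodies, dt):
    # Software fallback: plain CPU integration, always available.
    for b in bodies:
        b["vy"] -= 9.8 * dt      # apply gravity to vertical velocity
        b["y"] += b["vy"] * dt   # integrate position

def step_physics_gpu(gpu, bodies, dt):
    # Placeholder for the offloaded path (physics run as shader passes).
    raise NotImplementedError

def step_physics(bodies, dt):
    gpu = detect_spare_gpu()
    if gpu is not None:
        step_physics_gpu(gpu, bodies, dt)  # offload to the spare GPU
    else:
        step_physics_cpu(bodies, dt)       # software fallback path

bodies = [{"y": 10.0, "vy": 0.0}]
step_physics(bodies, 1.0 / 60.0)
print(bodies)  # position and velocity after one 60Hz step on the CPU path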
 

jiggie2g

macrumors 6502
Apr 12, 2003
491
0
Brooklyn,NY
Chone said:
Low range: 7300GT, X1300XT (X1600Pro)

Mid Range: 7600GT, X1800GTO

Top Range: 7950GX2 (Quad SLI), X1950XTX (Crossfire)

I think that's about right.


I think money is what people care about. For the most part this is right, your chart that is. All cards have been updated to the newest version for comparison.

Ultra High End: $450-550 = 7950GX2 / X1950XTX (DDR4)

High End = $350-450 = 7900GTX / X1900XTX (DDR3)

Lower High End = $250-350 = 7950GT / X1900XT (DDR3)

Mid End = $199-250 = 7900GS (20 pipe) / X1950 PRO (36 shaders)

Low End = $150-199 = 7600GT (12 pipe/128bit) / X1650XT (12 pipe/256bit)

Budget End = $99-149 = 7600GS / X1650 Pro (8 pipe)

Basic End aka Crap End = 7300GT / X1300XT

There you have it.
 

Mr. Mister

macrumors 6502
Feb 15, 2006
440
0
ergle2 said:
The current PhysX card is also notorious for slowing down games when it is enabled -- arguably due to support for it being added late in the game's lifecycle -- but it's not impressive, especially given how much the card costs: more than many graphics cards, and it's PCI only, not PCIe.
Or maybe "arguably" because the game detects the PhysX card and enables dozens of particle effects and fluid areas that otherwise wouldn't exist in the game, calculating them with a marginal slowdown while at the same time running the game with a physics detail that would slow the computer down 400% if it didn't have a PhysX card. Saying that adding a PhysX card slows games down is a very flat way of looking at it, it's like saying adding a 7900GTX and playing FEAR is inferior to running Quake II on a 9800 Pro because the latter gets a better framerate than the first, disregarding that FEAR is pushing realtime light and shadow effects while Quake II is barely 3D.
 

ergle2

macrumors 6502
Aug 27, 2006
399
0
Mr. Mister said:
Or maybe "arguably" because the game detects the PhysX card and enables dozens of particle effects and fluid areas that otherwise wouldn't exist in the game, calculating them with a marginal slowdown while at the same time running the game with a physics detail that would slow the computer down 400% if it didn't have a PhysX card. Saying that adding a PhysX card slows games down is a very flat way of looking at it, it's like saying adding a 7900GTX and playing FEAR is inferior to running Quake II on a 9800 Pro because the latter gets a better framerate than the first, disregarding that FEAR is pushing realtime light and shadow effects while Quake II is barely 3D.

Indeed. However, the benchmarks I saw suggested that with full effects the game would be borderline playable, and the eye candy -- and that's all it added in this case -- shown in the movie wasn't that big an improvement, so it seems rather pointless based on that example. I am, however, willing to accept that support for the card was likely added very late in the dev cycle, and thus the engine's support could be rather sub-optimal for the title in question (GR:AW).

Adding the card should be pretty much transparent in the sense that it shouldn't drag the rest of the system down, which this obviously does -- especially when the current PhysX card is the only one. It's not a moving target like supporting the huge capability gulf between, say, GMA950 and GeForce MX cards at one end and the latest nVidia 7900GX2 or ATI X1950-range cards at the other.

It shouldn't be about rendering "more"; it should be about improving realism -- because rendering more still hits the GPUs hard, and if they don't have power to spare, adding more into the mix isn't helping...

It's also not really akin to Q2 vs FEAR, which would be several orders of magnitude in a very visible sense. It's more like changing the detail level in FEAR from low to high and comparing things that way.

Either way, what I've seen thus far suggests that the PhysX card isn't terribly impressive, especially considering the price-tag. Maybe some newer games will make better use of it, or maybe it's the hardware implementation itself that's the problem...
 

Lollypop

macrumors 6502a
Sep 13, 2004
829
1
Johannesburg, South Africa
ergle2 said:
3dFX's own mistakes and mismanagement were what killed them.
  • They were late to the "integrated 2D/3D" party -- and the Banshee was slower in many games than the 3D-only VooDoo 2 due to having just a single texture unit (tho it was around 10% faster in terms of clock speed..). No multi-texturing in a single pass meant Quake 2 was slower than when run on the competition.
  • All the VSA line (VooDoo 4-6) really had to offer was better AA without a major performance hit. The V4 card was slower than the GeForce2 MX, which was nVidia's budget card of the day. Plus, the V6 never saw the light of day.

3dFX were very much the architects of their own downfall, I'm afraid.

Spending and delayed product launches aside, you actually corroborated my point: 3dFx believed that 30fps was enough and that people wanted quality. nVidia and ATI, each in their own way, focussed on performance, and in that era people actually wanted their new-generation games to perform well... it's still the same: we buy a new card so that our new games can run decently and our old games can look good... 3dFx had a vision, but it was ultimately a flawed vision.

Sorta back on topic: does anyone think that the goodness coming in Leopard will require something more than an entry-level card?
 