A very basic question - I use a Mac Pro Quad with a 7300 GT graphics card. What new graphics card (if any) would give me a noteworthy performance improvement with Photoshop CS3 and Final Cut Express?

Thanks much.

I think Final Cut uses the GPU to help with rendering, color space conversions, live type and such. I don't think PS CS3 uses the card for much of anything; a few of the filters might.
 
From a pure performance standpoint, NVIDIA's card is the victor. For the more price-conscious, ATI's HD3870 is an excellent deal.
That is, if it is the single card we are talking about and not the dual-GPU one. But since you say it is a 512MB card, it must be the single one.


However, IF it does support CrossFire, you can get two of these and they will smoke the 8800GT. But that is unconfirmed, and it won't work in pre-2008 Macs.


[Attached benchmark charts: Crysis, 3DMark, World in Conflict, plus one unlabeled chart]


I think you get it :)

Yeah, we get it, Tom Pabst... oops, I mean OP. FPS means zip when the quality is in the toilet. :rolleyes:
 
A very basic question - I use a Mac Pro Quad with a 7300 GT graphics card. What new graphics card (if any) would give me a noteworthy performance improvement with Photoshop CS3 and Final Cut Express?
I don't think either of those take advantage of the GPU. Well, Photoshop might with its few 3D features. Adobe relies on the CPU. Apple's Motion would take advantage of the new GPU, though.
 
I don't think either of those take advantage of the GPU. Well, Photoshop might with its few 3D features. Adobe relies on the CPU. Apple's Motion would take advantage of the new GPU, though.
How much of Core Graphics and Core Video is hardware accelerated? HD video isn't, for example. I wonder if more Apple apps would be hardware accelerated, and whether vendors like Adobe would use those frameworks, if the underlying Apple frameworks supported it.
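For anyone wondering what "hardware accelerated through Apple's frameworks" looks like in practice, here is a minimal Core Image sketch (written in modern Swift for readability, so the API style postdates this thread). The framework compiles the filter graph at render time and decides whether it runs on the GPU or falls back to the CPU; the file path and the blur filter are just placeholders.

```swift
import Foundation
import CoreImage

// Ask Core Image for a GPU-backed context; whether the work really lands on the
// GPU depends on the renderer Core Image picks and on the graphics driver.
let context = CIContext(options: [.useSoftwareRenderer: false])

// Hypothetical input image (placeholder path).
guard let input = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/frame.png")) else {
    fatalError("couldn't load image")
}

// Build a filter graph; nothing is executed yet.
let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(8.0, forKey: kCIInputRadiusKey)

// Rendering happens here: Core Image compiles the graph for the chosen
// renderer (GPU shader program or CPU fallback) and executes it.
let rendered = context.createCGImage(blur.outputImage!, from: input.extent)
print(rendered != nil ? "rendered" : "render failed")
```

The point being debated in this thread is that when apps route their image work through Core Image like this, the quality of the vendor's driver matters as much as raw shader horsepower.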
 
ATI v NVIDIA

For what it's worth, my Mac Pro shipped with the 2600XT and I upgraded to the 8800GT, but I have been incredibly disappointed with the 8800.

I have always felt that ATI has served the Mac market better than NVIDIA; it is unfortunate that the AMD 'merger' has robbed them of their focus.

If they do produce a 3870 for OS X, it's very likely that I will replace the 8800. I feel that the ATI drivers are far better for OS X than NVIDIA's, and I would be very surprised if OS X gaming performance is equivalent. Crysis isn't available on the Mac, so I am not overly concerned. Yes, I have played it; yes, it is graphically stunning; but I will not keep the 8800 for one game.

I also have to wonder whether ATI's shader strategy won't start paying off soon.

Cheers
 
ATI typically writes their own drivers and hands them off to Apple. Nvidia hands over the source code to Apple and has them write their own drivers. We all know that Apple isn't going to put more than a minimal effort into anything graphics-related.

By the way, BenRoethig, I love your signature! :D;)
 
I thought the xMac was something people not currently owning an Apple computer wanted. I see many content Apple owners out there currently.

Really, not to start (yet another) xMac debate, but I would love Apple to release a mid-tower Mac.

I really don't need a $3000 8-core machine, since most of what I do is not going to benefit from that many processors. But the things that I actually do would be helped tremendously by a single faster processor. A 3.2GHz desktop-class dual-core would be better for me than the 2 x dual or quad workstation/server-class chips in my current machine.

Most of the software I run is user-limited (word processing, page layout, etc.), so faster UI response is more important to me than cranking through multi-threaded three-hour video encodes or 3D renders. Ten 500MHz processors aren't going to make Leopard's UI any faster, but one 3,000MHz processor would be like night and day. For the users who need workstation-class power, obviously the octo-core (or higher) Mac Pros would still be available.
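That intuition is basically Amdahl's law. A rough sketch of the arithmetic (in Swift, purely for illustration; the 90% serial fraction is a guess, not a measurement):

```swift
import Foundation

// Amdahl's-law sketch: with a serial fraction `s`, throughput on `cores`
// processors at `clockGHz` each is 1 / (s + (1 - s)/cores), scaled by clock.
func relativeSpeed(clockGHz: Double, cores: Double, serialFraction s: Double) -> Double {
    let time = (s + (1 - s) / cores) / clockGHz   // time relative to one 1 GHz core
    return 1 / time
}

// Assumed 90% serial workload (interactive apps, UI) -- an illustrative guess.
let manySlow = relativeSpeed(clockGHz: 0.5, cores: 10, serialFraction: 0.9)
let oneFast  = relativeSpeed(clockGHz: 3.0, cores: 1,  serialFraction: 0.9)
print(String(format: "10 x 500 MHz: %.2fx, 1 x 3 GHz: %.2fx", manySlow, oneFast))
// Prints roughly 0.55x vs 3.00x: for mostly-serial work the single fast core wins.
```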

Even though I've got 4 hard drives in my machine currently, I'd be just as happy with three bays. And if Apple didn't put in these enormous workstation-class Xeon processors, power supplies and heatsinks, a smaller mid-tower case could still accommodate 4 hard drive bays. Heck, my home-brew AMD machine has room for 4 hard drives, two opticals and a floppy!

And even though Apple doesn't want me upgrading the machine (since they'd rather sell me a new one), using a standard Intel desktop processor means more upgrade potential.

For me, I would never buy an iMac for my own use - I don't want to be restricted to the display the machine came with. And I might be tempted to replace my machine more frequently if it were cheaper. It's hard to replace $2700 towers every 18 months, you know.

I really think a mid-tower from Apple would sell like hotcakes. I can't imagine Apple would have a harder time turning a profit off smaller, less expensive towers. The cases, power supplies, motherboards/chips and processors would all be less expensive to buy at wholesale, so I'd think they could maintain their traditional 25% profit markup or whatever Apple tends to charge. Or perhaps it would be a bit less to remain competitive with cheap boxes like Dell and Acer, but still Apple could expect a premium for the brand name.
 
The 8800 is a great gaming card but NOT impressive running Apple Pro apps that rely on Core Image. The Radeon 3870 may not run games quite as fast as the 8800, but it's a better "balanced" card if you are using it for both games and "real work."

Is this because of the card itself or because of driver issues that could be resolved later?

Note that the 8800GT has had playback issues for many users, including myself (I just bought one). I mean, at present, I lose frames and get audio sync problems watching QT movies. My 7300GT (which is still installed) does a better job playing the same files. Something is seriously wrong.

I highly doubt, if given good drivers, that the 2600XT could beat the 8800.
 
Is this because of the card itself or because of driver issues that could be resolved later?

Note that the 8800GT has had playback issues for many users, including myself (I just bought one). I mean, at present, I lose frames and get audio sync problems watching QT movies. My 7300GT (which is still installed) does a better job playing the same files. Something is seriously wrong.

I highly doubt, if given good drivers, that the 2600XT could beat the 8800.

It has to do with drivers and architecture.

While the Geforce 8800 GT has 112 Stream Processor Units, the Radeon HD 3800 series has 320 Stream Processor Units. However, not all of these 320 Stream Processor Units on the Radeon HD 3800 series are equal. Every fifth ALU in a Stream Processor Unit can handle special functions and extra integer processing ops.

To make it short, the architecture behind the R600/RV670 makes it much more effective at general-purpose computation on GPUs (GPGPU).

For comparison, the Radeon HD 2600 has 120 Stream Processor Units.
 
It has to do with drivers and architecture.

While the Geforce 8800 GT has 112 Stream Processor Units, the Radeon HD 3800 series has 320 Stream Processor Units. However, not all of these 320 Stream Processor Units on the Radeon HD 3800 series are equal. Every fifth ALU in a Stream Processor Unit can handle special functions and extra integer processing ops.

To make it short, the architecture behind the R600/RV670 makes it much more effective at general-purpose computation on GPUs (GPGPU).

For comparison, the Radeon HD 2600 has 120 Stream Processor Units.


That's not correct. You CANNOT compare shader processors across companies like that. The RV670 and G92 are COMPLETELY DIFFERENT architectures. From a pure horsepower perspective, the 8800GT is definitely a faster card than the 3850/70 when it comes to DirectX and OpenGL in Windows. The difference on the Mac is the drivers. As many have said in this same forum, ATI drivers seem to be much better optimized for the Core Image/Animation APIs than nVidia's drivers. Hence older ATI cards such as the X1900/X2900, etc., appear to run OS X's interface and Apple "Pro apps" (those using Core Image/Animation) better than more powerful nVidia cards such as the 8800GT.
Now, of course, it depends on how someone defines "Pro Apps". For 3D modeling applications on OS X that use OpenGL extensively, I would assume the more powerful 8800GT takes the cake against any of the Mac ATI cards, just as it does in OpenGL apps on Windows. I don't have a link to benchmarks at the moment, but I'm searching.
 
That's not correct. You CANNOT compare shader processors across companies like that. The RV670 and G92 are COMPLETELY DIFFERENT architectures. From a pure horsepower perspective, the 8800GT is definitely a faster card than the 3850/70 when it comes to DirectX and OpenGL in Windows. The difference on the Mac is the drivers. As many have said in this same forum, ATI drivers seem to be much better optimized for the Core Image/Animation APIs than nVidia's drivers. Hence older ATI cards such as the X1900/X2900, etc., appear to run OS X's interface and Apple "Pro apps" (those using Core Image/Animation) better than more powerful nVidia cards such as the 8800GT.
Now, of course, it depends on how someone defines "Pro Apps". For 3D modeling applications on OS X that use OpenGL extensively, I would assume the more powerful 8800GT takes the cake against any of the Mac ATI cards, just as it does in OpenGL apps on Windows. I don't have a link to benchmarks at the moment, but I'm searching.

That's exactly why I am stating they ARE in fact different architectures. I didn't feel like getting too technical with how many Vec3 ADD/MUL/MADD and scalar ADD/MUL/MADD calculations the chips are capable of per clock.

I am just stating that for general-purpose computation on GPUs, the 5 MADDs x 64 = 320 Stream Processor Units of the Radeon HD 3800 series are better suited than the Geforce 8800 GT's.
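To put rough numbers on why raw unit counts don't compare directly across vendors, here is a back-of-the-envelope peak-throughput sketch. The clocks and per-clock operations are the commonly quoted figures as I recall them (8800 GT shaders around 1.5 GHz counting only the MADD, HD 3870 around 775 MHz), so treat them as assumptions rather than official specs:

```swift
// Peak programmable-shader throughput, counting one MADD (2 flops) per ALU per clock.
struct GPU {
    let name: String
    let alus: Double          // scalar ALUs (NVIDIA) or VLIW lanes (ATI: 64 units x 5)
    let shaderClockGHz: Double
}

let g92   = GPU(name: "GeForce 8800 GT", alus: 112, shaderClockGHz: 1.5)
let rv670 = GPU(name: "Radeon HD 3870",  alus: 320, shaderClockGHz: 0.775)

for gpu in [g92, rv670] {
    let gflops = gpu.alus * gpu.shaderClockGHz * 2.0   // MADD = multiply + add
    print("\(gpu.name): ~\(Int(gflops)) GFLOPS peak")
}
// ~336 GFLOPS vs ~496 GFLOPS on paper. Real-world results hinge on how well the
// compiler/driver keeps ATI's 5-wide units filled, which is exactly the point above.
```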
 
OK, since I do run Solidworks on my Mac Pro, which should I upgrade to?


Radeon 3870 or 8800GT?


Please, no "build a Windows box" answers. I do have a Windows box, but I use my Mac Pro for work.


Solidworks seems to like OpenGL a lot.


I do not do anything on SW that is too intense, so a simple "this card" or "that card" will suffice.


Just want to see what other CAD folks will be upgrading to.


TIA
 

OK, since I do run Solidworks on my Mac Pro, which should I upgrade to?
Radeon 3870 or 8800GT? Solidworks seems to like OpenGL a lot.
I do not do anything on SW that is too intense, so a simple "this card" or "that card" will suffice. Just want to see what other CAD folks will be upgrading to.
TIA

Well, let me preface this by saying I am not an expert, nor do I have direct experience with both cards in OpenGL apps, on the Mac or otherwise. I just did some basic searching around.

It was actually difficult to find ANY information on this topic, because nearly all of the discussion of graphics cards in the context of OpenGL and CAD/CGI apps is invariably about the professional workstation cards, a la ATI FireGL and nVidia Quadro.

After searching around, it becomes obvious that for [current] workstation cards, nVidia's Quadro is faster than the equivalent ATI FireGL in professional OpenGL apps. However, most likely due to nVidia crippling their Geforce OpenGL drivers to upsell you to Quadro, the Geforce line (including the G92/8800GT) appears to be slower than the newest ATI Radeon 38xx series. I am actually somewhat shocked by this, given the domination of nVidia's G92 (8800 series) over ATI's 3850/3870 in DirectX gaming performance and the industry-standard 3DMark benchmark.

Since many of OS X's APIs are built on (I assume) OpenGL, this might explain why there have always been anecdotal reports (and benchmarks somewhere, I'm sure) of ATI cards dramatically outperforming more expensive nVidia cards in applications utilizing Core Image/Animation, and even the OS X desktop feeling more fluid and responsive.

Unfortunately, I didn't write down all the references I found, but I found this one in my history:

Benchmark: SPEC viewperf (OpenGL CAD and 3D graphics apps benchmark)
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2168891&enterthread=y
Results: ATI 3870 2x-2.5x as fast as nVidia 8800GT in OpenGL rendering tasks
 
Looks like the 3870 will be my future card. I use Aperture, and I would rather take advantage of more Core Image performance vs. games. I do play games on occasion, but Core Image performance is my primary objective for an upgrade.
 
Well, let me preface this by saying I am not an expert, nor do I have direct experience with both cards in OpenGL apps, on the Mac or otherwise. I just did some basic searching around.

It was actually difficult to find ANY information on this topic, because nearly all of the discussion of graphics cards in the context of OpenGL and CAD/CGI apps is invariably about the professional workstation cards, a la ATI FireGL and nVidia Quadro.

After searching around, it becomes obvious that for [current] workstation cards, nVidia's Quadro is faster than the equivalent ATI FireGL in professional OpenGL apps. However, most likely due to nVidia crippling their Geforce OpenGL drivers to upsell you to Quadro, the Geforce line (including the G92/8800GT) appears to be slower than the newest ATI Radeon 38xx series. I am actually somewhat shocked by this, given the domination of nVidia's G92 (8800 series) over ATI's 3850/3870 in DirectX gaming performance and the industry-standard 3DMark benchmark.

Since many of OS X's APIs are built on (I assume) OpenGL, this might explain why there have always been anecdotal reports (and benchmarks somewhere, I'm sure) of ATI cards dramatically outperforming more expensive nVidia cards in applications utilizing Core Image/Animation, and even the OS X desktop feeling more fluid and responsive.

Unfortunately, I didn't write down all the references I found, but I found this one in my history:

Benchmark: SPEC viewperf (OpenGL CAD and 3D graphics apps benchmark)
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2168891&enterthread=y
Results: ATI 3870 2x-2.5x as fast as nVidia 8800GT in OpenGL rendering tasks



winterspan, I cannot thank you enough for your perfect response.


Sometimes things can get out of hand when dealing with card X vs. card Y debates, etc. You answered my questions to the best of your knowledge and I really do appreciate the help.


With that said, I believe I will hold out for the Radeon.


I do not do enough SW to justify the expense of a Quadro or FireGL. If that day comes, business will be good, and I will then use a dedicated Windows box for CAD/CAM work.


For now, ATI it is :)


Now the wait begins!
 
Well, let me preface this by saying I am not an expert, nor do I have direct experience with both cards in OpenGL apps, on the Mac or otherwise. I just did some basic searching around.

It was actually difficult to find ANY information on this topic, because nearly all of the discussion of graphics cards in the context of OpenGL and CAD/CGI apps is invariably about the professional workstation cards, a la ATI FireGL and nVidia Quadro.

After searching around, it becomes obvious that for [current] workstation cards, nVidia's Quadro is faster than the equivalent ATI FireGL in professional OpenGL apps. However, most likely due to nVidia crippling their Geforce OpenGL drivers to upsell you to Quadro, the Geforce line (including the G92/8800GT) appears to be slower than the newest ATI Radeon 38xx series. I am actually somewhat shocked by this, given the domination of nVidia's G92 (8800 series) over ATI's 3850/3870 in DirectX gaming performance and the industry-standard 3DMark benchmark.

Since many of OS X's APIs are built on (I assume) OpenGL, this might explain why there have always been anecdotal reports (and benchmarks somewhere, I'm sure) of ATI cards dramatically outperforming more expensive nVidia cards in applications utilizing Core Image/Animation, and even the OS X desktop feeling more fluid and responsive.

Unfortunately, I didn't write down all the references I found, but I found this one in my history:

Benchmark: SPEC viewperf (OpenGL CAD and 3D graphics apps benchmark)
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2168891&enterthread=y
Results: ATI 3870 2x-2.5x as fast as nVidia 8800GT in OpenGL rendering tasks

That's not entirely correct, though.

The FireGL V7700 beats every Quadro card made so far in professional applications, while being cheaper.
 
That's not entirely correct, though.

The FireGL V7700 beats every Quadro card made so far in professional applications, while being cheaper.

In Maya, yes. In 3D Studio Max, it's about 30% slower. For Mac users, a non-issue, though. I wish Apple offered an option for the FireGL cards on the Mac Pro. It's hard to believe that after all this time of Intel Mac Pros, there's only one professional video card option for it, the Quadro 5600. Though I'm not surprised Apple only offers the most expensive card. :rolleyes:
 
In Maya, yes. In 3D Studio Max, it's about 30% slower. For Mac users, a non-issue, though. I wish Apple offered an option for the FireGL cards on the Mac Pro. It's hard to believe that after all this time of Intel Mac Pros, there's only one professional video card option for it, the Quadro 5600. Though I'm not surprised Apple only offers the most expensive card. :rolleyes:

Weird; reading the only review I can find of the FireGL V7700, it appears to beat everything else.
 
Very impressive. Hadn't seen that. Thanks for the link. Off to the Apple Store to see what that adds to the cost of a Power Mac I'll be getting next month. Oh, wait... :mad:

It retails for $1,099, but hopefully the plain-Jane Radeon HD 3870 will mimic some of that. After all, the FireGL V7700 is just a rebadged Radeon HD 3870.

The bad news seems to be that the RV770, the successor to the RV670 (Radeon HD 3800 series), is getting released this month (May 2008).
 
It retails for $1,099, but hopefully the plain-Jane Radeon HD 3870 will mimic some of that. After all, the FireGL V7700 is just a rebadged Radeon HD 3870.

The bad news seems to be that the RV770, the successor to the RV670 (Radeon HD 3800 series), is getting released this month (May 2008).

Yeah, the 4000 series. That's why the retention of the two-generation-old 2400 and 2600 cards in the recently "updated" :rolleyes: iMacs is so sad, and why I will be getting a Mac Pro that at least has some upgradability. Maybe by the time the Nehalem machines arrive, we will have access to the entire Quadro and FireGL lines and not just the one $2,500 Quadro 5600. I can't say that I'm very hopeful, though, considering the recent 8800 fiasco and its poor performance in pro apps compared to the two- and three-year-old ATI cards.
 
And what about HDMI? Is the ATI card going to include that? It's really important for HD output. I'm about to buy my Mac Pro and I want to know.

Thanks a lot!

Diego
 
And what about HDMI? Is the ATI card going to include that? It's really important for HD output. I'm about to buy my Mac Pro and I want to know.

Thanks a lot!

Diego

Isn't it only vital for protected content?

I mean, DVI is already digital and pretty much the same as HDMI, just without sound.
 