Getting back to the benchmarking question, a couple of observations.
First, the Heaven benchmark is only supported under Mountain Lion on OS X. I don't know how much difference this makes, but it could be significant. If you're comparing scores obtained under Lion, then perhaps it would be more appropriate for everyone to run it under Windows instead?
Secondly, when I ran it under Windows, I noticed that the scene detail is far superior to what's rendered in OS X. The cobbles and brickwork, and the ropes, conduits and screws on the gun carriages, are all dramatically more complex. That makes cross-OS comparisons very misleading, since the OS X version of the benchmark appears to be a comparatively easy workload.
My Mac Pro is a 4,1/5,1 hybrid with a 3.33 GHz hex, 24 GB of RAM and a GTX 680.
Here is the result with the Extreme preset (1600 x 900, windowed, 8x AA) under 10.8.3:
Image
Under Windows 7 64-bit, using the same preset with the OpenGL renderer, I get this:
Image
And with the DX11 renderer, here's the result:
Image
I would guess that part of the drop in performance is down to Windows 7 running flashed cards at PCIe 1.1 speeds. Even so, it's apparent from the frame rates that the Windows version of the benchmark is giving the graphics card a much tougher job. If that's the case, then differences in the hardware feeding data to the card should matter less than they do in the OS X benchmarks.
*edit* It's been pointed out that the lack of tessellation support in OS X's OpenGL implementation is the real reason for the difference in scene detail.
Are any of you guys with 2,1 and 3,1 Mac Pros able to check this out under Windows?