Tried installing my "new" GTX 570 in my Mac Pro 1,1.

The trouble is that the screws holding the heat sink on the card stick out too far at the back for the card to fit in the bottom slot, and when it is put in slot 2, the card can't be configured to use 16 lanes as far as I can tell.

Is that right? Is there a non-destructive fix I can apply to fit this card in the bottom slot? Can the upper slot be configured for 16 lanes?
 
Ok, thank you for this fast response.

You're right on one point: I can't run my MP at 100% under various loads for now, so I don't have a clue about future kernel panics.
As soon as I have time (right now my GTX is in my i7 3770K), I will re-use it for Mac gaming / video editing and encoding.
So far I haven't had any kernel panics. QE/CI works very well (it's fluid and translucent) over dual DVI with my Cinema HD, and Cinebench shows a pretty good score (not really optimized either).

My 4890 was Mac-flashed via netkas, and his QE/CI patch doesn't work very well on 10.7.


The main problem is that, nowadays, it's obvious our Mac Pro 1,1s are dying: no more OS X support, and no more components.
Don't get me wrong, I know this is not "true official" Nvidia support, but I know there's a good number of users with old and dying GPUs.

My X1900 is dead (1 year).
My 8800 GT is dead (2 years).
My 4890 is dying (2 years).
I won't buy a 5770 at a freaking 300 bucks for my Mac, and the PC version isn't sold anymore.
Be realistic: ATI 6xxx and 7xxx are not compatible with the Mac Pro 1,1.
Old or low-end (640 and below) Nvidia cards are compatible, but either hard to find and expensive (5xx) or not good for gaming.
I know the 670 is overkill for a MP, but at least it's new, cool, quiet (Asus) and compatible.

At least you could put an asterisk in your OP for guys who want to take their chance, because in my case I didn't have time to wait (I needed a working EFI card to install 10.7).
As I wanted a gaming GPU, I turned toward ATI after reading that post. Awful mistake: they won't do 32-bit kexts anymore.
 
Tried installing my "new" GTX 570 in my Mac Pro 1,1.

The trouble is that the screws holding the heat sink on the card stick out too far at the back for the card to fit in the bottom slot, and when it is put in slot 2, the card can't be configured to use 16 lanes as far as I can tell.

Is that right? Is there a non-destructive fix I can apply to fit this card in the bottom slot? Can the upper slot be configured for 16 lanes?

Had an XFX 570 with the same issue.

Removed the 4 screwcaps on the poles, then lifted up the heatsink with the fans.

Removed the 4 poles from the heatsink.

Then I applied new thermal paste (Arctic Silver) to the previously cleaned GPU, put the heatsink back and aligned it. I then used standard 3 mm screws (15 mm long, if I remember correctly; you have to check yourself), inserted from the backside of the card and screwed directly into the existing holes in the heatsink, without using any poles. You have to be careful not to overtighten the screws or you might crack the board. You could of course use custom spacer tubes if you feel safer. Put small spring washers under the screw heads to compensate for thermal expansion of the heatsink.

I didn't reattach the square metal frame on the backside of the card, as it has 2.5 mm holes and I would have had to widen them to fit the 3 mm screws.

The assembly works perfectly in slot 1 and is reversible, though it requires you to reapply thermal paste.
 
Had an XFX 570 with the same issue.(...)

Thanks a lot for your detailed feedback. I figured out that only the bottom slot has 16 electrical lanes, even though all the slots are full length.
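
If anyone wants to double-check what link width a card actually gets in a given slot, here is a minimal sketch, assuming Python is installed and that system_profiler's PCI section lists "Slot", "Link Width" and "Link Speed" fields (the field names may differ between OS X versions):

Code:
import subprocess

# Dump the PCI section and keep only the lines relevant to slot and lane width.
# Assumes the SPPCIDataType output uses "Slot" / "Link Width" / "Link Speed".
out = subprocess.check_output(["system_profiler", "SPPCIDataType"],
                              universal_newlines=True)
for line in out.splitlines():
    s = line.strip()
    # Card names end with ":"; the three fields below carry the slot/lane info.
    if s.endswith(":") or s.startswith(("Slot:", "Link Width:", "Link Speed:")):
        print(line)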

I'll follow your guide when I have some more time. For the time being, I'm back with my good old HD 3870.

BTW, I found the GTX 570 fans a bit loud. Could I re-use the Zalman VF1000 that I fitted to my HD 3870?
 
EVGA GeForce 660 Ti

I can confirm that the EVGA GeForce 660 Ti with 2GB GDDR5 works. I followed the instructions in the first post. I can also confirm that you need to install the NVIDIA CUDA 5.0 for Mac drivers. I followed the link in the original post for the 10.8.2 drivers and was pleasantly surprised when it said that it now supports 10.8.3 with a new release, version 5.0.45. SCORE.

The card was plug and play. I had to order an additional power cord (again, look at the first post).


I am running an Early 2008 Mac Pro 3,1, Quad-Core Intel Xeon 2.8 GHz.
 
BTW, I found the GTX 570 fans a bit loud. Could I re-use the Zalman VF1000 that I fitted to my HD 3870?

Sounds like you have experience replacing GPU coolers, so my mod for the screws should be no problem for you. It probably makes sense to fit a quieter 3rd-party cooler on your 570 once you have removed the stock one anyway. Can't comment on your Zalman though.

I found the dual-fan cooler on my XFX 570 not to be loud. But the 570 draws a lot of power even when idle, so the Mac's expansion slot fan ran a bit faster and was noticeable even at idle. So make sure it is the GPU fan that is loud, and not the expansion slot fan, before you invest money in a 3rd-party cooler.

Currently I have an AMD 7950 in the Mac Pro, which runs a bit cooler and quieter.
 
Sounds like you have experience replacing GPU coolers, so my mod for the screws should be no problem for you.

Turns out it will not be as easy as that: the poles appear to be soldered to the heatsink. Maybe with shorter screw caps... or a third-party sink? With regard to the VF1000, it has a different power connector and only one fan, so it may not be suited to the GTX. I wonder if I shouldn't just resell the card and find something easier to fit.
 
Turns out it will not be as easy as that: the poles appear to be soldered to the heatsink. Maybe with shorter screw caps...

Soldered to the heatsink, sure?

Using just shorter screw caps was my first attempt as well. Actually, instead of screw caps I got 2.5 mm nuts. On my card the threads of the poles on the backside were still too long for the card to fit into PCIe slot 1, so I would have had to shorten the threads of the 4 poles.

That would have worked but I found it easier to just replace the poles with 3 mm screws.

What model is your GTX 570?
 
Soldered to the heatsink, sure?

(...)

What model is your GTX 570?

The poles have a hexagonal section, but appear to be soldered or at least glued onto the heatsink. I can't see any gap between the base of the poles and the sink, more of a rounded junction. And I tried unscrewing them (without too much force) without luck.

It is a Point-Of-View GTX 570 (1280MB), model VGA-570-A3-1280.
 
It is a Point-Of-View GTX 570 (1280MB), model VGA-570-A3-1280.

Mine is a Point-of-View VGA-570-A2-2560. The heatsink looks identical to your model's. The poles have a hexagonal section with threads on both ends: 2.5 mm on the card side, 3 mm on the heatsink side. They are screwed directly into the aluminum of the heatsink, not soldered or glued.

http://picpaste.com/Foto-BceWCaRQ.JPG

Sorry for the blurred image. It was a quick take with my iPhone.

I think it is the same on your card, but obviously I can't guarantee it.

As yours is a 1.3 GB card, you might sell it and get another 570 from another brand (e.g. EVGA). 1.3 GB cards are plentiful on eBay, while 2.5 GB cards are almost impossible to get.

That said, the cooler on the POV card is very good and the card is well built. The only downside, other than the big screw caps, is that the card is 2.5 slots high, so you lose PCIe slot 2 in a Mac Pro.
 
"16) My GPU has more than 2GB of RAM and OpenCL isn't working. What do I do?

Update: As of 10.8.3, this workaround is no longer needed and you should not attempt to modify the framework binaries."

Hey everyone.
I bought a GTX 580 3GB from Macvidcards. It works great and hasn't blown up my computer, because he worked his magic and made it so it draws less power. :D

(Note: I may be mixing up OpenGL and OpenCL; I'm not sure which one relates to AE and what's listed in the quote above.)

I finally got around to making sure CUDA and OpenCL work on 10.8.3 on my MP 5,1 (crazy deadlines at work, and this 580 is in my home setup). CUDA works great with the newest driver from NVIDIA once the card is added to the APP and AE text files. But when I'm in AE and look at the GPU information window under the preview settings, I'm only getting 2 GB of memory for OpenGL there. CUDA reads 3 GB fine. Just wondering if I do in fact need to do the OpenCL hack to get this working with the full 3 GB of RAM?

Texture memory is set to 818 MB by default; should I set this higher?

OpenGL info from AE:
Device: NVIDIA GeForce GTX 580 Open GL Engine
Version: 2.1 NVIDIA - 8.10.44 304.10.65f03 (this is the newest driver *5.0.45)
Total Memory: 2.00 GB
ShaderModel *Grayed out*

CUDA info from AE:
Driver Version: 5.0
Devices: 1 (GeForce GTX 580)
Current Usable Memory: 2.45 GB (at application launch)
Maximum Usable Memory: 3.00GB

Thanks in advance!
 
^No, I don't think you need the OpenCL hack. It seems that's just the nature of OpenCL:
mitch_de on insanelymac said:
Unlike games, where too little VRAM produces small freezes (fps drops) due to VRAM swapping, OpenCL needs enough free and unfragmented (one block!) onboard (VRAM) memory, which can't be swapped the way OpenGL memory can. OpenGL can never run out of VRAM; it just loses a lot of speed / stalls.
One more thing: the memory OpenCL uses must be in one block, so not all of the memory available to OpenGL is also available to OpenCL.
For example, on a 512 MB card, just running the OS X desktop (which uses little VRAM) already takes about 50 MB. You will not have the rest, 462 MB (512 - 50), available for OpenCL tasks. It's less; I think it's fixed to some 32/64 MB "slots", like 384 MB or 328 MB, and/or reduced by some VRAM fragmentation.
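
If you want to see those two numbers (total VRAM vs. the largest single block OpenCL will hand out) for your own card, here is a minimal sketch, assuming the third-party pyopencl package is installed; the property names below are pyopencl's, not anything specific to this thread:

Code:
import pyopencl as cl  # third-party package: pip install pyopencl

for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.GPU):
        total = dev.global_mem_size         # VRAM visible to OpenCL
        max_alloc = dev.max_mem_alloc_size  # largest single buffer ("one block")
        print("%s: %d MB total, %d MB max single allocation"
              % (dev.name, total // 2**20, max_alloc // 2**20))

On most drivers the maximum single allocation is only a fraction of the total (often around a quarter), which matches the "one block" limit described above.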
 
Some programs only report 2 GB max (Photoshop CS6 and Steam), but System Profiler shows 3 GB.
I think it's just cosmetic and wouldn't worry about it.
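
If you want to sanity-check what the CUDA driver itself reports outside of any app, here is a minimal sketch, assuming the third-party pycuda package is installed:

Code:
import pycuda.driver as cuda  # third-party package: pip install pycuda

cuda.init()
dev = cuda.Device(0)                    # first CUDA-capable card
ctx = dev.make_context()
try:
    free, total = cuda.mem_get_info()   # bytes currently free / total on the card
    print("%s: %.2f GB free of %.2f GB total"
          % (dev.name(), free / float(2**30), total / float(2**30)))
finally:
    ctx.pop()

The total figure should match what System Profiler shows; the free figure presumably corresponds to what AE calls "Current Usable Memory".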
 
CUDA 5.0.58 now available

Hi all,

I checked the CUDA pref pane and it says CUDA 5.0.58 now available. Anyone updated yet?? :)
 
Hi all,

I checked the CUDA pref pane and it says CUDA 5.0.58 now available. Anyone updated yet?? :)

Tried to, but the standalone update is not on Nvidia's site, and when I try to update from the preference pane I get a message saying the update failed.

Lou
 
I found a drop in the Luxmark scores between NVIDIA 10.8.2 and Apple 10.8.3 drivers as well for my 570.

But the drop is nowhere near as drastic as the one you see on your 680.

Apple really screwed up the OpenCL performance of the Kepler cards in their 10.8.3 driver. :(

Hopefully a new NVIDIA driver will fix the issue. It seems to be taking a bit longer this time, though ... :confused:

Till then the Fermi cards are probably still the better choice for GPGPU.

So Asgorath should wait a bit more before he updates the initial post.

Have you seen the EVGA GTX 680 Mac Edition benchmarks from barefeats?

No surprises, it leads in gaming tests.
But it beats the AMD 7950 in the Motion 5 test by 80% !?!

IMHO all the Apple Pro Apps use OpenGL and not CUDA, and according to the Luxmark benchmark the 79xx Radeons run circles around the Kepler cards in OpenCL performance.

The 7950 is even slower than the old 4870 in the Motion 5 benchmark!

I wonder whether something is wrong with the 7950, Apple's Radeon driver, or barefeats' Motion 5 benchmark.


Furthermore, according to this post http://www.tonymacx86.com/343-os-x-10-8-3-nvidia-6xx-opencl-benchmarks.html, the lower Luxmark scores for Kepler cards in 10.8.3 simply reflect the real performance of those cards and are in line with the scores on Windows and Linux, while the scores in 10.8.2 were much too high.

That would back up Asgorath's initial statement that Fermi cards perform better in GPGPU usage than Kepler cards.

So all would be right with the world again, if it weren't for these disturbing benchmark results from barefeats. ;)

It's still difficult to answer the question of whether a GTX 570, a GTX 680 or a Radeon 7950 is currently the best choice for GPGPU usage.
 
Tried to, but the standalone update is not on Nvidia's site, and when I try to update from the preference pane I get a message saying the update failed.

Lou

Now the CUDA preference pane says "No new CUDA driver available". I have a feeling that 5.0.58 will be available when 10.8.4 is released, and further that no new Nvidia-sourced drivers will be available for either 10.8.3 or 10.8.4.

Lou
 
Hey Guys,

I'm looking to get an EVGA 650 Ti Boost for my Mac Pro. Before I order it, I need to get the proper cables off the internet first. For the EVGA card, there is this picture of the power supply:
[Image: E145-0659_vgallery06_sdg_gl_7876676.jpg]


Now I want to know: do I still need to get the cables suggested in the FAQ (1 or 2), or do I need to get one that matches the above photo, like this?
 
^^^^For a Mac Pro you need power cables that look like this.

Lou
 

Attachments

  • [photo of the power cable referenced above]
PNY GTX 680 confirmed working with EVGA Mac BIOS

We can add another candidate for flashing with the EVGA BIOS: the PNY GTX 680 (PNY VCGGTX680XPB). Everything works at 5.0 GT/s in OS X, according to CUDA-Z (PCIe 1.1 in Boot Camp, according to GPU-Z), and with boot screens.

:D
 

Attachments

  • Screen Shot 2013-05-01 at 14.43.12.png
  • Screen Shot 2013-05-01 at 14.43.59.png
  • Screen Shot 2013-05-01 at 14.45.32.png
The CUDA driver 5.0.58, while not on Nvidia's site, is available here:

http://forum.blackmagicdesign.com/viewtopic.php?f=11&t=6190&start=20

I just downloaded and installed.

Lou
 
Almost six weeks and no new nVidia web drivers. Getting a bad feeling about this.
 
Almost six weeks and no new nVidia web drivers. Getting a bad feeling about this.

I've been anxiously waiting too, as I still haven't been able to resolve my sleep issues with the 660 Ti. Maybe a reinstall will work, but it's a hassle and there's no promise it will actually solve the issue.

I'd rather not leave the tower on 24/7 during Florida summers.
 