The most important temperature will be shown by the "GPU_Monitor_AMD" script, located in the same folder as HWMonitor. That temperature should be shown by HWMonitor as well, might be the one named "GPU Die" in your case. The GPU itself is specified for temperatures up to 90°C under full load.
In case your screenshot was taken when you haven't been doing any GPU intensive things for let's say 10 minutes, the difference between "GPU Die" and "GPU heatsink" seems too high (should be near to 0° difference then) - that would indicate a bad heatsink installation (this IS difficult to get right with the WX41x0s).

I had just run the Unigine Valley benchmark. I'll have a look once I'm back and leave the computer on for a while to check the temperatures, thank you.

I'll also try to find out which exact setting shuts off my computer. There aren't that many differences between the plist files. It might take a while, but I'll get to it.
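One way to hunt for the offending setting is to normalise both plists to XML and diff them. A minimal sketch, where the file names are placeholders for your working and crashing plists (`plutil` ships with macOS):

```shell
# Hedged sketch: convert both plists to XML and diff them, so the differing
# keys stand out even if the files are stored in binary form.
A="working.plist"     # placeholder: your known-good plist
B="crashing.plist"    # placeholder: the plist that causes the shutdown
if command -v plutil >/dev/null 2>&1 && [ -f "$A" ] && [ -f "$B" ]; then
    plutil -convert xml1 -o /tmp/a.xml "$A"
    plutil -convert xml1 -o /tmp/b.xml "$B"
    diff -u /tmp/a.xml /tmp/b.xml     # differing keys appear as +/- lines
    RESULT="diffed"
else
    RESULT="skipped: plutil or input files missing"
fi
echo "$RESULT"
```

Bisecting from there (copy the working plist, carry over one differing key at a time) should isolate the setting that triggers the shutdown.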
 

They were briefly 34 and 32, but after a minute or so (no special usage other than mostly Safari), I see 43 (Die) vs 34 (Heatsink). I imagine the heatsink reading is the one "attached" to the heatsink? I'll probably need to research a little how to make sure it reads the value correctly.

Edit: Another minute and I see 38 and 37. Not sure what I should expect. I'll see if I can get the data out of those histograms; otherwise, what can I use to retrieve readings over a period of time?
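For retrieving readings over a period of time, a small logging loop is one option. A hedged sketch, assuming a `GPU_Monitor_AMD` command that prints a temperature (adjust the name, path, and parsing to whatever your copy actually outputs):

```shell
# Hedged sketch: sample the monitor script on an interval and append
# timestamped lines to a CSV you can graph later. "GPU_Monitor_AMD" and its
# output format are assumptions about the script mentioned above.
LOG="/tmp/gpu_temps.csv"
SAMPLES=3        # raise this (or loop forever) for a real session
INTERVAL=1       # seconds between samples; 10-30 s is plenty in practice
: > "$LOG"       # start a fresh log
i=0
while [ "$i" -lt "$SAMPLES" ]; do
    ts=$(date "+%Y-%m-%d %H:%M:%S")
    if command -v GPU_Monitor_AMD >/dev/null 2>&1; then
        temp=$(GPU_Monitor_AMD)      # assumption: prints a temperature
    else
        temp="n/a"                   # monitor script not on PATH here
    fi
    printf '%s,%s\n' "$ts" "$temp" >> "$LOG"
    i=$((i + 1))
    sleep "$INTERVAL"
done
wc -l < "$LOG"
```

The resulting CSV opens directly in Numbers or can be plotted with any spreadsheet, which makes the Die-vs-Heatsink gap over time easy to see.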
 
In my current installation (using the low-performing K4 thermal paste and a 2 mm copper shim between GPU die and heatsink) I get a difference of 6° to 7° when scrolling continuously in Safari. I don't consider that a good installation.

Hmm, now I do remember that my first WX-card was a WX4130, too. And it did cause system resets, but only when doing light tasks and never while benchmarking. I returned that card to China and bought an original WX4150 spare part from the UK - and that one is working perfectly stable.
 
Read post #1, or ignore it, buy the card, try it out, and report back here!

I did and saw the notice about Dell branded cards, and since this is an HP branded card, I thought maybe someone has tried it before. Otherwise I'll buy it and report my findings.
 
The way I read #1, it sounded pretty certain that 'only the "Dell" branded versions currently work.' That was not the worst part. The WX4150 was rated 9/10, and the way I read that rating, the work was incomplete: you were expected to somehow finish it yourself, or wait for it to be finished, while some functionality might be missing.

I am very grateful for the information that has been put together. However, I have already lost my enthusiasm for this, to the point that not only will I not upgrade my HD 5750, I won't even replace it when it dies; I'll scrap the whole iMac instead. The risk is great and the work is difficult.

I welcome the great work being put into this, but to you and everyone else contemplating this mod, my question is: why? There are many, many turnkey solutions to your problem, be it playing games or fixing an ailing graphics card, such as upgrading the entire computer. How much is your time worth? Would you even save any money if you ran into a compatibility issue and could not return the card? Is it worth continuing to upgrade a 9-to-11-year-old computer when replacing the whole machine is so much easier?

This mod is great. It is just not for everybody, but only for the most committed.

For example, I figured that unless I go all the way up to a WX7100, this is not worth my time and effort. But at that price point, if all I am after is for my son to play games, why not just snatch up a used PS4 or Xbox from my local Craigslist for the same cost? If I am running non-linear video editing such as DaVinci Resolve on the machine, would I be happy with an i7-870 CPU and no USB 3.0? I would rather upgrade the whole thing, and if price is a concern, move to Windows. There is only a very narrow niche where you would want to modernize only the GPU but not the rest of the machine.
 
Just to make it clear:

Running the regular AMD plist I get no crashes at all. I can run the benchmarks with no issues. I also let a game sit and nothing happened; the temperature did not rise to alarming levels.

What is happening is that another plist makes my computer crash and shut down as soon as I open the benchmark. I do not believe this is a temperature issue. The temperatures look reasonable enough, and having looked at the graphs over some time, the two readings are usually close enough to each other. I think it's just some erroneous configuration playing havoc (think blue screen).

If I do get a shutdown during the next week or so using the working plist file, then I'll reconsider. I haven't managed to find the shutdown error in Console, but maybe I'm looking in the wrong place.
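For digging the shutdown reason out of the logs, one option on macOS 10.12 and later is to query the unified log for the kernel's "Previous shutdown cause" entries. A sketch (the numeric codes are undocumented by Apple, so treat them as hints rather than a diagnosis):

```shell
# Hedged sketch: pull the kernel's "Previous shutdown cause" entries from
# the unified log. Negative codes are commonly hardware/power related;
# none of the codes are officially documented, so read them as hints.
if [ "$(uname)" = "Darwin" ] && command -v log >/dev/null 2>&1; then
    CAUSES=$(log show --last 2d \
        --predicate 'eventMessage CONTAINS "Previous shutdown cause"' \
        --style syslog)
else
    CAUSES="unified log not available on this system"
fi
printf '%s\n' "$CAUSES"
```

Running this right after a reboot, before the log rotates, gives the best chance of catching the entry for the crash in question.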
 
You might well be facing an insulation problem: as soon as the current drawn is high enough, it might produce a short circuit between the board and the heatsink.
I guess that with the working plist the power drawn by the graphics card stays rather low.
Scrolling in Safari makes my card draw 14 W, while it rises up to 41 W running Geekbench 5. What's the power draw of your card while running the Valley benchmark with the working plist?
 
** NVIDIA Geforce GTX880M Mac Edition ROM **
** NVIDIA Geforce GTX870M Mac Edition ROM **
** NVIDIA Geforce GTX860M Mac Edition ROM **

Genuine Native Boot Screen & Brightness Control




The following are the UGA equipped roms I put together for:

NVIDIA GeForce GTX 880M
N15E-GX-A2, MXM-B (3.0)
8GB VRAM

NVIDIA GeForce GTX 870M
N15E-GT-A2, MXM-B (3.0)
3GB VRAM

NVIDIA GeForce GTX 860M
N15P-GX-A1, MXM-B (3.0)
2GB VRAM

The GTX880M is the first 8 GB VRAM card working on our machines! Now I have as much VRAM as system RAM, and I'm happy to report that macOS sees all of it. It boosts fully, even at base clocks. When I bought this card, the rivets were very long and held too much of the GPU off the heatsink surface; it used to thermal-throttle because it sensed an overheating condition. Once I removed the rivets and used screws to secure it, I got a much tighter seal and the card boosted properly.

Tested on a 2011 iMac running High Sierra 10.13.6. Please feel free to test on other macOS versions; I will update this post as needed with successes/failures.

  • These ROMs do not require a third-party bootloader such as OpenCore.
  • They will require the AppleIntelPanelA / ApplePanels / F10Ta007 base brightness-stepping modification.
  • Framebuffer depth issues remain for now; they can be temporarily worked around by entering and leaving sleep.

As before, these ROMs should bring back:

⦁ Genuine native brightness control
⦁ True early-boot gray screen (stage 1 and 2 progress bar)
⦁ Original macOS boot loader compliance



** Update 9/4/2020 **
Note that 870M_6GB_UGA.rom is experimental and in testing, for anyone who is interested and has the card.


"Insanely great!"
-Steve Jobs


Thank you very much for your work! I wanted to share my impressions: on High Sierra the buffer problem was noticeable; on Catalina there are fewer such problems, and CS:GO runs on ultra, whereas on Sierra there were problems and everything hung. I am very much looking forward to a fix for the buffer issue, since it still exists; Windows 7 also had problems. I haven't installed Windows 10 yet; perhaps everything works much better there. I apologize for the translation.
 


Thanks: this is a strong possibility. I ran Unigine and Geekbench at the same time (my current mouse scrolling is broken), and the GPU rail sat at around 24-28 W. Idle, it goes back to 8-10 W.

I also ran the Geekbench Metal test on its own and, like the Unigine Valley benchmark, it scores considerably lower than the reported results: 14k vs the 19k of 2009 i5 and i7 27" iMacs.

The kexts however are the same, so I imagine some other configuration is capping the graphics card somewhere.

Where should I look to next?
 
Hello, I replaced the video card (6970) in my 2011 iMac (16 GB RAM, i5 3.1 GHz) with an Nvidia Quadro K2100M, following the advice of the experts on this forum. I used @nick [D] vB's BIOS and OpenCore. Everything seems to work except that the K2100's memory information shows 1 GB of RAM, and the iMac's temperature also seems excessive to me. I hope some kind soul will help me.
 
There are several video-memory versions of the K2100M: the Samsung and Hynix versions correctly identify 2 GB of video memory, while the Elpida version can only show 1 GB.
 
Good afternoon. I have a 21.5" iMac 2011, and thanks to my brother the sad stock GPU was upgraded with an Nvidia GeForce 770M graphics card. He removed the DVD drive to reduce power and heat, and only an SSD is connected. Windows 10 runs perfectly without any issues; the brightness toggle and night mode surprisingly work. It currently has the stock BIOS, and if I flash the one provided here, I lose the brightness toggle and night mode on Windows 10. It has the Nvidia drivers installed from their website. The brightness toggle doesn't work on macOS. I have installed Mojave and Catalina, but I get a black display with the stock BIOS; I only get a display when the BIOS provided in this forum is flashed. And it runs smoothly, with no overheating or rebooting; the GPU temperature stays around 50°C. One time my nephew was running Fortnite and temperatures reached 99°C; I was surprised it didn't reboot or fry. I won't be pushing the graphics card too hard, as I only use it for Photoshop and Illustrator. I run macOS externally with a SATA-to-USB adapter and it works well. I hope this information is helpful.


 
Hi, the card was sold as having 2 GB of RAM; I didn't know a 1 GB version also existed.
 
Does the "Dell only" guideline apply to _all_ MXM GPUs? Or can I use non-Dell NVIDIA (GTX and Quadro) GPUs?
 
I have reassembled my iMac re-installing the K1100M with the card's X-bracket and some insulating tape on the heat sink in addition to what was there.

The #3 LED lights on a boot up test run.

After installing the display screen and booting up again I ran into exactly the same problem as I had upthread.

I have concluded that the K1100M card I bought has some malfunction with respect to the video display. I will purchase another K1100M (or perhaps a K2100M; I'll have to do the power calculation once again). I get the "snow" effect on the display, but it is displaying the boot screen.

I was able to boot with "option" key down and get a choice of disks.

Bummer, but I have learned quite a lot in the process.
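The power calculation mentioned above is simple enough to sketch. All wattages below are illustrative assumptions, not datasheet values; substitute the budget of your iMac's original card and the candidates' real TDPs before trusting any verdict:

```python
# Illustrative power-budget check. Every wattage here is an assumption:
# replace ORIGINAL_GPU_W with the TDP of the card your iMac shipped with,
# and CANDIDATES_W with the datasheet TDPs of the cards you're considering.
ORIGINAL_GPU_W = 50                            # assumed budget of the stock card
CANDIDATES_W = {"K1100M": 45, "K2100M": 55}    # assumed module TDPs

def power_headroom(budget_w, candidates):
    """Map each candidate card to its margin; positive means within budget."""
    return {card: budget_w - tdp for card, tdp in candidates.items()}

headroom = power_headroom(ORIGINAL_GPU_W, CANDIDATES_W)
for card, margin in sorted(headroom.items()):
    verdict = "within budget" if margin >= 0 else "over budget"
    print(f"{card}: {margin:+d} W ({verdict})")
```

With these placeholder numbers the K1100M sits under the assumed budget and the K2100M slightly over it, which is exactly the kind of margin worth rechecking against real figures before buying.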
 
Part of the project was to make some measurements of the mid-2011 21.5" iMac heat sink. This is a summary drawing showing the height off the board:
21.5 inch heat sink.jpg


The image is flipped around the vertical axis and overlays the card. The dimensions are the distances towards the top of the card; the GPU contacts the central square.

This is shown on the overlay:
MXM-A model annotated.jpg

which indicates that this heat sink would fit any MXM-A standard card without modifications or additional insulation. The numbers in square brackets indicate the additional clearance beyond the MXM-A standard.

This also indicates that if you are using thermal pads for the memory the thickness is 0.4mm.

The solid models for the parts were built in FreeCAD:
MXM-A and heat sink tipped back.jpg


MXM-A and heat sink tipped front.jpg


MXM-A and heat sink side.jpg


The side view shows the memory-chip side; the heat sink clears the highest points as described in the standards document.

In particular, the NVIDIA MXM-A cards (K610M, K1000M, K1100M, K2000M, and K2100M) should fit this heat sink without any modifications (NVIDIA "invented" the standard).
 
Hi everybody,
my first post here, although I've read (almost) all 372 pages in this thread.
I have a 21.5 iMac 11,2 (mid 2010) with a faulty GPU, so I decided to replace it along with other upgrades.
Basically I replaced:
the CPU from i3 540 to i5 680
the RAM with 2x 8GB DDR3 10600
the GPU from HD4670 to Nvidia Quadro K1100
the HDD with an SSD (not done yet, wanted to have it up and running before a fresh install).
I took a lot of photos and could set up a wiki if needed.

Problem is: it seems I cannot install *any* operating system.
I tried High Sierra (the latest officially supported OS), Sierra, and El Capitan, but to no avail.
I get to the utility page, select to install on the freshly cleaned HDD, and the install starts but interrupts with 3 to 4 minutes left.
It then restarts and does the same again and again.
I made the bootable USB in various ways (downloaded the DMG file and used TransMac, used the dosdude utility from a running iMac with the terminal instructions, etc.) but ended up with the same result.
In the installer log the critical error is: "The Quartz framework's library couldn't be loaded".
Could anybody please point me in the right direction? Booting into the recovery partition seems to work OK; the screen is visible and the internet connection runs flawlessly.

could the issue be related to the K1100 GPU? which OS should be easier to install at first?

thanks in advance
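One more way to build the USB installer, independent of TransMac and the dosdude utility, is Apple's own createinstallmedia tool inside the installer app. A sketch shown as a dry run; the installer path and volume name are assumptions to adjust for your setup:

```shell
# Hedged sketch: build the install command for Apple's createinstallmedia.
# Paths assume the High Sierra installer app sits in /Applications and the
# USB stick is mounted as "MyUSB"; adjust both. Shown as a dry run (echo);
# remove the echo to actually run it. WARNING: it erases the target volume.
INSTALLER="/Applications/Install macOS High Sierra.app"
TARGET="/Volumes/MyUSB"
CMD="sudo \"$INSTALLER/Contents/Resources/createinstallmedia\" --volume \"$TARGET\""
echo "$CMD"
```

A stick made this way rules out the third-party tooling as the source of the "Quartz framework" failure, since the installer is written exactly as Apple packages it.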
 
First: did you flash the K1100M?
Did you try installing High Sierra directly from the internet (use Ethernet) with Cmd-Alt-R?
What kind of thermal compound did you use?
 
Thank you for your help.

I bought it off eBay, and the seller specified the VBIOS had been flashed with the iMac-compatible one (Nick [D]vB's). In any case the display shows the startup Apple logo, the progress bar, and all the menus without any issue.

I tried reinstalling via the internet, but the globe kept on turning for a few hours without any sign of change.

For the GPU chip I used Thermal Grizzly Kryonaut, plus the proper thick white paste for the memory chips (unfortunately only half of them, as on the K1100 the memory chips are also on the back of the card).
 