@PowerMac G4 MDD hey, is your card new or used? Did you flash it with a new BIOS? Under Windows, is it crashing with default Wattman settings?
Are you watching it in Windows with Wattman (or a similar app) to see why it crashes: heat, power, stability?

It would be really odd if it's a new card and crashing with no changes made to it.

@h9826790 have you modded the power limits of the card in a new way? Thanks for the GPU monitor thing :D I'll try it now.

Oh, and if anyone who has not used liquid metal before tries it, be very careful; it's not like normal thermal compounds.
It has much better heat transfer, but it is also conductive and corrodes some metals, so you have to make sure not to spill any. Make sure you follow the instructions and maybe watch a good video to see how it's done.
If you're new to it, a good branded non-conductive thermal paste is fairly safe (and may well be an upgrade over the factory paste); as long as you don't use too little, you're fine.

It will void the Sapphire warranty (at least in the UK/EU).

@calmasacow look at the first two pages of this topic; it has all you need. But also don't forget that not all cards are the same: my card has a low ASIC quality, so it doesn't work as well as some, and it was used to mine crypto, so I suspect the memory controller may be a tad sad from that :oops: or something.

For a high Luxmark score, just use the one-click memory timing button and you're done; that will lift your score.
For low fan noise, drop the core clock down to something close to 1300MHz.

Voltage seems to depend on ASIC quality: with better quality it looks like the card automatically uses lower voltage (I'm not 100% sure, but I think so, at least up to a point); low ASIC quality means higher voltage (that's my card).

Edit: thanks h98, that script is super easy compared to using Terminal, with the self-updating :D
 

My card was used (like new). I didn't do anything to it. In OS X, it's fine. In Windows, it would crash under high load (such as when playing games), so I researched the issue and came upon a video whose creator recommended different Wattman settings. Those settings, AFAIR, included voltage and memory adjustments. Now, no crashes in Windows.

I just applied some liquid metal to my RX580. Since I needed to test it anyway, I did that in Windows. No crash, even with OC. So, unless you know the temperature during the crash, I personally still treat that as the number 1 suspect.

Anyway, if you want to monitor the GPU parameters in macOS, you can use my attached Automator workflow. Simply open it and run it (play button); it will display the GPU parameters and refresh every 2 seconds.

Thanks! Is this something I just kill within Activity Monitor later on?
 

Automator is an independent application built into macOS; you can open and close it like any other normal application, no need to kill it inside Activity Monitor.
 
If it's a used card, it may be like the one I got: if it's been used for mining cryptocoins, it may have some really crazy BIOS on it.

The screenshot I posted on the first page (post 16, Sapphire PULSE RX580 8GB VBIOS Study)
shows the mining BIOS with crazy low power settings and really high memory speed.
I was told it crashed if you tried to play a game on it before I got it. After flashing the default BIOS it worked fine. I then moved to using @thunder72fr's BIOS, which has a slower core speed (but ends up faster overall), faster memory settings, and is patched for the Mac frame buffer. I'm much happier.

Now the card is faster (much faster for compute, i.e. a bigger Luxmark score), and with a slower core clock it's much cooler, so the fans stay much quieter, plus power use is down.

If you look at my post (25), you can see I played in Windows and it looks like voltage is linked to core speed (and maybe ASIC quality), so around 1300MHz I hit really low fan speed and power use. (AMD did it this way as there were lots of RMAs from people who bricked cards with stupid BIOS settings :rolleyes:)

For me, close to 1300MHz is super nice as the fans stay super slow/quiet, and with thunder72fr's changes to RAM speed I end up with the card working faster for compute, i.e. video work (to be fair, I can't really tell, I didn't do a lot of testing, but I'm super happy with the lower noise).

And at default settings the card got so hot under load that it thermal throttled so hard the core dropped down to ~1313MHz, so it really was a net win for me :D

Edit:
The only thing to mention is that I have the same Samsung memory as thunder72fr, so if you want, you can make the changes yourself to your BIOS or the default BIOS for the card.
And there is some risk to flashing too, so be safe.
 
Here are my original BIOS settings:
original.PNG
How the card is detected by GPU-Z:
Снимок.PNG

Here is how I modded it according to your High Performance mode with the 1-click timing patch:
modded after 1 click timing patch.PNG

By the way, why are there two VRAMs? Samsung and Hynix?
vrams.PNG

The card was flashed successfully. I tried to launch a game in Windows after flashing (Batman: Arkham Knight); Wattman showed this usage:
gaming.PNG
gaming.PNG


But after 10 min of playing, the PC crashed. ;-(

Back in macOS, I did a Luxmark test. I can't say that anything has changed: I got around 13000 earlier, and it is the same after flashing. I don't know why:
Снимок экрана 2018-11-01 в 17.10.29.jpg

Why does Luxmark show 300MHz? The system detects everything right (screenshot at idle):

Снимок экрана 2018-11-01 в 18.11.42.jpg


UPDATE: I discovered that Luxmark also shows 300MHz in thunder72fr's results here.

Maybe I missed something?
 
Well, I can see you have OC'ed your RAM. I mentioned in my posts that I was not able to OC the RAM; it gave lots of errors even if I went with just a 50MHz OC! You have added 250MHz, which is the max.
For me, the timings patch + default RAM speed worked OK, but any OC gave lots of problems.

Did you do tests in OS X/Windows and watch temps/power use, before and after?

There are 3 brands of VRAM for RX 580s; each one works at different speeds, so mileage varies, and in Windows you have to check what Wattman is doing, as it may override settings.

The best thing to do (like I did) is to play in Windows with Wattman and see what works and what does not, if you want to push for max speed/power.
Not all cards are equal.
I suspect Luxmark only represents a small number of apps that rely on fast memory speed.

If you're not used to OC'ing things, it may be best you stick to safer settings.
 
OK, I went back to the original BIOS to take some more benchmarks:

Screenshot 2018-11-01 at 18.47.57.jpg
I will try the original BIOS + timings patch, I think.
 

What did you want to do with the card?
Games, compute, etc.?
I just wanted something for video editing faster than my old GTX 770, and with more VRAM, as I'm working on 4K video and Resolve 15 will crash if your card's VRAM is too small on higher-res video.

I don't think any of the changes I made gave any real boost to video games, and taking my core clock down to 1300MHz, I'd bet I lost a tad of speed.

But for video I'm much happier. Even with the slight downclock it's a lot faster than my GTX 770; with 8GB VRAM it's not crashing all the time (in Windows I saw 6GB VRAM reported used when doing a test on my project), and with the lower core clock the GPU fans were much slower = less noise, which I love when working on video, as it was distracting.

I posted lots of info on what I saw with my card, so if you're new to OC'ing it's worth a look. How well it works depends on how much time you want to spend; I gave up after spending two days reading about it.

Also, you don't need to re-flash: you can just do some tests in Windows to see how it works at different memory speeds, core clocks, etc., while watching temps, power use, and so on.

Edit:
One tip: your GPU-Z screenshot shows what brand of VRAM your card has ;) Different brands work at different speeds. Use HWiNFO64 to look for memory errors, and watch power use and so on in Windows.
GPU-Z is very handy and can show ASIC quality as well as lots of other info.
Wattman will let you play with different settings.
I used Luxmark and Superposition Benchmark + Resolve with sample projects to check settings.
In OS X there are a few apps to watch power use; I used Hardware Monitor. Activity Monitor will show GPU use, and the Terminal script will give you info in OS X on temps/fan speed/GPU use, etc.

And in post 25 I talk about using cinematic mode in Superposition to really test the GPU for longer; I'm not a fan of Furmark.
 
Thank you for the reply. I use my card only for video editing (FCPX, Motion) and in Adobe products, only in macOS. The RX580 does not seem to boost my FCPX timeline usage (I described it here).
So I thought I'd try some BIOS tweaks.
 
You may just be limited by the CPU, video codecs, or RAM; I'd need more info. If you look at my screenshots of Resolve, you can see I'm showing CPU, power, RAM, etc.

If you can give more info, I'll see if I can help. I'm a tad rusty: PP CS6 I know a lot about, but CC PP I've not used; I'm OK with FCPX but don't have massive knowledge on optimization for it.

I mostly use Resolve now (and Resolve is GPU-reliant in a way PP CS6 and FCPX are not).
 

By the way, why are there two VRAMs? Samsung and Hynix?

Because there is more than one VRAM supplier. Sometimes the graphics card manufacturer will make a VBIOS that is universal across the different VRAM types, so no matter which VRAM is used, the VBIOS can automatically select the code to drive it.

Why does Luxmark show 300MHz?

A bug in Luxmark. Purely cosmetic. If your card could get this score at just a 300MHz core speed, that would be one highly efficient GPU.

---------------------------------------------

To me, it looks like you didn't flash the modded ROM at all. If your ROM were modded, then according to your screen capture the VRAM should run at 2250MHz. However, ioreg shows 2000MHz (the stock setting). Maybe you actually flashed the original unmodded ROM back in, not the modded one.

Your Wattman capture doesn't show whether you hand-tuned the GPU parameters or Wattman is really reading them from the ROM.

Same for GPU-Z: it reads the parameters from RAM, not the ROM. If you OC'ed the card via Wattman, GPU-Z will show you the software OC parameters, not the ROM's.

Anyway, run Unigine Heaven in Windows (Extreme preset), and use HWiNFO to check if there are any VRAM errors. If yes, reduce the VRAM clock speed. Not all cards can do 2250MHz without crashing.
 
Finally had some time to study the VBIOS's fan/temperature settings.

What I found is that all the settings actually work as expected in macOS.

What I tested were the following four parameters.
OC_high_fan.PNG


1) Target temperature - As expected, the cooler will try to keep the GPU at that temperature. However, it seems it will only hold it until the fan reaches "Med PWM", which by default is 40%. Once it reaches that, the temperature can still go up, but the fan speed will ramp up faster. I changed it to 65C, for no specific reason other than to give a 10C buffer below my own defined max temperature.

2) Max RPM - Default setting 2280RPM (60%). If you don't alter this number, the GPU will continue to heat up until "Max Temp", then thermal throttling kicks in, but the fan is still limited to 60%. In other words, performance will be lowered because the fan can't go beyond 2280RPM. I changed it to 3200; this is not a random number, but the real 100% fan RPM from AMD Wattman.

3) Acoustic Limit - Still not 100% sure what this precisely means. It should be something like the GPU clock above which the fans may exceed Min PWM. Anyway, the lower you set it, the earlier the higher fan speeds can kick in. I set it to 900MHz, because beyond that my card is no longer in the low power state.

4) Max Temperature - Once this point is reached, throttling kicks in; the GPU won't go significantly above this temperature (though momentary 1-2C excursions are still possible). The default is 84C. Sure, the RX580 can do that without any issue, but I set it to 75C. That's my personally preferred continuous max temperature for a GPU, and it's a reasonable number for testing purposes.

So, I modded the above 4 parameters, flashed the card, and tested the RX580 again in macOS. It ended up as expected. The graphics card's fans kick in earlier than before. Once it reaches 65C, the fans spin up more quickly, and they are able to go above 2280RPM. So, even in Furmark with a 37C system ambient, the GPU can still stabilise at 75C with only slight thermal throttling, and it won't continue to warm up to 85C (the factory setting).
Furmark high fan.jpg
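The interplay of these four settings can be sketched in code. This is a rough conceptual model of the behaviour described above (my own guess at the curve shape, NOT AMD's actual firmware algorithm), using the modded values from this post:

```python
# Conceptual sketch of the fan behaviour described in the post above:
# ramp gently up to Med PWM at the target temperature, then ramp steeply
# toward Max PWM as the GPU approaches the throttle point. The shape of
# the curve is a guess; only the parameter values come from the post.

TARGET_TEMP_C = 65    # fan holds roughly this temp until it hits Med PWM
MED_PWM = 40          # percent, the default "Med PWM"
MAX_PWM = 100
MAX_RPM = 3200        # modded from the default 2280RPM
MAX_TEMP_C = 75       # throttling point (modded from 84C)

def fan_pwm(temp_c: float) -> float:
    """Very rough two-segment fan curve in percent."""
    if temp_c <= TARGET_TEMP_C:
        # below target: scale linearly up to Med PWM
        return MED_PWM * max(temp_c, 0) / TARGET_TEMP_C
    if temp_c >= MAX_TEMP_C:
        return MAX_PWM
    # between target and throttle point: steeper ramp from Med to Max PWM
    frac = (temp_c - TARGET_TEMP_C) / (MAX_TEMP_C - TARGET_TEMP_C)
    return MED_PWM + (MAX_PWM - MED_PWM) * frac

def fan_rpm(temp_c: float) -> float:
    return MAX_RPM * fan_pwm(temp_c) / 100

print(fan_rpm(65))   # -> 1280.0 (40% of 3200 at the target temperature)
print(fan_rpm(75))   # -> 3200.0 (full speed at the throttle point)
```

Note how raising Max RPM from 2280 to 3200 is what gives the second segment room to work: with the stock cap, the curve would flatten out at 2280RPM long before the throttle point.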
 

Wondering if you can help me with the fan curve of my Sapphire RX 580 Nitro+.

The fans constantly start when reaching 50-52C while the computer is idle and then spin down to off; it's quite annoying! I would prefer the fans spin up at 60C (or always run at a lower RPM if required).

Reading your fan analysis, I am confused about how I would achieve this.

1) I could change the Target Temp; however, I think the stock 75C is fine (I don't want to scorch the card!)
2) I will leave the Max RPM alone, I think, as I am not looking to change this behaviour.
3) I don't want the fans to kick in earlier. I could increase the Min, Med and High PWM by 10%? (This is my best answer.)
4) I don't think I should change when throttling kicks in (I don't want to scorch the card!)

Attached is my ROM. As you can see, it's similar to the Pulse, except for a few changes, which I have highlighted.

rom.png
 

You may try

Fuzzy fan mode = 0

Min Temp = 60

Med Temp = 70
 

Thanks. However, the fans are still spinning up at 52C until reaching 45C and then stopping.

Also, my primary monitor (connected via DisplayPort) is dropping out intermittently and returning after a few seconds. This didn't occur before the BIOS flash.

Below is my ROM edited with PolarisBiosEditor (I am using atitool https://github.com/kellabyte/atitool to read the edited BIOS in macOS).

For some reason "Legacy or Fuzzy Fan Mode" is showing as "1". I'll boot into Windows and check this.

----------------------------------------
ROM
----------------------------------------
VendorID: 0x5249
DeviceID: 0x1002
SubID: 0xe366
SubVendorID: 0x1da2
Firmware signature: 0x4d4f5441

----------------------------------------
Powerplay
----------------------------------------
Max GPU freq (Mhz): 2000
Max memory freq (Mhz): 2250
Power control limit (%%): 50

----------------------------------------
Powertune
----------------------------------------
TDP (W): 105
TDC (A): 135
Max Power Limit (W): 150
Max Temp. (C): 82
Shutdown Temp. (C): 90
Hotspot Temp. (C): 105

----------------------------------------
Fan
----------------------------------------
Temp. Hysteresis: 3
Min Temp. (C): 60
Med Temp. (C): 70
High Temp. (C): 85
Max Temp. (C): 109
Legacy or Fuzzy Fan Mode: 1
Min PWM (%): 60
Med PWM (%): 40
High PWM (%): 60
Max PWM (%): 1
Max RPM: 2200
Sensitivity: 4836
Acoustic Limit (MHz): 1411

----------------------------------------
GPU
----------------------------------------
300 mV: 800 Mhz
550 mV: 950 Mhz
650 mV: 950 Mhz
750 mV: 950 Mhz
850 mV: 950 Mhz
950 mV: 950 Mhz
1050 mV: 950 Mhz
1411 mV: 950 Mhz

----------------------------------------
VRAM
----------------------------------------
MT51J256M3

OK, so Fuzzy Fan Mode is set to "1" in PolarisBiosEditor on my edited ROM.

I set Fuzzy Fan Mode to "0", saved, and reopened - still set to 1.

I opened my default ROM and set only Fuzzy to 0, saved it as a new ROM, and then reopened it in Polaris. Still set to 1.
 

Then I would say the best method is to do it manually.

Use any hex editor to open your ROM image.

Search for 01 17 00 00 02.
Screenshot 2019-01-12 at 7.19.13 PM.png


Three rows below the "02", this 01 means "zero fan mode ON". If you want the fan to be able to stop below a certain temperature, keep it at 01. If you want the fan to always spin, change it to 00.
Screenshot 2019-01-12 at 7.21.35 PM.png


The byte next to it is the "fan stop temperature". For my PULSE, the default setting is 2E, which means 46C. E.g. if you want it to stop below 50C, change it to 32.
Screenshot 2019-01-12 at 7.21.41 PM.png


The last one, and the most important one for you, is the following byte: the "fan start temperature". My PULSE's default is 36 (equivalent to 54C). If you want the fan to start at 60C, change it to 3C.
Screenshot 2019-01-12 at 7.21.44 PM.png


And if it still doesn't work, you may disable Fuzzy fan mode. Changing the byte between "max temp" (2A in this case) and "max PWM" (64 in this case) from 01 to 00 will switch the fan mode from Fuzzy to Legacy.
Screenshot 2019-01-12 at 7.34.15 PM.png


For the PULSE RX580 8GB card, I am 100% sure the above mod works. E.g. I turned OFF Zero Fan Mode, and now my RX580's fans always spin; even below the fan stop temperature of 46C, the fans keep spinning to assist cooling.
Screenshot 2019-06-03 at 10.22.44 PM.png
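If you end up doing this on several ROM dumps, the byte hunt can be scripted. Below is a hedged Python sketch (my own helper, not a tool from this thread) that finds the 01 17 00 00 02 signature and patches one byte at an offset you have already confirmed in a hex editor. It deliberately does not hard-code the flag/stop-temp/start-temp offsets, since they can vary between ROMs, and you still need to fix the checksum before flashing:

```python
# Hedged sketch: locate the "01 17 00 00 02" signature described above in
# a ROM dump and patch a single byte at an offset YOU have first verified
# in a hex editor. Remember to re-fix the checksum (e.g. in
# PolarisBiosEditor) before flashing the result.

SIGNATURE = bytes.fromhex("0117000002")

def patch_byte(rom: bytes, offset_from_sig: int, new_value: int) -> bytes:
    """Return a copy of `rom` with one byte changed, addressed relative to
    the end of the first occurrence of SIGNATURE."""
    sig_pos = rom.find(SIGNATURE)
    if sig_pos < 0:
        raise ValueError("signature not found - wrong ROM or wrong table")
    target = sig_pos + len(SIGNATURE) + offset_from_sig
    patched = bytearray(rom)
    patched[target] = new_value
    return bytes(patched)

# Demo on synthetic data (a real run would read your dumped .rom file and
# write the modded copy back out). Change 0x36 (54C) to 0x3C (60C) at a
# made-up offset of 50 bytes past the signature:
fake_rom = b"\x00" * 16 + SIGNATURE + b"\x00" * 64
patched = patch_byte(fake_rom, offset_from_sig=50, new_value=0x3C)
print(patched[16 + len(SIGNATURE) + 50])   # -> 60
```

The offset of 50 and the filenames are purely illustrative; the only thing taken from the post is the signature itself and the 0x36 → 0x3C example.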
 

Interesting, thanks!

Which Mac hex editor are you using? I couldn't find the one you use. I tried Hex Fiend and 0xED and could not find "01 17 00 00 02". I ended up using http://hexed.it and could find the hex sequence.

I completed screenshot 4 - replaced 36 with 3C - and exported to .rom.

I attempted to flash with atiflash and got the error "VBIOS image not found" a couple of seconds after the progress starts.

The ROM is attached.
Edit: the ROM does not open in Polaris. I'll await your advice.
 


Oh, I forgot to mention that you still need to fix the CRC, etc. Use PolarisBiosEditor to open the ROM, fix the CRC, and try again.

The one that I use is Hex Miner, which is no longer available on the App Store.
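For background on why an unfixed image gets rejected: a PCI option ROM must checksum to zero (all bytes of the image summing to 0 mod 0x100), and to my knowledge ATI/AMD VBIOS images keep an adjustable checksum byte at offset 0x21 for this, with the image size (in 512-byte blocks) at offset 0x02. "Fix the CRC" in PolarisBiosEditor recomputes that byte. A hedged Python sketch of that step (the offsets are my assumption; always verify against your own tooling):

```python
# Hedged sketch of the "fix checksum" step. Assumed layout: image size in
# 512-byte blocks at offset 0x02, checksum byte at offset 0x21, and the
# whole image must sum to 0 modulo 0x100.

SIZE_OFFSET = 0x02
CHECKSUM_OFFSET = 0x21

def fix_checksum(rom: bytes) -> bytes:
    image_len = rom[SIZE_OFFSET] * 512      # size field is in 512-byte blocks
    fixed = bytearray(rom)
    fixed[CHECKSUM_OFFSET] = 0              # exclude the old checksum from the sum
    total = sum(fixed[:image_len]) & 0xFF
    fixed[CHECKSUM_OFFSET] = (0x100 - total) & 0xFF
    return bytes(fixed)

# Demo on synthetic data (a real run would operate on your dumped ROM):
fake = bytearray(b"\xAB" * 1024)
fake[SIZE_OFFSET] = 2                       # 2 * 512 = 1024-byte image
fixed = fix_checksum(bytes(fake))
print(sum(fixed[:1024]) % 0x100)            # -> 0
```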
Edit: the ROM does not open in Polaris. I'll await your advice.

Will look into that once I'm back home.
 
I opened the ROM in Polaris, saved it as a new file, and then I could successfully flash.

As per the screenshot, in macOS it seems the card got to 58C, then the fans came on, cooled it to around 44C, and then switched off.

It seems macOS is very GPU-intensive at idle! In Win 8.1 I am getting 39C at idle.

history.png


I'll try disabling Fuzzy fan mode tomorrow.

How do I convert a temperature to hex? I tried an online hex converter http://string-functions.com/string-hex.aspx and the text "60" converts to hex "3630"?

Additionally, I am still getting the intermittent primary monitor shut-off (it goes black for a moment and comes back) ☹️ Any thoughts?

Thank you @h9826790 I really appreciate it.
 
How do I convert a temperature to hex? I tried an online hex converter http://string-functions.com/string-hex.aspx and the text "60" converts to hex "3630"?

Use the macOS built-in Calculator in programmer mode.

Select "10" in the upper right panel, then enter 60.
Screenshot 2019-01-12 at 10.48.48 PM.png


Then click "16". It will now show you 60 in hex, which is 3C.
Screenshot 2019-01-12 at 10.48.55 PM.png


30 36 is NOT the hex representation of 60 (the numeric value); it breaks 60 down into the two single characters "6" and "0".

36 is the ASCII code (in hex) of "6", and 30 is the ASCII code of "0". That tool converts each character of the string to its ASCII code: if we want the computer to display the characters "6" and "0" on the screen, we give it the ASCII codes 36 30. That's a text encoding, not the proper conversion of 60 as a numeric value.
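The same conversion can be checked in a couple of lines of Python, which also makes the ASCII-vs-numeric distinction obvious:

```python
# Numeric 60 in hexadecimal is 0x3C: this is what goes into the hex editor.
print(hex(60))                       # -> 0x3c

# 0x36 and 0x30 are the ASCII codes of the CHARACTERS "6" and "0";
# that's what the string-to-hex website produced, and it's not a number.
print(hex(ord("6")), hex(ord("0")))  # -> 0x36 0x30

# And back again: a fan-start byte of 0x3C means 60C.
print(0x3C)                          # -> 60
```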

Additionally, I am still getting the intermittent primary monitor shut-off (it goes black for a moment and comes back) ☹️ Any thoughts?

If the card is not faulty and you are 100% sure the monitor is good, then most likely it's the cable.

I tried a multi-monitor setup with my PULSE RX580, no such issue. So it's quite safe to assume it's not a macOS driver problem.

Of course, a graphics card compatibility issue is still possible, but the chance should be much lower than a cable issue.
 

Not 100% sure; I never tried leaving the RX580 idle with multiple monitors connected in Windows.

But macOS does use quite a bit of GPU power for most OS UI animations.

Anyway, with a multi-monitor setup the RX580 will not be able to enter the real low-power state, only the normal 2D high-power profile (true in any OS).
 
WARNING: ONLY DO THIS IF YOU KNOW EXACTLY WHAT YOU ARE DOING!!!

For dual-ROM PULSE RX580 users, this is quite safe to do. But for SINGLE-ROM PULSE RX580 users, this can BRICK YOUR CARD! For other RX580 users this should work, but I can't guarantee it.

After digging deeper into the PULSE ROM image, I was finally able to make the RX580 run at the exact clock speed and voltage I want, by limiting the Max Vcore inside the ROM (by hex editing).

In the original post, we can only mod the voltage pointer and hope the firmware will assign a "low enough" voltage to let the card run cooler (and draw less unnecessary power). However, even though I know my card can run at 1340MHz with 1000mV stably, I can't simply mod the voltage for 1340MHz to 1000mV, because the 2nd loop in the VBIOS will pick up this abnormal parameter and revert the voltage back to the factory setting.

However, there is a Max Vcore setting inside the ROM. It seems the VBIOS uses this number as the basis for the calculated voltage it applies to the GPU at different clock speeds. So I hex edited my VBIOS to see if this would eventually make the GPU run at Max Vcore (when at the stage 7 clock speed).

The result is positive. I hex edited the ROM to limit the Max Vcore to 1000mV, left the voltage pointer untouched, but set the stage 7 GPU clock to 1340MHz and the VRAM to 2150MHz, and patched the memory timing. After a reboot, I reloaded the default settings in Wattman to make sure everything was as per the VBIOS settings. And my GPU is now actually able to run at 1340MHz with just ~0.98V (the default would be about 1.10V).
15min.PNG


So, I booted back into macOS and ran the same test again. I can't read the actual voltage in macOS, but from the GPU temperature and fan speed, I'm 99% sure my RX580 is actually running in the same voltage range as in Windows. A 15-min Unigine Heaven loop makes the GPU stabilise at 75C with ~1200RPM.
15min macOS.jpg


I know some guys don't like Furmark, because that's way beyond "normal". But I personally love to use that to make sure my card can handle even the most extreme case.

Before I edited the max voltage, running Furmark in macOS made the card hit its power limit and automatically downclock the GPU to 1263MHz to fit itself inside the power envelope. The GPU temperature could also easily go above 75C (as per post #1, the max temperature was 84C).
1366 Furmark.jpg


And now, if I run Furmark in macOS again, the GPU is able to stay at the assigned max clock speed of 1340MHz. Even after running for 10min there is still no throttling, and the GPU temperature stays at 75C with no more than 2280RPM.
1340+2150.jpg


Even though I can't read the GPU voltage in macOS, it's quite clear that my mod has an effect, and most likely everything is working as expected.

So, how to do it? (Due to the risk of this mod, I won't spell out every single step in detail. If you can't follow it, that most likely means you should not apply this mod.)

At this stage, I assume you know how to dump / flash the ROM. If not, stop here and do not attempt this mod. It's too dangerous for you.

I also assume you have run lots of tests to find the minimum operating voltage for your card. There is no workaround; each GPU is different. 1000mV may be good for my RX580 but may crash your RX580 at the same clock speed. So you have to run a few different benchmarks, slowly reduce the voltage, and find the minimum stable voltage. Set that in Wattman, run the benchmarks again, and record the PEAK voltage (from GPU-Z, hardware info, etc). The Max Vcore MUST be above this peak voltage, otherwise there is no way to guarantee stability (in the worst case, if the voltage is too low, the card may not be able to boot properly again, which may effectively brick your card). Again, if you don't know how to find this minimum stable voltage, stop here.

N.B. Whatever minimum voltage you find, you MUST round it up to the next valid Max Vcore step. Each Max Vcore step is 25mV. So if I find that my GPU can run at 988mV, the next step is 1000mV. If your GPU can run at 1003mV, the next step is 1025mV.
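The rounding rule above can be sketched as a tiny helper (a hypothetical function, not part of any tool mentioned in this thread):

```python
# Round a measured minimum stable voltage (mV) up to the next
# valid Max Vcore step (25 mV granularity).
def round_up_to_vcore_step(mv, step=25):
    # Integer ceiling division, then scale back up to the step size.
    return ((mv + step - 1) // step) * step

print(round_up_to_vcore_step(988))   # 1000
print(round_up_to_vcore_step(1003))  # 1025
```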

If you want Orinoco Framebuffer, I suggest you apply the part number patch before this mod.

After opening an RX580 ROM in a hex editor, search for 0C 01 03 06 and you will see something like this:
Screenshot 2019-01-13 at 5.14.27 AM.png


The C0 D4 01 right after it is the Max Vcore. To convert it back to a voltage:

C0 D4 01 -> 0x1D4C0 (Hex) -> 120000 (Dec) -> 1200mV.

In my case, I want to limit the Vcore to 1000mV. So

1000mV -> 100000 (Dec) -> 0x186A0 (Hex) -> A0 86 01
Screenshot 2019-01-13 at 5.28.01 AM.png
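The conversion above is just a 3-byte little-endian value. A minimal sketch, assuming the field is stored in units of 0.01mV as the worked example implies (the function names are my own):

```python
# Max Vcore field: 3 bytes, little-endian, in units of 0.01 mV.
def bytes_to_mv(b):
    # e.g. C0 D4 01 -> 0x1D4C0 -> 120000 -> 1200 mV
    return int.from_bytes(b, "little") // 100

def mv_to_bytes(mv):
    # e.g. 1000 mV -> 100000 -> 0x186A0 -> A0 86 01
    return (mv * 100).to_bytes(3, "little")

print(bytes_to_mv(bytes([0xC0, 0xD4, 0x01])))  # 1200
print(mv_to_bytes(1000).hex(" "))              # a0 86 01
```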


At this point, we finished the 1st half, and need to fix the 2nd half. Otherwise, the ROM will not work.

So now we search for 8A 00 EB FF FF FF (which should be just 16 rows below):
Screenshot 2019-01-13 at 5.30.21 AM.png


The C0 12 right after that are the bytes we need to fix. So what does that mean?

C0 12 -> 0x12C0 (Hex) -> 4800 (Dec) -> 1200 mV (4800 / 4)

And since I want to limit the Vcore to 1000mV, I need:

1000mV -> 4000 (Dec, 1000 x 4) -> 0xFA0 (Hex) -> A0 0F
Screenshot 2019-01-13 at 5.34.33 AM.png
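This second field is the same idea, just 2 bytes and a different scale (0.25mV per unit, i.e. value = mV × 4). A sketch under the same assumptions as before:

```python
# Second voltage field: 2 bytes, little-endian, in units of 0.25 mV.
def bytes2_to_mv(b):
    # e.g. C0 12 -> 0x12C0 -> 4800 -> 1200 mV (4800 / 4)
    return int.from_bytes(b, "little") // 4

def mv_to_bytes2(mv):
    # e.g. 1000 mV -> 4000 -> 0xFA0 -> A0 0F
    return (mv * 4).to_bytes(2, "little")

print(bytes2_to_mv(bytes([0xC0, 0x12])))  # 1200
print(mv_to_bytes2(1000).hex(" "))        # a0 0f
```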


Yes, all we need to do is mod these 5 bytes, and this VBIOS will limit my GPU's Vcore to no more than 1000mV. However, this manual hex edit breaks the ROM's checksum. This time we don't need to do any manual calculation (like in the part number patch): simply open the modded ROM in PolarisBiosEditor 1.6.7 (in Windows); the software will warn that the ROM is broken, and when you save the ROM again, PolarisBiosEditor 1.6.7 will fix the CRC automatically.
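For the curious, here is a hedged sketch of what that "fix" amounts to: recomputing the legacy VBIOS checksum so the image bytes sum to zero mod 256. This assumes the common ATOM VBIOS layout (image size in 512-byte blocks at offset 0x02, checksum byte at offset 0x21); verify against your own dump, and in practice just let PolarisBiosEditor do it:

```python
# Recompute the legacy ATOM VBIOS checksum after a hex edit.
# Assumed layout: size-in-512-byte-blocks at 0x02, checksum byte at 0x21.
def fix_vbios_checksum(rom: bytearray) -> bytearray:
    size = rom[0x02] * 512                  # image size in bytes
    rom[0x21] = 0                           # zero the old checksum first
    rom[0x21] = (-sum(rom[:size])) & 0xFF   # bytes must sum to 0 mod 256
    return rom
```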

Of course, if you want to mod the clock speed, apply the memory timing patch, etc, you can also do that inside PolarisBiosEditor 1.6.7 now. You should already know your target settings from finding the minimum stable voltage.

N.B. Do NOT touch the voltage pointer. Leave it at the default setting, e.g. 65288 for the stage 7 clock speed. I never tested this mod with any other voltage pointer.

And now, after you save this newly modded VBIOS, you finally have a ROM that will let the GPU run at the clock speed and voltage you want.

As usual, I performed a Luxmark test run with my newly modded ROM. Pretty good result.
1340+2150 (1000vM).png


So, 4 months after I first tried to undervolt the card, I am finally able to make it perform exactly as I want in macOS just by editing the ROM. A permanent solution that won't be affected by OS / kext updates.
 
Below is my ROM edited with PolarisBiosEditor (I am using atitool https://github.com/kellabyte/atitool to read the edited BIOS in macOS).

For some reason "Legacy or Fuzzy Fan Mode" is showing as "1".
...
I set Fuzzy Fan Mode to "0" and saved and reopened - still set to 1.

Opened my default ROM and only set Fuzzy to 0. Saved as a new ROM and then reopened in Polaris. Still set to 1.

I had this issue also. Found another PolarisBiosEditor that works with this setting (keeping it at 0, legacy mode).

https://github.com/vvaske/PolarisBiosEditor

Confirmed working.

At 0 (legacy), I have been able to set the PWM speed to 0 and, with minimum temperature settings, get the fans to kick in later than the default of around 50 degrees. Nice. Working with a Sapphire RX580 Nitro+ with dual BIOS, modding the Power Saving BIOS (hardware switch towards the card end with the display connectors).

I do see excessive power usage and temperature in macOS compared to Windows.

ioreg -l | grep PerformanceStatistics | cut -d '{' -f 2 | tr '|' ',' | tr -d '}' | tr ',' '\n' | grep -i 'Temp\|Fan\|%\|(W)\|Hz\|state\|(V)\|volt\|vddc'
...
"Device Utilization %"=3
"Fan Speed(%)"=0
"GPU Activity(%)"=0
"Fan Speed(RPM)"=0
"Temperature(C)"=51
"Total Power(W)"=109

As you can see, over 100W at idle. Core and memory clocks reported by iStat idle around 300 MHz (the core often with small spikes). Looking for a way to lower that usage and temperature. I haven't actively undervolted (yet), but I am using the Power Saving BIOS setting, which according to Polaris runs at 900 mV iirc.
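If you would rather read those PerformanceStatistics fields programmatically than via the shell pipeline, a small sketch (the parser is my own, hypothetical helper; feed it a line of `ioreg -l` output):

```python
import re

# Extract "key"=value pairs from an ioreg PerformanceStatistics line.
def parse_perf_stats(ioreg_line):
    return {k: int(v) for k, v in re.findall(r'"([^"]+)"=(\d+)', ioreg_line)}

sample = '"Temperature(C)"=51,"Total Power(W)"=109,"Fan Speed(RPM)"=0'
stats = parse_perf_stats(sample)
print(stats["Temperature(C)"])   # 51
print(stats["Total Power(W)"])   # 109
```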

I also have an issue where clock throttling often stops working after waking from sleep. A display resolution change (to something else, then back again) resolves it and the card can throttle down again at idle - but it's annoying. Hope to solve this too; then I'm close to having a nice setup.
 

That 109W is NOT the GPU power draw, but something else. You better ignore that.

It can indicate that the system is drawing more or less power, but it can't tell you how much the graphics card itself is drawing.
 
That 109W is NOT the GPU power draw, but something else. You better ignore that.

OK. "Wattever" watt usage is actually being reported, temps are still at least 10 degrees higher idling in macOS than in Windows, so it must be burning more watts in macOS. Would love to improve on that.
 