My Luxmark scores improved from ~13200 to ~15000 just by doing this. My temps also dropped from a max of 70°C to 67°C under load. I will try the timing patch next to see if it improves things some more...

I just did the "One Click Timing Patch" and my Luxmark score jumped to ~15200. Excellent results compared to the original ~13200 score on the stock BIOS.

I checked your modded BIOS, thunder72fr, and your fan settings are at defaults; only the GPU/MEM settings are changed. When do your fans spin up? At 50°C too? I'd prefer 55°C, or having them ON at idle at a low 10-15% speed would be good...

I tried different fan settings: min temp = 50°C (from 40°C), max temp = 110°C (from 109°C) and target temp = 80°C (from 75°C), but my fan still seems to come on at ~50°C @ 17-20% (~800-900 rpm). My idle temp stays at ~46°C, but when I open Google Chrome etc. (slight load) things jump to 50-51°C and the fans kick in to cool it back down to 46°C.

How can I reduce this "yo-yo" effect and keep the fans from spinning up and down so often? Does anyone know what Temp Hysteresis does?
 

Temp Hysteresis is the "buffer".

E.g. say the setting is 50°C then 20% fan.

Without Temp Hysteresis, that means 49°C = 0 RPM; once it hits 50°C, the fan spins up to 20%. But as soon as the temperature drops back to 49°C, the fan stops straight away. This can be extremely annoying.

With Temp Hysteresis = 3°C, when the GPU drops back to 49°C the fan keeps spinning, and only once it hits 47°C (50 - 3) does the fan spin down to zero again.

Usually, the really annoying thing is the change in fan noise, not the noise itself (at low RPM). Increasing Temp Hysteresis should give you fewer changes in fan noise.
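
If it helps, here is a rough sketch of that logic (made-up thresholds, just to illustrate the buffer, not the card's real firmware):

```python
# Made-up numbers, only to illustrate hysteresis -- not the PULSE 580's firmware.
FAN_ON_TEMP = 50   # fan starts at 50C
HYSTERESIS = 3     # fan may only stop again below 50 - 3 = 47C
MIN_PWM = 20       # 20% duty once the fan is running

def fan_duty(temp_c, fan_running):
    """Return (duty %, fan state) for the current temperature."""
    if not fan_running:
        return (MIN_PWM, True) if temp_c >= FAN_ON_TEMP else (0, False)
    if temp_c <= FAN_ON_TEMP - HYSTERESIS:
        return (0, False)
    return (MIN_PWM, True)          # keep spinning inside the 47-50C band

# A temperature bouncing between 49C and 51C no longer toggles the fan:
running = False
for t in [46, 49, 51, 49, 50, 48, 47, 46]:
    duty, running = fan_duty(t, running)
    print(f"{t}C -> {duty}% ({'on' if running else 'off'})")
```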

However, the PULSE 580 uses fuzzy fan logic, and I really don't know how fuzzy it is. If you want better control of the fan profile, you may want to turn that off as well.

After I applied the WX7100 profile + voltage pointer mod + memory timing patch, I am very happy with the native fan profile and haven't touched this yet. If you don't mind, please let us know what you find when you go through the fan profile mod. Thanks!
 
Up to a point the card ignores the power settings in the BIOS; I did tests in Windows that showed it.
From looking on Google, as I mentioned, I saw reports that AMD had "fixed" the power settings to stop people killing cards with crazy power limits while trying to mine (i.e. newbie miners in the gold rush).

I only had it on the stock BIOS for a bit and didn't do full tests to see how far it really changed; from the tests on the modded BIOS, voltage seems to be linked to the card's MHz speed more than to the pointers.

The post I linked showed how to mod the power settings in a big way; you have to use a hex editor (a rough sketch of the idea is below).
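
For anyone curious, the hex edit boils down to changing a byte and then fixing the ROM checksum so the tools accept the image again. The 0x1234 offset below is a made-up placeholder; the real PowerPlay table offsets come from the linked post, and the checksum byte location should be verified against your BIOS editor:

```python
# Toy sketch of a vBIOS hex edit done in Python instead of a hex editor.
# 0x1234 is a placeholder offset -- get the real table offsets from the
# linked overclock.net thread / a Polaris BIOS editor. 0x21 is where those
# tools commonly keep the ATOM ROM checksum byte; verify before flashing.

CHECKSUM_OFFSET = 0x21

def fix_checksum(rom: bytearray) -> None:
    """Make the ROM image sum to 0 mod 256 (standard PCI option-ROM rule)."""
    size = rom[0x02] * 512          # image length is stored in 512-byte blocks
    rom[CHECKSUM_OFFSET] = 0
    rom[CHECKSUM_OFFSET] = (-sum(rom[:size])) & 0xFF

with open("stock.rom", "rb") as f:
    rom = bytearray(f.read())

rom[0x1234] = 0x80                  # placeholder edit: one power-table byte
fix_checksum(rom)                   # any edit invalidates the old checksum

with open("modded.rom", "wb") as f:
    f.write(rom)
```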

And at 1300MHz it really keeps the fan on low with almost no real speed loss; with the RAM timings you end up faster overall (and my card throttled hard on the stock BIOS, so it was actually way slower even at the higher clock settings).
 
I don't think it's really possible to run Superposition with a 12% OC + 2250MHz memory without throttling.

Even if there is no power draw limit, the card should hit the thermal throttling point pretty quickly with these settings, unless the ambient air temperature is really low. Also, you may have to open the side panel and point a fan at the card to cool the whole thing.
 
As always my friend, you are indeed correct. Whilst it can complete the Superposition Benchmark at 12%, it is not always stable. It can run one Benchmark cycle of Heaven at this setting, but there are artefacts when it gets to the end of the Benchmark cycle in Valley. If I let Heaven run for more than one cycle it crashes.

If I drop it to 10% (1505 MHz...quite a stable line too) it will run Heaven for extended periods of time and the temperature sits around 78 degrees Celsius. At this setting artefacts in Valley are also resolved.

In macOS I do a lot of visual design, as well as a spot of video editing, so the stock clocks are more than adequate. However in Windows I Sim Race in VR so I'm always trying to milk out every last spare FPS from my system. I know that I should really just build a Windows system and drop in a GTX 1080, but I can't help but fiddle to see how far I can push it.
 
Well, after playing around with fan speeds, nothing seems to make a difference apart from switching fuzzy fan mode (1) to legacy fan mode (0). That lets the fans spin at the min PWM value, which is 20% by default, even when the card is idle.

Unfortunately, setting temp hysteresis to 6 or even 10 makes no difference compared with the default of 3. Neither does adjusting max temp, target temp or the acoustic limit. No matter what I tried, my fans always come alive at 50-51°C @ 20%, or 800-900 rpm.

It looks like AMD9500Controller.kext overrides the BIOS settings after boot and applies its own fan speeds?

There is a thread on another forum here showing users editing their AMD10000Controller.kext (Vega56/64) to adjust core speed/voltage, memory speed/voltage and temp target/idle fan speeds instead of flashing the BIOS.
 

Thanks for the tests and the report. So, as a quick fix, turning OFF fuzzy fan mode should keep the fan at 20% most of the time, and as long as the fan noise is constant that shouldn't be too annoying (the cMP's own fans aren't completely silent anyway).

I would like to study that kext edit as well, even though I prefer the BIOS mod (it won't break after an OS update). But I still want to know more about this backup method, especially how we can mod the Vega's settings; it's time to move on and learn something new.
 
The best info I found on BIOS mods was this 514-page topic:
https://www.overclock.net/forum/67-...ios-editing-rx5xx-rx4xx-326.html#post26109883
But after reading 300+ pages it hit me that I'm happy with 1300MHz :D and don't want to read it all. I did think about asking for help there, but for now I'm just watching this topic.
That forum also has a few more dedicated topics on Polaris and on apps for BIOS changes.

I also don't want to reflash my card too much, as it's a single-BIOS card and I'm new to it :oops:

I'll play with putting the card in the first and second PCIe slot later, and with PCI fan speeds in OS X, to see how it affects the auto fan start-up (I've always wondered how much ventilation the back of the card wants/needs).

And I do think working with the PowerPlay tables in the kext is a good option (I did find the fan spin-up annoying at first, but now I hardly notice it).
 
Just got a SAPPHIRE PULSE RADEON RX 580 8GB from a guy, but it has a mining BIOS config on it. What would be the optimal BIOS to put on this card? I want to run it in Mojave on my 2010 Mac Pro. I have the Y to 8-pin power cable. The sticker on the card has PN: 299-4E353-010SA, SKU#: 11265-05.

Also, I remember someone saying somewhere that some of these cards can have a Mac Edition BIOS put on them. Does that enable HDMI audio by default?
 

This: https://forums.macrumors.com/threads/sapphire-pulse-rx580-8gb-vbios-study.2133607/#post-26395634
 
OK, I got my Sapphire Pulse in and installed; I have the stock BIOS from TechPowerUp on it. But here is the thing: my scores all went down compared to the low-power Dell RX 480/580 card. Can someone tell me why I'm getting such bad numbers?

LuxMark went all the way down to 13557. I have seen people getting scores over 16K with this same card; what is the deal?

Here is the BIOS in the editor.

Screen Shot 2018-10-21 at 8.09.45 PM.png
 
@calmasacow A 13557 Lux score is about the default score of an RX 580. Have a look at the old Polaris thread and you can see how most people hit around 13000-14000 for the Lux ball (with an RX 580), while Vega hits 24000 (from memory):
https://forums.macrumors.com/threads/amd-polaris-vega-gpu-macos-support.2083168/
Just have a look and you will see lots of screenshots with big variance in scores.
Variance can come from different card BIOSes/models/heatsinks etc., so it's never simple.

The people with higher scores have overclocked their cards (or have better cards);
see
https://forums.macrumors.com/threads/sapphire-pulse-rx580-8gb-vbios-study.2133607/

Now, LuxMark is not like games; its score seems massively dependent on memory speed. So I got my score higher in LuxMark, but in games, say, I saw no change (maybe slower, as I underclocked my card's GPU core).
But LuxMark does, I hope, represent compute workloads, so in that sense a high score may show slight gains in compute.

A good example is what people have done to RX cards for crypto coins (which is a compute workload, not game-like): they tend to underclock the core and push the memory.

Anyway, all the info you want is in the vBIOS study topic, and it's the same thing I told you when you had your Dell card: be happy with it, don't worry over every little bit, and just use it to get things done.
I'm able to have more fun playing with video now :D

PS: my Pulse card thermal throttled hard at defaults. Some of the models with bigger heatsinks really can be pushed much harder; 1500MHz gets you close to a GTX 1070 (but the power and heat :eek: lol, not worth it).

A 1300MHz core is what I went with to keep the fans slow and the heat down; for me a slight speed loss was worth it for less noise.

PS: editing the BIOS will break your warranty. My card was used, so it had no warranty, and it came with a BIOS that was bad for me, so I had to change it.
 


Yeah, my card is used as well, so I'm not really worried about the warranty.
That being said, would it be possible to get a screenshot of your Pulse settings, or perhaps a copy of your ROM?
 
See the linked topic :rolleyes: it's in the post you quoted.

Edit: I must have been tired when I posted that, sorry if I was a tad harsh.
Re-read the first two pages of this topic to see the BIOS talked about in detail and explained.

There is risk in flashing with no backup BIOS, so be safe and don't worry too much about score changes. The memory timings (i.e. memory speed) are what get high LuxMark scores, but I suspect that only really affects compute workloads (i.e. not games much).

And as I mentioned, even before playing with the card it's much, much faster than my old GTX 770, so as-is it's a really nice GPU.
 
My RX 580 8GB Pulse used to crash, in Windows, after ~15 minutes of gaming. I found some Wattman settings which were meant to fix this issue, and I haven't had a crash since.

Granted, I'd like to find out how the card runs in OSX and whether or not I should do this flashing process which you've posted about. How should I check, and what should I look out for?

Thanks.
 

If your RX580 never crashes in macOS (e.g. a 15-minute Furmark / Unigine Heaven (or Valley) / Luxmark stress test), then I would say nothing needs to be "fixed". But you can make it run better.

Of course, you can simply play games in macOS as well to see if it crashes.

Anyway, my RX580 can run at 1366MHz in macOS with 1000mV, which works really well for me: reasonably cool and quiet, yet able to maintain the factory max boost clock with no thermal/power throttling.
 


It has never crashed in macOS, as far as I can recall, but it used to crash in Windows—with games—before I changed around some Wattman settings.

Did yours ever crash in macOS? Also, while mine hasn't crashed before (both under games and Luxmark), I still am unsure as to whether or not it's running at its full potential.
 

My RX580 never crashed. I made the original post just because I was trying to make the card work better, not because of any crash.

And a GPU tends to be able to work harder in Windows; that's very normal. Very likely that's because only the Windows driver can fully unleash the card's power, so once it's stressed too hard, the card becomes unstable.

Did you monitor the card's temperature before it crashed?

I have been with AMD cards for almost 10 years now. They always come with too high a voltage, or very poor optimisation (using lots of extra power to get a little bit of extra performance). So, on the first day I got my RX580 I already started undervolt testing, and found that my card can run stably at 1000mV in Windows at the default clock speed.

I never really ran the card at default settings in Windows for a prolonged period of time, so I have no idea whether my card would crash as well if I played games on it.
 


In Windows it seems fine now, but I'm wondering if it could be better in OSX, with some tweaks. Is there any way to monitor its temperature in OSX? I don't want it to run inefficiently (regardless of the fact that it hasn't crashed yet).
 

I just applied some liquid metal to my RX580. Since I needed to test it anyway, I did that in Windows. No crash, even with an OC. So, unless you know the temperature at the time of the crash, I personally still treat that as the number one suspect.

Anyway, if you want to monitor the GPU parameters in macOS, you can use my attached Automator workflow. Simply open it and run it (play button); it will display the GPU parameters and refresh every 2 seconds.
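
If you prefer a script, here is a rough Python equivalent of what the workflow does. It assumes the driver exposes a PerformanceStatistics dictionary in ioreg with keys like "Temperature(C)"; the exact key names can differ per driver/OS version, so adjust to whatever your machine reports:

```python
# Rough sketch: poll the GPU's PerformanceStatistics values through ioreg
# every 2 seconds. The key names below are the ones AMD's macOS driver is
# commonly reported to expose; check your own output of
#   ioreg -l | grep PerformanceStatistics
import re
import subprocess
import time

KEYS = ("Temperature(C)", "Fan Speed(RPM)", "Core Clock(MHz)")

def read_stats() -> dict:
    out = subprocess.run(["ioreg", "-l", "-w", "0"],
                         capture_output=True, text=True).stdout
    stats = {}
    for key in KEYS:
        m = re.search(re.escape(f'"{key}"') + r"\s*=\s*(\d+)", out)
        if m:
            stats[key] = int(m.group(1))
    return stats

while True:
    print(read_stats())
    time.sleep(2)
```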
 

Attachments

  • GPU monitor.workflow.zip (112 KB)