How does the RX 580 connect with the 8-pin connector?
Does it just connect to one 6-pin, or does it use both?

Are there any worries with a 6-to-8-pin adapter? I've always avoided 8-pin cards, as I've heard some people raise concerns about power issues..... is this rubbish?
You can buy an adapter with two 6-pin female connectors going to one 8-pin male connector to power it safely.
 

[Attached image: 1428605450950.jpeg]
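A rough sanity check on the power question, as a sketch: the wattage figures below come from the standard PCIe ratings and AMD's published board power for the RX 580, not from this thread, so treat them as assumptions. Combining both of the Mac Pro's 6-pin auxiliary feeds through the adapter keeps the card within budget:

```python
# Rough PCIe power-budget check for an RX 580 in a classic Mac Pro.
# Assumed figures (standard PCIe ratings, typical RX 580 board power):
SLOT_W = 75                # PCIe x16 slot can supply up to 75 W
SIX_PIN_W = 75             # each 6-pin auxiliary connector supplies up to 75 W
RX580_BOARD_POWER_W = 185  # typical RX 580 board power

# The Mac Pro provides two 6-pin auxiliary feeds; the 2x6-pin-to-8-pin
# adapter combines both, so the card can draw from all three sources.
available = SLOT_W + 2 * SIX_PIN_W
print(f"Available: {available} W, card needs ~{RX580_BOARD_POWER_W} W")
print("Within budget" if available >= RX580_BOARD_POWER_W else "Over budget")
```

This is why a single 6-pin feed (75 W plus 75 W from the slot) would be marginal for a ~185 W card, while combining both feeds is not.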
Thanks Billy, I presume the issues people were concerned about were with using just one 6-pin connector.
Need to try and find a card now though..... :mad: damn prospectors.
 
Keep in mind that Vega releases in about a week, which should hopefully change things. It'll also be interesting to see what sort of support Vega receives once it's out.

In the Netherlands we have a great site called Tweakers, which has a Pricewatch section for every product. It's crazy to see how much people have been paying for these GPUs, given the prices they launched at.
 
Yep, you can pre-order a 580 on Amazon UK for £288; on the same page they're selling for £560 from Marketplace sellers :eek:
 
Yep, and my pre-order on Amazon is for £250, so they've gone up significantly in the month I've been waiting for it to ship.
 
Has anyone tried to use HEVC hardware acceleration with the RX 580 in a Mac Pro?
HEVC hardware acceleration is only CPU-based in macOS, unfortunately, which limits it to Intel's last two generations.
There is no GPU-based HEVC acceleration, and Apple doesn't seem to have much interest in changing that.
 

HEVC on GPU doesn't exist in macOS yet. But it does exist in Windows, with the default Boot Camp drivers too.

You need to hit the feedback button and insist that Apple implements it; otherwise they are slapping you in the face with half-baked graphics drivers so that they can force you to upgrade to a computer that does HEVC on the CPU.

Even worse, software-based decoding on the CPU in High Sierra is unplayable on a quad-core Skylake. You will get maybe 5 FPS and then the video will freeze up.

In Windows, a dual-core Skylake Pentium can do 10-bit 4K HEVC software decoding at 60 FPS with 25% CPU usage.
 
HEVC on GPU doesn't exist in macOS yet.

Exaggeration. It isn't mainstream, and not with random GPUs, but to claim it doesn't exist at all is deeply lacking in evidence.

"Introducing HEIF and HEVC"

https://developer.apple.com/videos/play/wwdc2017/503/

"Working with HEIF and HEVC"
https://developer.apple.com/videos/play/wwdc2017/511/


You need to hit the feedback button and insist that Apple implements it; otherwise they are slapping you in the face with half-baked graphics drivers so that they can force you to upgrade to a computer that does HEVC on the CPU.

Before hitting the feedback button it helps to actually read the documentation.

Even worse, software-based decoding on the CPU in High Sierra is unplayable on a quad-core Skylake. You will get maybe 5 FPS and then the video will freeze up.

Software HEVC is only used on Macs lacking the fixed-function GPU support for it. That isn't the only option.
As for decode speed, have you played this on a production, official release of the software?
 

Refer to Apple documentation instead of generally accepted standards in the computing industry? These are the exact same excuses we have seen from the blind faithful for 20 years, and precisely the reason why Apple has been able to sell a half-assed operating system with poor drivers and old APIs... for years.

If you're not pressuring the company to give you the best standards then they won't.

Apple claims every year that they have the world's most advanced operating system. So you just have to ask one question. Only one question...

If Windows in Boot Camp can do it why can't macOS on the same machine?

That question doesn't only apply to HEVC. As we know on this board, Apple has tried many ways to use non-standard drivers and hardware to block technologies and force upgrade paths.

But we know that full well here, so wasting time debating it does nothing except wave egos around in a forum and hunt for web links that offer confirmation bias. Do something useful: hit the feedback button and start telling Apple what you expect from "the world's most advanced OS".
 
As an Amazon Associate, MacRumors earns a commission from qualifying purchases made through links in this post.
Yep, and my Pre-order on Amazon is for £250 so they've gone up significantly in the month that I've been waiting for it to ship.

Wow, I didn't realise the wait was that long.
I thought maybe 1-2 weeks; I'll have to get one in the system now.
 
Has anybody used RX 580 for FCPX and seen a performance difference over and above the Sapphire 7950 Mac Edition card?

I'm considering selling my 7950 and later upgrading to RX 580 or similar...
 

I really doubt it can make a huge difference, unless your workflow is very VRAM-limited.

The RX 580 shows a ~30% improvement in Luxmark (which I consider a general compute benchmark), and since FCPX is so optimised for the HD 7xxx-series cards, the difference in FCPX may be smaller than that.

I am quite sure the RX 580 can perform better, but I suspect maybe 10-15% in general (again, if not VRAM-limited).

Anyway, I am also looking for someone to post their own real-world results.
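The 10-15% guess above can be framed as an Amdahl's-law estimate: only the GPU-compute-bound share of an export gets the benchmark speedup. A minimal sketch, where the GPU-bound fraction is a hypothetical number for illustration, not a measurement:

```python
# Amdahl's-law estimate of overall FCPX export speedup, given the
# ~30% Luxmark improvement cited above. The gpu_fraction value is an
# assumption for illustration only.

def overall_speedup(gpu_fraction, gpu_speedup):
    """Only the GPU-bound share of the job gets faster; the rest is unchanged."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

# If roughly half the export time were GPU-compute-bound:
s = overall_speedup(gpu_fraction=0.5, gpu_speedup=1.30)
print(f"Estimated overall speedup: {(s - 1) * 100:.0f}%")  # ~13%
```

With a 50% GPU-bound fraction this lands around 13%, consistent with the 10-15% ballpark; a more VRAM- or CPU-bound workflow would shift the estimate either way.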
 

Okay thanks for the info. I would be interested in seeing some real world results as well. Really the main thing is having an upgrade path for an AMD GPU after selling the 7950, even if the performance gain isn't "linear" as such.

Anyone have success yet with boot screens using these cards in either Sierra or High Sierra?
 